I work on a database that is frequently modified by external processes. I cannot control when or how those processes run, and they often temporarily delete rows or update values to zero before rewriting them. As a result, my queries are sometimes wrong simply because they ran at the wrong time.
My workaround is to query key criteria about the table in a loop; if those criteria stay consistent for one minute, I treat the data as safe to export.
My question is: are there performance side effects to this? I assume the query only blocks for the duration of each individual count, and that the table is unblocked while the loop waits to query again. Is that correct? Would you do this differently?
DECLARE @count INT = 1;
DECLARE @previous_total INT = 0;
DECLARE @current_total INT;

WHILE @count <= 6
BEGIN
    -- Run the aggregate once per iteration and reuse the result.
    -- Querying twice (once to compare, once to store) risks capturing
    -- two different values if an external process writes in between.
    SELECT @current_total = SUM([MyCol])
    FROM [MyDb].[dbo].[MyTable]
    WHERE 1=1 AND 2=2 AND 3=3;

    IF @current_total = @previous_total
        BREAK;  -- two consecutive matching reads: treat the data as stable

    SET @previous_total = @current_total;

    PRINT @count;
    SET @count = @count + 1;
    WAITFOR DELAY '00:00:10';
END;
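For illustration, the same "poll until two consecutive reads match" pattern could also live in application code instead of T-SQL. This is a minimal sketch, assuming a hypothetical `read_total` callback that runs the SUM query above; the names and parameters are my own, not part of the original loop:

```python
import time

def wait_until_stable(read_total, max_checks=6, interval_s=10, sleep=time.sleep):
    """Poll read_total() until two consecutive reads match, or give up.

    read_total is assumed to execute the same SUM(...) query as the
    T-SQL loop; sleep is injectable so the wait can be stubbed in tests.
    Returns the last value read (stable or not).
    """
    previous = None
    for _ in range(max_checks):
        current = read_total()  # one query per iteration
        if current == previous:
            return current      # unchanged since the last check: stable
        previous = current
        sleep(interval_s)       # let external processes finish their writes
    return previous             # gave up after max_checks; caller decides
```

One design note: the table is only touched inside `read_total`, so any locks taken by the query are released between iterations, which mirrors the assumption in the question.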