I placed a SELECT statement inside a loop, and it runs fine until around 10,000 iterations, at which point it consistently crashes SSMS. This is probably a memory issue, and probably a pointless use of a SELECT inside a loop anyway, since the results don't actually display until the loop finishes.
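For context, the pattern in question looks something like this (a hypothetical sketch; the table name and loop bound are assumptions, not my actual code):

```sql
-- Hypothetical reproduction: each iteration produces its own result set,
-- and SSMS buffers every one of them on the client until the batch ends.
DECLARE @i INT = 1;

WHILE @i <= 10000
BEGIN
    SELECT @i AS Iteration, *    -- one result set per iteration
    FROM dbo.SomeTable;          -- placeholder table name
    SET @i = @i + 1;
END
```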
Just out of curiosity, is it possible to clear the cache or whatever to prevent the crash? I have tried:
DBCC FREESYSTEMCACHE ('ALL')
Note that clearing the cache will force SQL Server to compile new query plans for queries whose plans had previously been cached.
The problem is not with SQL Server - the problem is with SSMS on your client. Clearing the cache in SQL Server will not have any impact on the memory utilized by your client - on your workstation.
When you run a query in SSMS, the output is stored in temporary files on your workstation, and SSMS uses client memory to manage those results. Since you are returning thousands of separate result sets, it eventually runs out of memory and crashes.
Thanks Ahmed and Jeff. So Jeff, there is no way around it? I have learned not to put SELECT statements inside loops anymore.
What are you trying to accomplish? I don't see any value in outputting 10,000+ result sets to SSMS - you can't actually use that data for anything other than a quick validation.
You can actually cause the same issue by outputting hundreds of millions of rows from a single query. At some point you exhaust the client's system resources - and you cannot meaningfully review that many rows in SSMS anyway.
I agree with your last post, Jeff, because if I really need to know where the loop is in its cycle, I can have it log progress to a log table instead.
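A log-table approach along those lines might look like this (a sketch only; the table and column names are my own placeholders):

```sql
-- Hypothetical progress log: one INSERT per iteration instead of a SELECT,
-- so nothing accumulates in the SSMS results pane on the client.
CREATE TABLE dbo.LoopLog (
    Iteration INT NOT NULL,
    LoggedAt  DATETIME2 NOT NULL DEFAULT SYSDATETIME()
);

DECLARE @i INT = 1;
WHILE @i <= 10000
BEGIN
    INSERT INTO dbo.LoopLog (Iteration) VALUES (@i);
    SET @i = @i + 1;
END

-- Check progress afterwards (or from another session) with a single query:
SELECT MAX(Iteration) AS LastCompleted FROM dbo.LoopLog;
```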