I'm tuning a SELECT Col1, Col2, ... FROM Big_Table query, and attempting to run it throws an "out of memory exception" error in SSMS. Big_Table is 10GB and SSMS can only handle around 2GB of results, so that's a problem.
Trouble is, I have forgotten the 'method' for running the query with SET STATISTICS IO ON that doesn't rely on FMTONLY or running it from the command line.
Can anyone please remind me of a way to execute a large query that still captures the plan, does not return a result set, and does display a row count?
Thanks in advance
Do I understand correctly that you're returning a 10GB table to the screen in SSMS? If so, my question would be.... WHY??? Who's going to read all that? What are you really trying to do with the output? Store it to a file?
Yes, you could display the estimated query plan (the shortcut for which is Ctrl+L).
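If you'd rather script it than use the shortcut, the same thing can be done in T-SQL. This is a minimal sketch; the column list and table name are placeholders from the question, not your actual schema:

```sql
-- Compile-only: returns the estimated plan as XML without executing the query.
-- SET SHOWPLAN_XML must be the only statement in its batch, hence the GOs.
SET SHOWPLAN_XML ON;
GO

SELECT Col1, Col2
FROM dbo.Big_Table;
GO

SET SHOWPLAN_XML OFF;
GO
```

Note this never runs the query, so it sidesteps the memory issue entirely, but it only gives you the estimated plan, not the actual one.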
You could also post the query and all table DDL, including index(es), and we can more directly help you tune.
The best clustered index is critical to tuning.
Thanks guys but none of those.
I'm not intending to return 10GB of data to the screen, but the volume is valid; it's just not renderable in SSMS.
I prefer not to use estimated plans, only actual ones.
You can run a T-SQL statement and collect an execution plan without returning the result set or dumping the results into a file/table. SQL Sentry Plan Explorer does this.
From memory, my solution was a CTE with a @@ROWCOUNT check wrapped around the SELECT. That way SQL Server runs the query and displays a row count, but doesn't attempt to render the result set in SSMS, which we know will blow up at 2GB. I'll find the T-SQL and reply to this post.
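Something along these lines, perhaps? This is a reconstruction of the approach described, not the original script, and the column/table names are placeholders:

```sql
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- Wrap the big SELECT in a CTE and return only an aggregate,
-- so SSMS receives one row instead of 10GB of data.
WITH Big AS
(
    SELECT Col1, Col2  -- , ...
    FROM dbo.Big_Table
)
SELECT COUNT_BIG(*) AS RowsReturned
FROM Big;
```

One caveat: because only a row count is requested, the optimizer may simplify the plan (e.g. scan a narrower index than the real query would), so the captured plan isn't guaranteed to match the original SELECT's plan exactly.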
Appreciate everyone's thoughts
You can tell SSMS to discard results (Query Options > Results > Grid > "Discard results after execution"), but the underlying query can still cause an out-of-memory condition if there's an accidental many-to-many join.
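Another common trick that executes the full query without sending rows to the client is assigning the columns to variables. Sketch below; the variable types are guesses and would need to match your actual columns:

```sql
-- Placeholder types: adjust to match the real columns of Big_Table.
DECLARE @c1 int, @c2 varchar(100);

SET STATISTICS IO ON;

-- SQL Server processes every row (so the actual plan and IO stats are real),
-- but the variables just end up holding the last row's values;
-- nothing is streamed back to SSMS.
SELECT @c1 = Col1, @c2 = Col2
FROM dbo.Big_Table;

-- @@ROWCOUNT still reports how many rows were processed.
SELECT @@ROWCOUNT AS RowsProcessed;
```

Combined with "Include Actual Execution Plan", this gives you the actual plan, IO statistics, and a row count, with only a single-row result set for SSMS to display.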