Best way to reduce workload of queries (when memory is an issue)


I'm working with a very large database with hundreds of millions of records. I often have to run complex queries that take anywhere from 30 minutes to over 2 hours to finish executing, simply because there are so many records.

Problem: Sometimes memory runs out and the execution fails. How can I stop this from happening?

I know indexes are meant for this kind of situation, but are there alternatives? Sometimes the operations I perform wouldn't be helped by a column index.

I was thinking of something like executing the operation on a limit of 50,000 rows, capturing and saving my results, and then doing the same thing on the next 50,000 rows. But I'm not sure of the best way to do that.
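One common way to do the chunking you describe is keyset pagination: instead of `LIMIT ... OFFSET` (where the engine still has to scan past all the skipped rows on every batch), you remember the last key you processed and filter on it, so each batch is a cheap indexed range scan. A minimal sketch using Python's built-in sqlite3 module, with a made-up `orders` table standing in for your big table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders (id, amount) VALUES (?, ?)",
                 [(i, i * 0.5) for i in range(1, 120001)])

BATCH = 50_000          # rows per chunk
last_id = 0             # keyset cursor: largest id already processed
total = 0.0

while True:
    # "WHERE id > ?" uses the primary-key index, so each batch starts
    # where the previous one stopped instead of rescanning from row 1
    rows = conn.execute(
        "SELECT id, amount FROM orders WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, BATCH),
    ).fetchall()
    if not rows:
        break
    total += sum(amount for _, amount in rows)   # your per-batch work here
    last_id = rows[-1][0]                        # advance the cursor

print(total)
```

Only one batch of rows is ever held in memory at a time, and because you capture results after each batch, a failure partway through doesn't lose everything. The same `WHERE key > last ORDER BY key LIMIT n` pattern works on MySQL, PostgreSQL, etc., as long as you have an indexed, unique ordering column.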


  1. Improve the query plans.
  2. Create summary tables and populate them overnight.
  3. Create a cube and load overnight.
    etc, etc...
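Option 2 above boils down to: run the expensive aggregation once, off-hours, into a small table, and point the daytime queries at that instead of the huge fact table. A rough sqlite3 sketch, with hypothetical `sales` / `sales_daily` table names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("2024-01-01", 10.0), ("2024-01-01", 5.0),
                  ("2024-01-02", 7.5)])

# Nightly job: rebuild the summary from the big fact table
conn.execute("DROP TABLE IF EXISTS sales_daily")
conn.execute("""
    CREATE TABLE sales_daily AS
    SELECT day, SUM(amount) AS total, COUNT(*) AS n
    FROM sales
    GROUP BY day
""")

# Daytime query hits the tiny summary instead of millions of fact rows
row = conn.execute(
    "SELECT total FROM sales_daily WHERE day = '2024-01-01'"
).fetchone()
print(row[0])
```

On PostgreSQL you'd typically use a materialized view for this instead of a hand-rolled `CREATE TABLE AS`; the idea is the same.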

How long is a piece of string?



I do that frequently as part of performance tuning.

If I have 10 million rows, I break the job up into small batches. But one thing I have noticed is that there is a SWEET SPOT... a magic batch size where performance is best. What that magic number is, and how you figure it out, is another story.
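One way to hunt for that sweet spot is simply to time the same batched job at a few candidate batch sizes and see which finishes fastest on your hardware. A sketch using sqlite3 and a hypothetical table `t` (the "fastest" size will vary run to run and machine to machine, which is rather the point):

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v REAL)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(i, float(i)) for i in range(1, 200001)])

def run_in_batches(batch):
    """Scan the whole table in keyset-paginated chunks of `batch` rows."""
    last, total = 0, 0.0
    while True:
        rows = conn.execute(
            "SELECT id, v FROM t WHERE id > ? ORDER BY id LIMIT ?",
            (last, batch),
        ).fetchall()
        if not rows:
            return total
        total += sum(v for _, v in rows)
        last = rows[-1][0]

# Time each candidate batch size on the identical workload
timings = {}
for batch in (1_000, 10_000, 50_000, 100_000):
    start = time.perf_counter()
    run_in_batches(batch)
    timings[batch] = time.perf_counter() - start

best = min(timings, key=timings.get)
print(f"fastest batch size in this run: {best}")
```

Too small a batch and you pay per-query overhead thousands of times; too large and you're back to the original memory problem. The crossover depends on row width, indexes, and available RAM, so measuring beats guessing.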

Hope this helps 🙂
