Hey,
I'm working with a very large database with hundreds of millions of records. I often have to run complex queries that take anywhere from 30 minutes to 2+ hours to finish, simply because there are so many records.
Problem: Sometimes memory runs out and the execution fails. How can I stop this from happening?
I know indexes are meant for this kind of situation, but are there alternatives? Because sometimes the operations I perform wouldn't be helped by a column index.
I was thinking of something like executing the operation on a LIMIT of 50,000 rows, capturing and saving the results, and then doing the same thing on the next 50,000 rows. But I'm not sure of the best way to do that.
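For what it's worth, here's a minimal sketch of that batching idea using keyset pagination (filtering on the last seen primary key rather than OFFSET, so each batch is a cheap indexed range scan). This is just an illustration, not your actual schema: the table `events`, the column names, and the aggregation are all made-up placeholders, and it uses SQLite in memory just to be runnable.

```python
import sqlite3

# Hypothetical setup: `events` and its columns are placeholders,
# not from the original workload.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany(
    "INSERT INTO events (id, amount) VALUES (?, ?)",
    [(i, float(i)) for i in range(1, 201)],
)
conn.commit()

BATCH_SIZE = 50  # would be 50_000 in the real workload


def process_in_batches(conn, batch_size):
    """Keyset pagination: resume from the last seen primary key instead of
    OFFSET, so the database never re-scans rows already processed."""
    last_id = 0
    total = 0.0
    while True:
        rows = conn.execute(
            "SELECT id, amount FROM events WHERE id > ? ORDER BY id LIMIT ?",
            (last_id, batch_size),
        ).fetchall()
        if not rows:
            break
        # Stand-in for the real per-batch work; save partial results here.
        total += sum(amount for _, amount in rows)
        # Checkpoint: persisting last_id lets you resume after a crash.
        last_id = rows[-1][0]
    return total


print(process_in_batches(conn, BATCH_SIZE))  # sums 1..200 -> 20100.0
```

The key design point is avoiding `LIMIT ... OFFSET n`: OFFSET still forces the engine to walk past all skipped rows, so it gets slower with every batch, whereas `WHERE id > last_id` stays constant-cost as long as the key is indexed.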
Ideas?