Hey everyone,
I have been using T-SQL and I am running into difficulties managing sizable data sets. I am trying to optimize some queries and improve performance, but even with indexing and query tuning, things seem to slow down when I am processing large volumes of data.
I would like some advice from anyone who has worked with large-scale databases in T-SQL: best practices for managing memory, optimizing joins, or any strategies for breaking queries down into more manageable chunks. I have also seen mentions of using temp tables and table variables in certain scenarios. Does anyone have insight into when to use one over the other?
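For context, here is a rough sketch of the two patterns I am comparing; dbo.Orders and the column names are just placeholders I made up:

-- Temp table: gets column statistics and can be indexed after creation;
-- generally the safer choice for larger intermediate result sets.
CREATE TABLE #RecentOrders
(
    OrderID   INT PRIMARY KEY,
    OrderDate DATE
);

INSERT INTO #RecentOrders (OrderID, OrderDate)
SELECT OrderID, OrderDate
FROM dbo.Orders   -- hypothetical source table
WHERE OrderDate >= DATEADD(MONTH, -1, GETDATE());

-- Table variable: no column statistics, so the optimizer tends to
-- assume very few rows; usually fine for small row sets only.
DECLARE @RecentOrders TABLE
(
    OrderID   INT PRIMARY KEY,
    OrderDate DATE
);

INSERT INTO @RecentOrders (OrderID, OrderDate)
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE OrderDate >= DATEADD(MONTH, -1, GETDATE());

My rough understanding is that the lack of statistics on table variables is what makes the optimizer misjudge them on bigger row counts, but please correct me if I have that wrong.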
Any suggestions, tips, or even examples would be greatly appreciated.
Thanks in advance
Have you analyzed the execution plans of your most expensive queries?
Most of the time, the issue is not the volume of data but the way it is being retrieved.
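If it helps, here is one common way to pull the most expensive queries out of the plan cache; ranking by total CPU is just one choice, you could order by logical reads instead:

-- Top 10 queries by total CPU time, with their cached plans.
SELECT TOP (10)
    qs.total_worker_time / 1000 AS total_cpu_ms,
    qs.execution_count,
    qs.total_logical_reads,
    SUBSTRING(st.text,
              (qs.statement_start_offset / 2) + 1,
              ((CASE qs.statement_end_offset
                    WHEN -1 THEN DATALENGTH(st.text)
                    ELSE qs.statement_end_offset
                END - qs.statement_start_offset) / 2) + 1) AS query_text,
    qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
ORDER BY qs.total_worker_time DESC;

Open the query_plan XML in SSMS and look for the usual suspects: scans where you expected seeks, big row-estimate mismatches, and spills.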
My recommendation would be to look at Brent Ozar's classes.
And, no... I'm not associated with Brent, nor do I get any kickback for bringing his courses up.
@ahmeds08 I have not analyzed the execution plans in detail yet, but that is definitely something I will start doing. I have heard that inefficient joins or missing indexes can cause major slowdowns, even with relatively small datasets.
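For anyone else reading, this is the kind of query I plan to start with to check the missing-index DMVs; the weighting in the ORDER BY is just a rough heuristic I have seen used, not an official metric:

-- Missing-index suggestions collected since the last SQL Server restart.
-- Treat these as candidates to investigate, not indexes to create blindly.
SELECT
    mid.statement AS table_name,
    mid.equality_columns,
    mid.inequality_columns,
    mid.included_columns,
    migs.user_seeks,
    migs.avg_total_user_cost,
    migs.avg_user_impact   -- estimated % improvement
FROM sys.dm_db_missing_index_details AS mid
JOIN sys.dm_db_missing_index_groups AS mig
    ON mig.index_handle = mid.index_handle
JOIN sys.dm_db_missing_index_group_stats AS migs
    ON migs.group_handle = mig.index_group_handle
ORDER BY migs.avg_total_user_cost * migs.avg_user_impact * migs.user_seeks DESC;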
Thank you for the suggestion.
Hey @JeffModen, thank you for the recommendation. I have heard great things about Brent Ozar's training and classes when it comes to performance tuning. I will definitely check them out.