Performance Tuning

As a DBA, what should we do when a request comes in about poor performance
of a query?

First, ask questions:

  1. What is the current runtime/IO of the query? (Get a baseline; the sketch after this list is one way to capture it.)
  2. Has the runtime/IO changed recently?
  3. What is the desired (target) performance in milliseconds? (Note: "as fast as possible" is not measurable.)
  4. Does the performance hit the target at some times (of the day/week/month/year) and only degrade at specific times?
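
For the baseline in question 1, one option (assuming the query is still in the plan cache) is to pull its history from sys.dm_exec_query_stats. This is just a sketch; the LIKE filter is a placeholder for a fragment of the actual query text:

```sql
-- A minimal sketch: average elapsed time and logical reads for a query,
-- taken from the plan cache. Times are in microseconds.
SELECT TOP (10)
       qs.execution_count,
       qs.total_elapsed_time / qs.execution_count  AS avg_elapsed_time_us,
       qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
       qs.last_execution_time,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%YourQueryFragmentHere%'   -- placeholder: part of the query text
ORDER BY qs.total_elapsed_time DESC;
```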

Some things to do:

  1. Look at the execution plan. In particular, look for table/index scans and poor row estimates
  2. Update statistics on the tables involved
  3. Consider whether a new index or two might change the scans to seeks
  4. Pick two or three options, try them all, and compare to the current runtime/IO (see the sketch after this list for one way to measure)
  5. Pick the best result
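
For steps 2 and 4, here is a minimal sketch of how you might measure each option. The table dbo.Orders, the query, and the candidate index are hypothetical stand-ins; the point is that SET STATISTICS IO/TIME gives you comparable numbers for each attempt:

```sql
-- A minimal sketch for comparing options. dbo.Orders and the SELECT below
-- are hypothetical stand-ins for your own table and problem query.

-- Step 2: refresh statistics on the tables involved.
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Step 3 (one candidate option): a covering index that might turn a scan into a seek.
-- CREATE INDEX IX_Orders_CustomerID
--     ON dbo.Orders (CustomerID) INCLUDE (OrderID, OrderDate);

-- Step 4: capture runtime and IO for each option you try.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

SELECT OrderID, OrderDate, CustomerID
FROM dbo.Orders
WHERE CustomerID = 42;          -- the problem query goes here

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
-- Logical reads and CPU/elapsed time appear on the Messages tab;
-- record them for the current plan and for each candidate change.
```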

Note that it may be that the query is well written, the indexing is appropriate and the statistics are up to date. Then you may have to start thinking about things like:

  1. IO contention. What else is on the filegroups hosting the tables involved? Is something else hitting those disks particularly hard? (The file-stats sketch after this list can show which files are waiting on IO.)
  2. Are tempdb or the log files on the same disk? (If so, move them off.)
  3. Memory. Is the server memory starved? Perfmon can help you there
  4. Do you need faster disks? Additional NICs?
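
If IO contention looks likely, a quick sketch like this against sys.dm_io_virtual_file_stats shows per-file read/write latency, and also makes it obvious when tempdb or log files share a busy drive (nothing here assumes your particular file layout):

```sql
-- A minimal sketch: per-file IO latency across the instance. High average
-- stall (ms per read/write) on the drive holding your data files, or on
-- tempdb/log files sharing that drive, points at IO contention.
SELECT DB_NAME(vfs.database_id)  AS database_name,
       mf.type_desc,             -- ROWS or LOG
       mf.physical_name,
       vfs.num_of_reads,
       vfs.num_of_writes,
       CASE WHEN vfs.num_of_reads  = 0 THEN 0
            ELSE vfs.io_stall_read_ms  / vfs.num_of_reads  END AS avg_read_stall_ms,
       CASE WHEN vfs.num_of_writes = 0 THEN 0
            ELSE vfs.io_stall_write_ms / vfs.num_of_writes END AS avg_write_stall_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
  ON mf.database_id = vfs.database_id
 AND mf.file_id     = vfs.file_id
ORDER BY avg_read_stall_ms DESC;
```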

In addition to what gbritton said, I would add wait stats. What is the query waiting on most of the time? What is the instance waiting on most of the time? I would also look at the STATISTICS IO and TIME output. Code refactoring can help too.
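
For the instance-level waits, something along these lines against sys.dm_os_wait_stats gives the top waits since the counters were last cleared; the excluded wait types are just a short illustrative list of benign system waits, not a complete one:

```sql
-- A minimal sketch: top instance-level waits since the last restart (or since
-- the wait stats were last cleared). The NOT IN list below is illustrative,
-- not an exhaustive filter of benign/system waits.
SELECT TOP (10)
       ws.wait_type,
       ws.waiting_tasks_count,
       ws.wait_time_ms,
       ws.wait_time_ms - ws.signal_wait_time_ms AS resource_wait_ms,
       ws.signal_wait_time_ms
FROM sys.dm_os_wait_stats AS ws
WHERE ws.wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP',
                           N'XE_TIMER_EVENT', N'REQUEST_FOR_DEADLOCK_SEARCH',
                           N'LOGMGR_QUEUE', N'CHECKPOINT_QUEUE',
                           N'BROKER_TASK_STOP', N'BROKER_TO_FLUSH',
                           N'WAITFOR')
ORDER BY ws.wait_time_ms DESC;
```

For the per-query side, sys.dm_exec_session_wait_stats (SQL Server 2016 and later) shows what a specific session has been waiting on while you run the problem query.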