How to optimize a table which has a HUGE number of records

Currently we have 74 crore records in the table. When we reference that table in our stored procedure, it takes a long time to run. Kindly suggest your thoughts.

Row count by itself is not a determining factor. 74 crore = 740 million rows can be stored very economically depending on the storage method. And even in a row store format, proper indexing can improve query performance.
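As a quick illustration of that last point (the table and column names here are hypothetical, since we don't know your actual schema yet), a query that filters 740 million rows on a narrow date range can often be answered with an index seek instead of a full scan:

```sql
-- Hypothetical large table: without a supporting index, this query
-- must scan all 740M rows to find one month of data.
SELECT CustomerID, Amount
FROM   dbo.Transactions
WHERE  TransactionDate >= '2024-01-01'
  AND  TransactionDate <  '2024-02-01';

-- A covering nonclustered index lets SQL Server seek directly to the
-- requested date range and read only the columns the query needs:
CREATE NONCLUSTERED INDEX IX_Transactions_TransactionDate
    ON dbo.Transactions (TransactionDate)
    INCLUDE (CustomerID, Amount);
```

Which index (if any) actually helps depends entirely on the predicates in your queries, which is exactly why we need the details requested below.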

For us to help you we will need to know:

  1. The structure of the table, posted as a CREATE TABLE statement with all columns and their respective data types. This should also include all the indexes on that table.

  2. The code of the stored procedure, or at the very least, the statement(s) with the worst performance that you want to tune. This needs to include any stored procedure parameters used by those statements.

  3. Some sample data, and the expected results from that sample. Obviously we don't want 740M rows of data, but 10-20 rows that give us some impression of what it looks like.

  4. If you can provide the query plan, that would be most beneficial. You can use the following site to post the XML of the query plan: Paste The Plan - Brent Ozar Unlimited®
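To help gather items 1, 3 and 4, the following T-SQL sketch may be useful (all object names and parameters here are placeholders). For item 1, you can also right-click the table in SSMS and choose Script Table as > CREATE To.

```sql
-- Item 4: capture the actual execution plan as XML.
-- Run this in the same session, wrapping the slow call:
SET STATISTICS XML ON;
EXEC dbo.YourProcedure @SomeParam = 42;   -- placeholder procedure and parameter
SET STATISTICS XML OFF;
-- The plan appears as a clickable XML link in the results pane;
-- save it and paste the XML at the site linked above.

-- Item 3: sample data is easiest to share as INSERT statements, e.g.:
-- INSERT INTO dbo.YourTable (SomeDate, SomeID, SomeAmount)
-- VALUES ('2024-01-15', 101, 250.00),
--        ('2024-01-16', 102, 99.50);
```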

Please also read the following and try to post the requested information accordingly: