SQLTeam.com | Weblogs | Forums

Problem with stored procedure



I am facing a problem where, if I execute the stored procedure like this:
exec [dbo].[PU_ShippingImportGoods.SP_Report] @CompanyId=1,@DateFrom='2017-06-01 00:00:00',@DateTo='2017-06-30 00:00:00',@StartDate='2009-01-01 00:00:00',@WarehouseInquery=NULL (if NULL, retrieve all warehouses)

it takes more than 20 minutes to run.

Whereas if I take the inner code of the stored procedure and run it by declaring the variables myself, it takes about 20 seconds.

Is there any way I could troubleshoot what is wrong, or is there nothing wrong at all?


Couple of thoughts:

Recompile the Sproc - in case it is using a stale query plan. It might be that the query plan, created on the first execution of the Sproc, was for an unrealistic parameter set - but you are then stuck with that query plan for subsequent executions.
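For example, using the procedure name from the original post, a recompile could be forced like this (just a sketch):

```sql
-- Mark the procedure so a new plan is built on its next execution
EXEC sp_recompile N'dbo.[PU_ShippingImportGoods.SP_Report]';

-- Or force a fresh plan for a single run only
EXEC [dbo].[PU_ShippingImportGoods.SP_Report]
    @CompanyId = 1,
    @DateFrom = '2017-06-01 00:00:00', @DateTo = '2017-06-30 00:00:00',
    @StartDate = '2009-01-01 00:00:00', @WarehouseInquery = NULL
    WITH RECOMPILE;
```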

Maybe update statistics - I like to do that before any performance test so that it isn't a factor. That said, you then run the risk that the Sproc runs fine solely because it creates a new query plan, and you have not actually solved the problem.

The other possibility is the effect known as "parameter sniffing".


I am trying my very best to understand but am still lost, as I have barely a year's experience with SQL Server.

I tried sp_recompile 'storedprocname', to no avail.

I hope you can enlighten me about statistics. As I am using Express, I tried using Analyze Query and it says it isn't supported?

And what can be done if it is parameter sniffing?


Well, I hope this is the right kind of statistics, though I have some trouble understanding it =X

EXEC sp_updatestats

will update all statistics (all indexes, and any non-index statistics) in the current database. It may take a while!

Beware that it will use default sampling, but assuming you don't already have some active management of statistics, that's probably "good enough". It will tell you which statistics were updated, so if the answer is "none of them" that was clearly not the problem! If "lots of them", you may notice performance improvements in other places too - and it would be a good idea to schedule Update Statistics periodically. (Poor Man's Version: all stats, each weekend. Posh Version: update all statistics overnight in a maintenance window, targeting the ones that are most out of shape first; if the maintenance window closes before completion, the most important will have been done first. A similar thing applies to Index Rebuild maintenance.)
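If default sampling isn't accurate enough, individual tables can be targeted instead (the table name here is hypothetical):

```sql
-- Update all statistics in the current database with default sampling
EXEC sp_updatestats;

-- Or target one table, reading every row for maximum accuracy
UPDATE STATISTICS dbo.MyTable WITH FULLSCAN;
```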

If updating statistics and then recompiling doesn't fix anything, I think parameter sniffing is the only thing left.

There are various tricks, such as copying the Sproc @Parameters to local working @variables, but it is probably best to Google that a bit to see which work-arounds you think might work for you.
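A sketch of the two usual work-arounds, with hypothetical procedure and table names - either copy the parameters to local variables (so the optimiser uses average-density estimates instead of the sniffed values), or recompile the statement each run:

```sql
CREATE PROCEDURE dbo.MyReport
    @DateFrom datetime,
    @DateTo   datetime
AS
BEGIN
    -- 1) Copy parameters to local variables: the optimiser can no longer
    --    "sniff" the values, so it builds a generic plan
    DECLARE @FromLocal datetime = @DateFrom,
            @ToLocal   datetime = @DateTo;

    SELECT COUNT(*)
    FROM dbo.Orders
    WHERE OrderDate BETWEEN @FromLocal AND @ToLocal;

    -- 2) Or recompile the statement on every execution, so the plan
    --    always matches the actual parameter values passed in
    SELECT COUNT(*)
    FROM dbo.Orders
    WHERE OrderDate BETWEEN @DateFrom AND @DateTo
    OPTION (RECOMPILE);
END
```

Option 1 gives one "average" plan for everyone; option 2 gives the best plan each time at the cost of a compile on every execution.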


Thank you!!

sp_updatestats worked really well!!

Sorry for the late reply, as it was a holiday here for the past 4 days :smile:


Worth checking whether you have a Job that updates statistics - that seems unlikely, given that your manual update made a difference to performance.

If not, then a housekeeping maintenance job needs to be set up to update statistics regularly.


Ahhh, so this needs to be executed regularly, like once a week or once a month?

Actually, I read that sp_updatestats updates the indexes, but how is it able to improve things to such an extent that my stored procedure now completes in 20 seconds rather than 20+ minutes?

I'm really at a loss as to how it works.


When you say "SELECT * FROM MyTable WHERE FooBar > 20", SQL estimates how many rows will be in the answer and which index will give the most selectivity. It makes that estimate based on the statistics (which say, for INDEX1, "there are 1,000 rows from 1-10, 1,200 from 10-20", and so on).

If you add/delete lots of rows, or update an index column on lots of rows, then the statistics are out of date.
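You can look at the statistics SQL Server holds for an index yourself (table and index names here are hypothetical):

```sql
-- Shows the header (including last-updated date and rows sampled),
-- the density vector, and the histogram for one statistics object
DBCC SHOW_STATISTICS ('dbo.MyTable', 'IX_MyTable_FooBar');
```

The last-updated date in the header is a quick way to see whether the statistics the optimiser is using are stale.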

Well ... it depends. I take the view that waiting until Saturday night means that on Friday all our expensive fee earners, company-wide, are having to add a few more ms to everything they query, compared to Monday.

So I want to rebuild indexes and statistics every night. But that adds load to the server, and log data to my LDF files (and TLog backups ...) ...

... so I use a reasonably sophisticated approach of only updating indexes/statistics where necessary, but running that every night. Some heavily used indexes get rebuilt every night, some less often or not at all; but if "tomorrow" something dramatic happens to the data, I know that table will get rebuilt the following night.

If your database is not too big you could just "throw" a bog-standard maintenance plan at it. But beware massive log files ... assuming that you have the database in FULL Recovery Model (if not, and you are OLTP, why not? :frowning: )

I think Minion Backup does index rebuilds and statistics updates. It's a pretty lightweight install and out-of-the-box has sensible defaults (which you can then tinker with if you need to). But you may already have something robust in place for your backups; in which case, what does that have for Index Rebuild and Update Statistics?


Based on this, are the statistics like a histogram? Assuming I have entered 10 values: 1, 9999, 54, 500, 100, 700, 3000, 7000, 5000, 1000,
then the statistics will have something like 1~2000 (6 counts), 2001~4000 (1 count), 4001~6000 (1 count), 6001~8000 (1 count), 8001~10000 (1 count)?


Histogram = yes. I can't remember exactly how it works, so if you are curious it's best to Google it - doing it from my memory will lead you astray!

I doubt that it is exactly how you have shown your counts; more likely it is "the first 1,000 rows get you to ABC, the second 1,000 get you to DEF, ...". But of course it is laid out for how the stats WERE at the last rebuild, which may be different from how the index is now :slight_smile: although the index, until rebuilt, may well still have the same structure, possibly with completely empty pages and lots of new split pages (an over-simplification, but I am sure you get the picture).

Anyway, someone on Google will have done a nice fancy infographic which will explain it far better than my old, broken(!) memory.


Thank you, I think I got the gist of it!