I have a third-party script that runs sp_updatestats inside a loop during a large data import and the subsequent (somewhat massive!) updates of that data according to workflow rules etc.
I don't have a problem with that as such, but a) it's taking some time, b) it's outputting huge amounts of

    [_WA_Sys_MyColumn_05A3D694], update is not necessary...

noise, which pollutes the output and makes spotting any actual errors tricky, and c) sp_updatestats runs even if the number of rows changed in that loop iteration is tiny, or even zero.
The SQL Server docs say: "For disk-based tables, sp_updatestats updates only the statistics that require updating based on the modification_counter information in the sys.dm_db_stats_properties catalog view, thus avoiding unnecessary updates of statistics on unchanged rows."
Does that mean that using sp_updatestats this way does nothing that the routine automatic update of statistics wouldn't also do? Or does sp_updatestats perhaps use a lower threshold on the modification_counter?
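To see what sp_updatestats would consider worth touching, I can inspect the modification counters directly. A diagnostic query along these lines (sys.dm_db_stats_properties is a DMF available from SQL Server 2008 R2 SP2 / 2012 SP1 onward; column names are from the docs):

```sql
-- List statistics that have pending modifications since their last update,
-- i.e. roughly what sp_updatestats would treat as "update necessary".
SELECT
    OBJECT_NAME(s.object_id)  AS table_name,
    s.name                    AS stats_name,
    sp.last_updated,
    sp.rows,
    sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE sp.modification_counter > 0
ORDER BY sp.modification_counter DESC;
```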
Could I just rely on the system updating statistics automatically when it feels the need? I'm sure the vendor put this in because of a bad experience at some point!
I could try using UPDATE STATISTICS on specific table(s), but in this scenario it's pretty difficult to know which tables might have had massive numbers of rows inserted (and to stay future-proof in that regard), so I can see why the third party took this "update everything" approach.
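One middle ground I've considered is driving UPDATE STATISTICS from the modification counters themselves, so the table list doesn't have to be hard-coded. A sketch (the @threshold value of 1000 is an arbitrary example, not a recommendation):

```sql
-- Build and run UPDATE STATISTICS only for user tables whose statistics
-- have accumulated at least @threshold modifications.
DECLARE @sql nvarchar(max) = N'', @threshold bigint = 1000;

SELECT @sql += N'UPDATE STATISTICS '
             + QUOTENAME(SCHEMA_NAME(o.schema_id)) + N'.' + QUOTENAME(o.name)
             + N';' + CHAR(13)
FROM sys.objects AS o
WHERE o.type = 'U'
  AND EXISTS (SELECT 1
              FROM sys.stats AS s
              CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
              WHERE s.object_id = o.object_id
                AND sp.modification_counter >= @threshold);

EXEC sys.sp_executesql @sql;
```

That would at least skip the loop iterations where nothing meaningful changed, though it still has to scan the stats metadata each pass.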
One thought I had was to create a SQL Agent job that runs sp_updatestats and just have the script launch that job, so it runs in parallel (maybe there is another way to achieve that?) and its output stops polluting the primary job.
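The launch itself should be simple, since msdb.dbo.sp_start_job is asynchronous: it returns as soon as the job is queued, so the import script carries on and the sp_updatestats chatter lands in the job history instead of the script's output. Assuming a hypothetical Agent job named 'UpdateStatsJob' that wraps sp_updatestats:

```sql
-- Fire-and-forget: queue the stats job and return immediately.
-- 'UpdateStatsJob' is a placeholder name for an Agent job that runs sp_updatestats.
EXEC msdb.dbo.sp_start_job @job_name = N'UpdateStatsJob';
```

One wrinkle with this approach: if the loop iterates faster than the job completes, sp_start_job will raise an error because the job is already running, so the script would need to tolerate or check for that.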