
SQL Server question

KLin

Lifer
Say there are thousands of tables in a db, and 5 of those tables each contain a huge number of rows (close to 100 million). The data only changes 4 times per year, when new updates are released. The tables are only used by an application for looking up data.

Is it necessary to update the statistics on each of these tables on a weekly basis?
 
Absolutely not. I know Oracle, not MSSQL, but in Oracle statistics are granular: they're gathered per segment, which (assuming the table is partitioned) lets you gather stats on only the most recent partition.
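As a rough sketch of what that looks like in Oracle (the owner, table, and partition names here are made up for illustration, not from the thread):

```sql
-- Hypothetical example: gather stats on only the newest partition of a
-- range-partitioned table, skipping the older, unchanged partitions.
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(
    ownname     => 'APP_OWNER',
    tabname     => 'SALES',
    partname    => 'P_2024Q2',    -- only this partition is analyzed
    granularity => 'PARTITION');  -- don't re-gather global/table-level stats
END;
/
```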

The database is smart enough to know when statistics are stale (in Oracle, that's when more than 10% of the table's rows have changed since the last gather).

If statistics aren't stale, there is no reason to touch them. Sometimes gathering stats can do more harm than good.
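On the SQL Server side, you can check for yourself how stale the stats on a given table actually are before deciding to update them. A sketch, assuming SQL Server 2008 R2 SP2 / 2012 SP1 or later (where `sys.dm_db_stats_properties` is available); the table name is a placeholder:

```sql
-- For each statistic on the table, show when it was last updated and how
-- many rows have been modified since then. A near-zero modification_counter
-- on a read-only lookup table means the weekly update is doing nothing.
SELECT  s.name                  AS stats_name,
        sp.last_updated,
        sp.rows,
        sp.modification_counter  -- rows changed since the last stats update
FROM    sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE   s.object_id = OBJECT_ID('dbo.BigLookupTable');
```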
 
Thanks. There is a SQL job that updates stats and does some index maintenance on a weekly basis. It takes between 9 and 10 hours to run; half the time is spent updating statistics on these few select tables.
 
Yep, no reason to update them. On top of that, if you are on SQL Server 2008 Enterprise Edition or later, you should seriously consider page-level table compression (it was introduced in 2008 and is Enterprise-only until 2016 SP1). That data is a prime candidate for it, and you should see some decent read-performance gains.
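A minimal sketch of how you'd evaluate and then enable it; `dbo.BigLookupTable` is a placeholder name:

```sql
-- First estimate what page compression would save, without changing anything.
EXEC sp_estimate_data_compression_savings
     @schema_name      = 'dbo',
     @object_name      = 'BigLookupTable',
     @index_id         = NULL,    -- all indexes
     @partition_number = NULL,    -- all partitions
     @data_compression = 'PAGE';

-- If the estimate looks good, rebuild the table with page compression.
-- This is an offline rebuild unless you add ONLINE = ON (Enterprise).
ALTER TABLE dbo.BigLookupTable
    REBUILD WITH (DATA_COMPRESSION = PAGE);
```

Since the data only changes quarterly, the one-time CPU cost of compressing is paid rarely, while every read benefits from fewer pages.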
 
I tested turning on page level compression on the largest of the tables on our report server. Shrunk the index size by 5 gigs.
 
Tested the maintenance job on our reporting server, and it took just 3 hours and 45 minutes after excluding these tables from the statistics update. w00t!
 
Honestly, SQL Server manages stats automatically. It's not perfect, but unless you are seeing performance degradation between maintenance windows, I would cut back. Sometimes DBAs feel that a formulaic approach is how you handle all installations, when that's the farthest thing from the truth. For your scenario, I would consider running backups, doing the quarterly update, then the maintenance plan to check the db and rebuild stats and indexes, followed by another backup.
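The quarterly sequence described above might be sketched like this (database, table, and path names are placeholders; adjust to your environment):

```sql
-- 1. Backup before the quarterly load.
BACKUP DATABASE VendorDb TO DISK = N'X:\Backups\VendorDb_pre.bak';

-- 2. (The vendor's quarterly data update runs here.)

-- 3. Verify integrity after the load.
DBCC CHECKDB (VendorDb) WITH NO_INFOMSGS;

-- 4. Rebuild indexes on the big tables. Note: rebuilding an index also
--    rebuilds its statistics with a full scan, so a separate
--    UPDATE STATISTICS pass on those same indexes is redundant.
ALTER INDEX ALL ON dbo.BigLookupTable REBUILD;

-- 5. Backup the post-maintenance state.
BACKUP DATABASE VendorDb TO DISK = N'X:\Backups\VendorDb_post.bak';
```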
 
Thanks for the input. This is a vendor db, by the way. I've noticed a lot of the tables are heaps (no clustered index), and the heap fragmentation on many of them is in the high 90s, percentage-wise.

Bad, yes?
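For anyone wanting to check this themselves, a sketch of measuring heap fragmentation across the current database (heaps are `index_id = 0` in `sys.dm_db_index_physical_stats`):

```sql
-- DETAILED mode reads every page; it also reports forwarded records,
-- which are what really hurt read performance on heaps.
SELECT  OBJECT_NAME(ps.object_id)       AS table_name,
        ps.avg_fragmentation_in_percent,
        ps.forwarded_record_count
FROM    sys.dm_db_index_physical_stats(DB_ID(), NULL, 0, NULL, 'DETAILED') AS ps
WHERE   ps.index_id = 0                 -- heaps only
ORDER BY ps.avg_fragmentation_in_percent DESC;
```

On SQL Server 2008 and up, `ALTER TABLE ... REBUILD` will compact a heap in place without adding a clustered index, though whether that's wise on a vendor db is worth checking with the vendor first.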
 
I've been in several situations with large mission-critical databases where we would BCP or back up the large tables out to another disk or tape, drop the table and indexes, redefine them, and restore or BCP the data back in. Sometimes much faster than updating stats.
 