SQL Server question

KLin (Lifer)
Say there are thousands of tables in a db, and 5 of those tables each contain a huge number of rows (close to 100 million). The data only changes 4 times per year, when new updates are released. The tables are only used by an application for looking up data.

Is it necessary to update the statistics on each of these tables on a weekly basis?
 

ForumMaster (Diamond Member)
Absolutely not. I know Oracle, not MSSQL, but in Oracle statistics are granular: they're per segment, which (assuming the table is partitioned) gives you the advantage of gathering stats on only the most recent partition.

The database is smart enough to know when statistics are stale (in Oracle this is when more than 10% of the table has changed since the last gather).

If statistics aren't stale, there is no reason to touch them. Sometimes gathering stats can do more harm than good.
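The SQL Server equivalent of that staleness check is a DMV query. A minimal sketch, assuming SQL Server 2008 R2 SP2 or later for sys.dm_db_stats_properties, with a hypothetical table name:

-- Hypothetical table name; shows how many rows have changed per statistic,
-- so you can see whether a rebuild is even warranted.
SELECT s.name AS stat_name,
       sp.last_updated,
       sp.rows,
       sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID('dbo.BigLookupTable');

If modification_counter is near zero since the last quarterly load, there's nothing for UPDATE STATISTICS to improve.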
 

KLin (Lifer)
Thanks. There is a SQL job that updates stats and does some index maintenance on a weekly basis. It takes between 9 and 10 hours to run. Half the time is spent updating statistics on these few select tables.
 

Cogman (Lifer)
Yep, no reason to update them. On top of that, if you are using SQL Server 2008 Enterprise Edition or later, you should seriously consider using page-level table compression. That data is a prime candidate for it, and you should see some decent read performance gains.
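A minimal sketch of what enabling that looks like, with hypothetical schema/table names; sp_estimate_data_compression_savings lets you preview the gain before committing:

-- Estimate the savings first (hypothetical names)
EXEC sp_estimate_data_compression_savings
     @schema_name = 'dbo',
     @object_name = 'BigLookupTable',
     @index_id = NULL,          -- NULL = all indexes
     @partition_number = NULL,  -- NULL = all partitions
     @data_compression = 'PAGE';

-- Rebuild the table (and its clustered index) with page compression
ALTER TABLE dbo.BigLookupTable REBUILD WITH (DATA_COMPRESSION = PAGE);

-- Nonclustered indexes are compressed separately
ALTER INDEX ALL ON dbo.BigLookupTable REBUILD WITH (DATA_COMPRESSION = PAGE);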
 

KLin (Lifer)
I tested turning on page-level compression on the largest of the tables on our report server. It shrunk the index size by 5 gigs.
 

KLin (Lifer)
Tested the maintenance job on our reporting server, and the job only took 3 hours and 45 minutes after excluding these tables from the statistics update. w00t!
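For anyone repeating this: since the data only changes quarterly, a single manual refresh right after each load covers these tables. A sketch, with a hypothetical table name:

-- Run once per quarter, right after the vendor data load; FULLSCAN is
-- affordable when the stats then stay valid for three months.
UPDATE STATISTICS dbo.BigLookupTable WITH FULLSCAN;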
 

KentState (Diamond Member)
Honestly, SQL Server manages stats automatically. It's not perfect, but unless you are seeing performance degradation between maintenance windows, I would cut back. Sometimes DBAs feel that a formulaic approach is how you handle all installations, when that's the farthest thing from the truth. For your scenario, I would consider running backups, doing the quarterly update, then running the maintenance plan to check the db, rebuild stats, and rebuild indexes, followed by another backup.
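Sketched out in T-SQL, that quarterly sequence might look like this (database, table, and paths are all hypothetical):

BACKUP DATABASE VendorDb TO DISK = 'X:\backups\VendorDb_pre.bak';

-- ... apply the vendor's quarterly data update here ...

DBCC CHECKDB (VendorDb) WITH NO_INFOMSGS;
ALTER INDEX ALL ON dbo.BigLookupTable REBUILD;  -- rebuild also refreshes index stats
UPDATE STATISTICS dbo.BigLookupTable WITH FULLSCAN;
BACKUP DATABASE VendorDb TO DISK = 'X:\backups\VendorDb_post.bak';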
 

KLin (Lifer)
Thanks for the input. This is a vendor db, by the way. I've noticed a lot of their tables are heaps (no clustered index), and the heap fragmentation on a lot of those tables is in the high 90s percentage-wise.

Bad, yes?
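You can quantify it with sys.dm_db_index_physical_stats; index_id 0 is the heap. A sketch against the current database:

-- SAMPLED mode keeps this affordable on 100M-row tables; forwarded
-- records are the main heap-specific problem to look for.
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       ips.avg_fragmentation_in_percent,
       ips.forwarded_record_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'SAMPLED') AS ips
WHERE ips.index_id = 0
ORDER BY ips.forwarded_record_count DESC;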
 

QuietDad (Senior member)
Have been in several situations with large mission-critical databases where we used to BCP or back up the large tables out to another disk or tape, drop the table and indexes, recreate them, and restore or BCP the data back in. Sometimes much faster than updating stats.
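For the record, that BCP round trip looks roughly like this (server, database, and paths are hypothetical; -n is native format, -T a trusted connection):

rem Export in native format to another disk (hypothetical names)
bcp VendorDb.dbo.BigLookupTable out D:\dump\BigLookupTable.dat -n -T -S MYSERVER

rem ... drop and recreate the table and its indexes ...

rem Bulk load the data back in
bcp VendorDb.dbo.BigLookupTable in D:\dump\BigLookupTable.dat -n -T -S MYSERVER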