Some questions regarding overclocking an 8800GTS 512

ZebuluniteV

Member
Aug 23, 2007
165
0
0
Hi, I recently jumped on the Zotac 8800GTS 512 deal at fxvideocards. I've been out of the loop for a while on overclocking the latest cards (my previous card, a 7900GS, is obviously significantly different from the unified architecture of the GeForce 8000 series and beyond).

Anyway, I briefly used the Nvidia Control Panel to overclock the core, memory, and shader on my card, while running the ATITool stress test to check for any errors.

While testing each individually (leaving the other two at stock settings), I achieved a maximum core setting of 812MHz, memory of 1105MHz, and shader of 1963MHz, with "maximum" being defined as the highest setting where I did not run across any graphical corruption (yellow dots, etc.) in the ATITool window. Granted, I didn't leave the stress test running very long at these "max" levels, so they're probably closer to the highest settings that did not result in any errors in the first minute or so.

After reaching those "max" levels, I then overclocked all three components at the same time, with the following results.

Error @ 803/1105/1963
Error @ 803/1087/1963
Error @ 794/1062/1941
Error @ 785/1044/1918
Fine @ 776/1027/1896 (apparently; I didn't see any yellow dots pop up in ATITool after running for several minutes)

So it appears that last setting above is my highest stable (for several minutes, anyway) overclock for the card. Still, I imagine it would not be a very good idea to regularly run the card just below an overclock that produced graphical corruption. Which brings me to several questions:

First off, I was wondering if there is any approximate ratio I should stick to between core and shader speeds. I know a while back Nvidia's software (and probably most other overclocking tools at the time) would not let you individually adjust core and shader speeds, but instead adjusted both to maintain some specific ratio. I thought I had read at some point that overclocking one proportionally much further than the other resulted in little performance gain, but that may just be my imagination.

Beyond that, how do users here limit their max overclocks for general use while gaming? Any basic rule of thumb to follow?


Anyway, hope that's all legible after a long day at work. Thanks for your help.

Edit: In case it matters for some reason, I'm using the 177.66 drivers on Vista 64-bit.
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
There is no core/shader ratio. You overclock them individually and see how far you can get them stable. I don't limit my maximum overclock; I just find it and test it hard: an hour of the ATITool stress test, and if it passes that, I throw hours and hours of intensive games at the card. I used to redo the testing whenever the ambient temperature went up significantly, but not anymore since I have AC.
 

Jax Omen

Golden Member
Mar 14, 2008
1,654
2
81
I can definitely attest to the ambient temperature issue. I have to run stock until the summer passes V_V
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Dump the 177.66 drivers and go to 177.41, which are less buggy. Before you go any further, set up a good auto fan control profile. Then clock the core > shader > memory, one at a time.

Use RivaTuner's hardware monitor to see what the clocks really are.

1963 shader = 1944MHz
1941 shader = 1944MHz
1918 shader = 1944MHz
1896 shader = 1890MHz

Looks like 1890MHz shaders work fine. Now keep going until you find your max core and memory. Hope this helps.

Edit

While you're at it, you should create a preset launcher in RivaTuner. I currently have 2:

Max Power, which overclocks and changes the auto fan controls (70C max)
Power Saving, which underclocks and puts the fan controls back to default (37% fan speed)
 

ZebuluniteV

Member
Aug 23, 2007
165
0
0
Originally posted by: error8
There is no core/shader ratio. You overclock them individually and see how far you can get them stable. I don't limit my maximum overclock; I just find it and test it hard: an hour of the ATITool stress test, and if it passes that, I throw hours and hours of intensive games at the card. I used to redo the testing whenever the ambient temperature went up significantly, but not anymore since I have AC.

Hmm, well I guess I just imagined the core/shader ratio then.


In any event, I played with the OC a bit more after finding that nTune lets me type in specific clock speeds (versus dragging the slider to arbitrary values). I got the following results:

Error @ 750/1100/1850
Fine @ 750/1075/1850
Fine @ 750/1075/1900
Error @ 750/1075/1925
Fine @ 800/1075/1900
Error @ 825/1075/1900


So the max "stable" OC seems to be 800/1075/1900.

Throughout the testing, I left the stock dual-slot cooler on 67%, and the highest GPU temperature I've seen (in GPU-Z anyway) is 57C - in the air-conditioned basement, that is.
 

ZebuluniteV

Member
Aug 23, 2007
165
0
0
Originally posted by: SSChevy2001
Dump the 177.66 drivers and go to 177.41, which are less buggy. Before you go any further, set up a good auto fan control profile. Then clock the core > shader > memory, one at a time.

Use RivaTuner's hardware monitor to see what the clocks really are.

1963 shader = 1944MHz
1941 shader = 1944MHz
1918 shader = 1944MHz
1896 shader = 1890MHz

Looks like 1890MHz shaders work fine. Now keep going until you find your max core and memory. Hope this helps.

Edit

While you're at it, you should create a preset launcher in RivaTuner. I currently have 2:

Max Power, which overclocks and changes the auto fan controls (70C max)
Power Saving, which underclocks and puts the fan controls back to default (37% fan speed)

Huh, well, that's strange that RivaTuner would report true values while GPU-Z and nTune would not. But that does make sense, since, if I understand right, GPUs (and clocked processors in general) base their clock speed on a multiple of the vibrations of a crystal oscillator. Therefore, while nTune might let me "overclock" at any arbitrary value I choose, in reality it's rounding the OC to the closest achievable multiple of the oscillator frequency. Or something like that...


Well, anyway, RivaTuner reports my "max" OC, shown in GPU-Z as 800/1075/1900, as 799.2/1080/1890. And it reports my 50MHz-lower F@H test setting, 750/1025/1850, as 756/1026/1836.


Going back to the original question then, is it "safe" to run the 8800 at the highest "stable" clock speeds (or highest stable multiple of the oscillator), or should I back it off to the next lower multiple? I don't game a ton, so damaging the GPU by pushing it to the limits in gaming probably isn't a huge concern, but I have been running the new GPU version of Folding@Home frequently (and, when I get back to college, 24/7), and obviously I wouldn't want to risk instability popping up while leaving that running.



Edit: One interesting tidbit I discovered while playing around with nTune and RivaTuner is that while the "real" core and memory clock speeds change in steps of 10 or so, no matter what I set the shader to in nTune, I can't find any middle ground between 1836 and 1890.

Edit 2: From a setting of 1864 in nTune up to 1917 on the shader, the real clock remains at 1890, with 1863 changing it to 1836, and 1918 changing it to 1944.
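Edit 3: Just to illustrate the snapping I'm seeing (this is only my guess at the behavior, not anything from Nvidia's documentation), the numbers above look like the shader clock gets rounded to the nearest 54MHz step (1836, 1890, and 1944 are 54 apart), with exact midpoints like 1863 and 1917 falling to the lower step. A quick Python sketch of that idea:

import math

SHADER_STEP_MHZ = 54  # assumed step size, inferred from the observed 1836/1890/1944 values

def real_shader_clock(requested_mhz):
    # Round the requested clock to the nearest step; exact midpoints go to the
    # lower step, which matches 1863 -> 1836 and 1917 -> 1890 above.
    return int(math.ceil(requested_mhz / SHADER_STEP_MHZ - 0.5)) * SHADER_STEP_MHZ

for req in (1863, 1864, 1890, 1917, 1918):
    print(req, "->", real_shader_clock(req))
# prints: 1863 -> 1836, 1864 -> 1890, 1890 -> 1890, 1917 -> 1890, 1918 -> 1944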
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
It's not going to hurt if your temps are fine, as long as gameplay is fine and doesn't have problems. If you notice a problem, go back and change the settings a little until the problem stops (driver stops responding, artifacts, or whatever).

Set up 3 profiles in RivaTuner:

Folding = default clock speeds and fan speeds
Gaming = max OC with higher fan speeds
Windows = underclock with default fan speeds

Again, just keep the temps low and you're fine.
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: ZebuluniteV
Originally posted by: error8
There is no core/shader ratio. You overclock them individually and see how far you can get them stable. I don't limit my maximum overclock; I just find it and test it hard: an hour of the ATITool stress test, and if it passes that, I throw hours and hours of intensive games at the card. I used to redo the testing whenever the ambient temperature went up significantly, but not anymore since I have AC.

Hmm, well I guess I just imagined the core/shader ratio then.


In any event, I played with the OC a bit more after finding that nTune lets me type in specific clock speeds (versus dragging the slider to arbitrary values). I got the following results:

Error @ 750/1100/1850
Fine @ 750/1075/1850
Fine @ 750/1075/1900
Error @ 750/1075/1925
Fine @ 800/1075/1900
Error @ 825/1075/1900


So the max "stable" OC seems to be 800/1075/1900.

Throughout the testing, I left the stock dual-slot cooler on 67%, and the highest GPU temperature I've seen (in GPU-Z anyway) is 57C - in the air-conditioned basement, that is.

You have a good OC and a great temperature. Enjoy those extra fps ;)
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Those are really good temps, but I doubt they're maxed-out temps. Also, I would never use one fixed fan speed unless it was 100%. Setting up a low-level auto fan control is much better, as it can adjust the fan up to 100% if needed.
 

ZebuluniteV

Member
Aug 23, 2007
165
0
0
Well, now that I'm off from work for today, I've had time to run ATITool more extensively. My previous setting, 799/1080/1890 (the real values), resulted in an error in ATITool after about 4 minutes or so, with the GPU reaching a max temperature of 61C (at 67% fan speed, ~1137RPM). Reducing the core to 783 allowed me to run ATITool for 10 minutes without any errors, at which point I became impatient and had to try out a game. Running HL2:EP2 maxed out in windowed mode with GPU-Z showing the current temperature in the background, I observed no artifacts (or at least no noticeable ones), with a max temp of about 53C. I'm currently letting the GPU run F@H, with the memory decreased slightly to 1053 to help ensure it's stable. At the moment, the GPU temperature is reported as 58C.
 

ZebuluniteV

Member
Aug 23, 2007
165
0
0
Originally posted by: SSChevy2001
Those are really good temps, but I doubt they're maxed-out temps. Also, I would never use one fixed fan speed unless it was 100%. Setting up a low-level auto fan control is much better, as it can adjust the fan up to 100% if needed.

Is there any better way to max out the temps than ATITool? As noted above, gaming (or at least EP2 in windowed mode) didn't reach temps as high as ATITool did, and Folding@Home reaches approximately the same temps as ATITool.

 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: ZebuluniteV
Originally posted by: SSChevy2001
Those are really good temps, but I doubt they're maxed-out temps. Also, I would never use one fixed fan speed unless it was 100%. Setting up a low-level auto fan control is much better, as it can adjust the fan up to 100% if needed.

Is there any better way to max out the temps than ATITool? As noted above, gaming (or at least EP2 in windowed mode) didn't reach temps as high as ATITool did, and Folding@Home reaches approximately the same temps as ATITool.

Yes there is. Use FurMark's stability test with 32X AA. That should really max out your temperature.
 

SSChevy2001

Senior member
Jul 9, 2008
774
0
0
Yeah, try FurMark out if you want to see max temps. I max out at about 69C with the fan at around 93%, but that's because I set my auto fan control to max at 70C. Usually while gaming, though, it's only around 60%-70% fan speed.
 

ZebuluniteV

Member
Aug 23, 2007
165
0
0
Originally posted by: error8
Originally posted by: ZebuluniteV
Originally posted by: SSChevy2001
Those are really good temps, but I doubt they're maxed-out temps. Also, I would never use one fixed fan speed unless it was 100%. Setting up a low-level auto fan control is much better, as it can adjust the fan up to 100% if needed.

Is there any better way to max out the temps than ATITool? As noted above, gaming (or at least EP2 in windowed mode) didn't reach temps as high as ATITool did, and Folding@Home reaches approximately the same temps as ATITool.

Yes there is. Use FurMark's stability test with 32X AA. That should really max out your temperature.

Oh right, I should have thought of FurMark. I just downloaded it and will see what happens.
 

ZebuluniteV

Member
Aug 23, 2007
165
0
0
I left FurMark running windowed for about an hour (so that I could have GPU-Z out for temps on the side) at 1280x1024 w/ 32X AA. I bumped the fan speed up to 80% (~1380RPM) to be safe, and saw a max temperature of 59C. The test ran with the GPU clocked at 783/1053/1890 (core/memory/shader; I forgot to readjust the memory speed back up to 1080, though that's no big deal).

So, it would seem the current overclock is stable. I'll leave it running F@H overnight with the OC, and see how that plays out. Thanks everyone for your help with overclocking the 8800! :)