What causes freezing in GT200 cards?

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Hey guys, I'm overclocking my new GTX260 and I have a few questions for the GT200 owners out there.

It's a 65nm GTX260 zotac. Testing at 783 / 1566 / 1200 core / shader / memory

Is there any ratio besides 1:2 for the Core:Shader clocks?

If I'm overclocking, is the temperature of the shader domain causing freezes?


- I've noticed that when the core reaches ~78C the card locks up. I've tried voltages from 1.06v through 1.23v. Would I have better luck with lower temperatures on the shaders, or just pushing more volts through the core?

I can do 752 / 1512 / 1200 @ 1.06v stable. 783mhz core gives me crashes when the shader temps reach ~78C, even while using a high voltage of 1.23v.

It would be nice to leave the shader domain @ 1400mhz while I take the core on up; I feel like it has a lot of room left.


Thanks for readin'!
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Nice OC, but yeah, 1550+ on the shader is quite aggressive at any voltage, isn't it? I think too-aggressive shader OCs are where most lockups occur.

Is it stable with settings in your sig? They seem about perfect..
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Yeah, it's working at low volts at 756 / 1512. Is there any way to get a different core/shader ratio?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
The freezing while overclocking is a built-in throttling mechanism, probably triggered by heat, but that temperature is *NOT* reported by any of the various software programs like GPU-Z, Precision, RT, etc.

You can clearly see what happens if you download the latest GPU-Z and monitor the Sensors tab while overclocking. You'll see the VDDC current is cut when the GPU throttles due to heat, even when the GPU and PWM sensors are showing good temps in the 70s. Increasing voltage doesn't help necessarily, as heat is the bigger problem with shader clocks:

EVGA Tech replies

If you want to change the core/shader ratio lower than 2x, I believe RT allows you to, although I haven't messed with it in some time. You may get an extra strap or two on the core, but I'm not sure if it's worth it. In this case, increasing voltage with the EVGA tool would help, but it will also certainly increase your core temps.
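The hidden-throttle behavior described here can be made concrete with a small sketch: given a log of sensor samples (shader clock plus VDDC current, as GPU-Z's Sensors tab reports them), flag the moments where clock or current drops sharply even though no temperature reading looks alarming. This is a hypothetical illustration, not any real monitoring API; the sample format and drop threshold are assumptions.

```python
def detect_throttle(samples, drop_ratio=0.8):
    """Flag sample indices where the shader clock or VDDC current
    falls below drop_ratio of the highest value seen so far --
    the signature of the hidden throttle described above.

    samples: list of (sm_clock_mhz, vddc_current_a) tuples.
    """
    events = []
    max_clk = 0.0
    max_cur = 0.0
    for i, (clk, cur) in enumerate(samples):
        if max_clk and (clk < drop_ratio * max_clk or cur < drop_ratio * max_cur):
            events.append(i)
        max_clk = max(max_clk, clk)
        max_cur = max(max_cur, cur)
    return events

# A clean run, then a sudden clock/current cut at sample 3:
log = [(1512, 40.0), (1512, 41.0), (1512, 42.0), (700, 15.0), (1512, 41.0)]
```

With the log above, `detect_throttle(log)` flags index 3 without consulting any temperature at all, matching the observation that reported temps can look fine in the 70s while the card is actually throttling.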
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Ah, thanks chizow. Right now it looks like I'm sticking with 756 / 1512 / 2430 @ ~1.18v

It looks like watercooling is my only hope since temperature is my main limitation, given that I cannot change my core:shader ratio. Stupid shader domain!
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Yeah, those speeds seem on par with good OCs.

So you went green aye? Great prices out there on 260's, hope you got a good deal...
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: jaredpace
Ah, thanks chizow. Right now it looks like I'm sticking with 756 / 1512 / 2430 @ ~1.18v

It looks like watercooling is my only hope since temperature is my main limitation, given that I cannot change my core:shader ratio. Stupid shader domain!
Yeah, lower shader clocks on GT200 have been holding it back since launch, but it looks like the 260s and 55nm parts do much better with shader clocks. I've seen factory-OC'd 285s in the 1600-1700 range, which means 800-850MHz ceilings on the core.

Also, looks like the 2.0x ratio is a driver limitation that can't be overridden:

  • Old Reference Post on Guru3D
    For example, on my 8800GTX the driver simply refuses to set the clocks with a shader/ROP ratio within the 1.0–2.0 range (default ratio is 1350/575 = 2.34), but it accepts the clocks programmed with a ratio within the 2.3–2.5 range.
I remember now: back then we had to use this tweak before RT supported individual clock domains, when we wanted to change the ratio lower than 2.3 but were limited to 2.0 on G80 parts. I don't use RT anymore, but I've read some recent hints that it's still hard-locked to a 2.0x minimum. Wouldn't hurt to try though; it's easy enough: just go to Power Users > Settings and change "ShaderClockRatio" to a value lower than 2.
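The ratio arithmetic is easy to sanity-check in a few lines. Assuming the driver enforces shader/core >= 2.0 and that core clocks land on discrete straps (the 27MHz step here is purely an illustrative assumption, not a confirmed GT200 strap size), the highest legal core clock for a given shader clock works out to:

```python
def max_core_for_shader(shader_mhz, min_ratio=2.0, strap=27):
    """Highest strap-aligned core clock that keeps
    shader/core at or above min_ratio."""
    limit = shader_mhz / min_ratio
    return int(limit // strap) * strap

print(max_core_for_shader(1512))  # 756, the combo reported stable in this thread
print(max_core_for_shader(1566))  # 783, the next step up
```

This also shows why the shader domain caps the core under a fixed 2.0x minimum ratio: the core can only rise if the shader clock rises twice as fast.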
 

Painman

Diamond Member
Feb 27, 2000
3,728
29
86
1512 on the shader domain is pretty good for air, I think. My 260 is on water, and 1512 will destabilize once the core gets up to peak heat levels (47-48C for me). Not the greatest H2O temps, I know, but the loop was conceived more for quiet operation and I'm asking a lot of it right now (cooling a quad-core CPU as well). I can run 729/1458 without any trouble though.

In any case... if you can run 756/1512 stable on air, that's some good silicon you got with your card. 1566 stable on water sounds like a definite possibility.

Damn step-ups... those 285s are really tempting me.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Well Painman, you know I'm voltmodded as well. Stock is 1.06v in 3D; I'm at 1.16v right now.

I'm sure you could go past 800mhz (with more voltage) if you're at 48c load. I've tested 820mhz only to have it artifact/crash.
 

Painman

Diamond Member
Feb 27, 2000
3,728
29
86
Yep, I tried different voltages on my card... more voltage didn't help me get over that shader wall. It seemed to make matters worse, TBH. I could walk the voltage up step by step with no improvement, and finally, at or near the full 1288mV allowed by the eVGA tuner, the GPU would fail.

So, the CW seems to be that temps are key for getting improvements on shader clocks, and I see some truth in that, but getting good silicon to begin with... that's pretty much how it goes with trying to OC anything.

On a brighter note, I don't need any voltage tuning at all to get 729/1458. My max stable RAM clock is the same as yours.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Yeah, I can't get the RAM over 1242 without checkerboard hang-ups. You know to set the voltage while in 3D mode, right? Do it while you're running ATITool or FurMark. If you don't use the EVGA voltage tuner while running one of those two programs, it doesn't set the voltage; when you go fullscreen in your game, it reverts to stock voltage.

I've found 1.138v-1.150v works fine for 756 / 1512 / 2484 - temps max out around 76C & no crashing in Far Cry 2, CoD4, or FEAR 2.


edit: GVT actually sets the voltage, but it sets it for 2D mode. I had to find this out the hard way. One way to check for sure is to alt-tab out of your game and check the reading of the GPU-Z sensors, or RT hardware monitor.
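The check described above (alt-tabbing out and reading the sensors) amounts to comparing sampled under-load VDDC against the target voltage versus stock. A hypothetical helper; the sample format and tolerance are made up for illustration, not taken from any real tool:

```python
def check_3d_voltage(samples, target_v, stock_v, tol=0.015):
    """samples: list of (perf_mode, vddc_volts) tuples, e.g. ('3d', 1.15),
    as you might transcribe from a GPU-Z sensor log.
    Classifies whether the 3D voltage actually took effect."""
    vs = [v for mode, v in samples if mode == '3d']
    if not vs:
        return 'indeterminate'
    avg = sum(vs) / len(vs)
    if abs(avg - target_v) <= tol:
        return '3d-applied'
    if abs(avg - stock_v) <= tol:
        return 'reverted-to-stock'
    return 'indeterminate'
```

A reading that sits at the 1.06v stock value under 3D load means the tuner only set the 2D voltage, which is exactly the failure mode the edit describes.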
 

Painman

Diamond Member
Feb 27, 2000
3,728
29
86
Yup, I was testing with ATiTool and FurMark. I was also watching VDDC and VRM temps with GPU-Z, and voltage changes were registering properly through that.

Congrats on getting a 260 with some get up and go :) Mine isn't the worst in the world, but I had hoped I would squeeze more out of it via the tuner.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Thanks man, I really thought I could do more after seeing GPU-Z screenshots of 800+mhz floating around on the net. I can take an 800mhz SS too, but that doesn't mean crap as far as stability goes.

Oh well, overall I have to say I'm pleased with the setup. I have room to go SLI in the future, if I choose to do so.


 

Painman

Diamond Member
Feb 27, 2000
3,728
29
86
Exactly, screenshots are just that... heck, I can run 756/1512 long enough to do a Vantage run w/o any artifacts, but that means nothing compared to gaming for an hour or two. Fallout 3 will barf immediately at those clocks.

I might go for a 285 closer to the end of my step-up window; I have 65 days left. I want to see if eVGA gets the tuner working for those cards, and also if Swiftech releases a 55nm version of their 'sink before then.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: jaredpace
Thanks man, I really thought I could do more after seeing GPU-Z screenshots of 800+mhz floating around on the net. I can take an 800mhz SS too, but that doesn't mean crap as far as stability goes.

Oh well, overall I have to say I'm pleased with the setup. I have room to go SLI in the future, if I choose to do so.
Yeah, that's quite an overclock. I'm sure it's a much more enjoyable experience than the 8800GS SLI or whatever you were running before, despite the SLI solution being faster "on paper." ;)

Originally posted by: Painman
Exactly, screenshots are just that... heck, I can run 756/1512 long enough to do a Vantage run w/o any artifacts, but that means nothing compared to gaming for an hour or two. Fallout 3 will barf immediately at those clocks.

I might go for a 285 closer to the end of my step-up window; I have 65 days left. I want to see if eVGA gets the tuner working for those cards, and also if Swiftech releases a 55nm version of their 'sink before then.
Is that a 65nm 260? Still a nice overclock, but a GTX 285 would probably see a nice gain from the volt mod software + water. Again, those shader clocks typically clock much better on those GT200b Rev B3, so your core clock should be limited to voltage and temps first before hitting any hard wall. 65 days is a long time, although upgrading now vs. later won't yield any benefit in price typically, as EVGA/BFG don't typically change their step-up MSRPs.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Yeah chizow, I like it better than the 8800GS SLI setup. The way they played was different. The GTX260 has lower maximum frames but is steadier on the low end. It would run CoD4 maxed out with low (40-50fps) minimums, yet very steady, and not too far off from the average or max frames.

The 8800GS SLI setup would have incredible max & average frames, but when the 384MB framebuffer became an issue (in huge open outdoor scenes) it would chop down to much lower minimums than the 896MB GTX260 does. I'd rather sacrifice a small bit of avg & max performance to have higher minimums and overall steadier framerates.

The 260 also does much better in newer titles like FC2 when using AA. The 8800GS would chop down at 1680 x 1050 with AA on. When I can't stand the performance of the 260, or when new games come out that need more CPU/GPU, I'll just throw in an s775 quad (like a Q9650) and a second GTX260.

This should last me until it's time to switch over to an entirely new platform in ~2-3 years.

Here's a screenshot at 1.175v; I was able to get it stable at 1.138-1.150v:
http://img162.imageshack.us/img162/4523/260ocls5.jpg

here's the rig when i had the 8800gs sli in:
http://www.xtremesystems.org/f...090&stc=1&d=1233706459

edit: This 3200rpm fan sounds like a vacuum cleaner gone berserk. I've come to find that below 1.13v, games crash quickly from core instability due to lack of voltage to the GPU. Above 1.19v, I get a longer period of gameplay (30-45 minutes) but still crash from the shader domain overheating, because of the extra heat generated by the relatively high volts. I have to keep Vgpu somewhere in the range of 1.138v ~ 1.175v to be gaming stable without prolonged gaming causing overheating. So far, I cannot get the core/shader stable at anything above 756/1512 (the next step being 783/1566) for longer than a couple of minutes in ATITool. I'm almost positive that with <50C load temps on a watercooled GPU, I could take it a bit higher.
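The trial-and-error here is essentially a sweep for a stable voltage window. A toy sketch of that search; the stability predicate is a stand-in for real game/ATITool testing, and the numbers simply echo the post:

```python
def stable_window(is_stable, lo, hi, step=0.001):
    """Sweep Vgpu from lo to hi in fixed steps and return the
    (min, max) of voltages the predicate reports stable, or None."""
    good = []
    v = lo
    while v <= hi + 1e-9:
        v = round(v, 3)  # keep millivolt steps free of float drift
        if is_stable(v):
            good.append(v)
        v += step
    return (min(good), max(good)) if good else None

# Stand-in predicate: below ~1.13v the core is starved; past ~1.175v
# prolonged gaming eventually overheats the shader domain.
window = stable_window(lambda v: 1.138 <= v <= 1.175, 1.00, 1.30)
```

With the stand-in predicate, `window` comes out as (1.138, 1.175), the gaming-stable range reported in the edit above.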


 

Painman

Diamond Member
Feb 27, 2000
3,728
29
86
Originally posted by: chizow
Is that a 65nm 260? Still a nice overclock, but a GTX 285 would probably see a nice gain from the volt mod software + water. Again, those shader clocks typically clock much better on those GT200b Rev B3, so your core clock should be limited to voltage and temps first before hitting any hard wall. 65 days is a long time, although upgrading now vs. later won't yield any benefit in price typically, as EVGA/BFG don't typically change their step-up MSRPs.

It's a 65nm. I went that way due to the aftermarket cooling; the announcement of the voltage tuner was post-purchase so that was a matter of luck (not that it really helped though).

A 285 step-up would cost me about $97 if I did it right now. I keep GPUs around longer than I used to, so it may prove worthwhile. I'm not 100% happy with my min framerates from the 260, even though it's way better than my previous card in almost every way... if I perceive it to be struggling now, that's not going to change down the road, and it's likely I'll be forever bothered if I don't do it.

See, I've already talked myself into doing it. There's no stopping me now :D
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: jaredpace
Yeah chizow, I like it better than the 8800GS SLI setup. The way they played was different. The GTX260 has lower maximum frames but is steadier on the low end. It would run CoD4 maxed out with low (40-50fps) minimums, yet very steady, and not too far off from the average or max frames.

The 8800GS SLI setup would have incredible max & average frames, but when the 384MB framebuffer became an issue (in huge open outdoor scenes) it would chop down to much lower minimums than the 896MB GTX260 does. I'd rather sacrifice a small bit of avg & max performance to have higher minimums and overall steadier framerates.

The 260 also does much better in newer titles like FC2 when using AA. The 8800GS would chop down at 1680 x 1050 with AA on. When I can't stand the performance of the 260, or when new games come out that need more CPU/GPU, I'll just throw in an s775 quad (like a Q9650) and a second GTX260.

This should last me until it's time to switch over to an entirely new platform in ~2-3 years.

Here's a screenshot at 1.175v; I was able to get it stable at 1.138-1.150v:
http://img162.imageshack.us/img162/4523/260ocls5.jpg

here's the rig when i had the 8800gs sli in:
http://www.xtremesystems.org/f...090&stc=1&d=1233706459

edit: This 3200rpm fan sounds like a vacuum cleaner gone berserk. I've come to find that below 1.13v, games crash quickly from core instability due to lack of voltage to the GPU. Above 1.19v, I get a longer period of gameplay (30-45 minutes) but still crash from the shader domain overheating, because of the extra heat generated by the relatively high volts. I have to keep Vgpu somewhere in the range of 1.138v ~ 1.175v to be gaming stable without prolonged gaming causing overheating. So far, I cannot get the core/shader stable at anything above 756/1512 (the next step being 783/1566) for longer than a couple of minutes in ATITool. I'm almost positive that with <50C load temps on a watercooled GPU, I could take it a bit higher.
Nice, sounds like your experiences with low-end SLI compared to a single high-end card are consistent with the others I've read over the years. But yeah, those are great overclocks; nearly 200MHz on the core is certainly going to result in serious performance gains.

As for the fan, yeah, it's very loud above 80%, and at 100% it's pretty insane, but it's still better than the smaller fan found on the G80. I personally don't notice the fan until it passes 70%; unfortunately, the AUTO setting never goes that high by itself, typically stopping around 60%. It also ramps too slowly in some games, so I need to set and use a 70% profile for certain games. The Rel 180 drivers not only increased temps (and performance), but also added a driver throttling feature that basically sets clocks to 2D if the GPU exceeds 85C or so. I think they've adjusted this back upwards in 181.22, but I still use a manual 70% profile as I don't like temps at 80C+.

As for water, again, I'm not sure how much more you'll get out of it. Similar to Painman and others, you may just be hitting a hard wall for the GPU around 1500 on the 65nm GT200 and 55nm GT200b B2. The B3 chips used in the 285 and 295 are the ones hitting shader clocks of 1600+, which would open up potentially higher core clocks.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Painman
It's a 65nm. I went that way due to the aftermarket cooling; the announcement of the voltage tuner was post-purchase so that was a matter of luck (not that it really helped though).

A 285 step-up would cost me about $97 if I did it right now. I keep GPUs around longer than I used to, so it may prove worthwhile. I'm not 100% happy with my min framerates from the 260, even though it's way better than my previous card in almost every way... if I perceive it to be struggling now, that's not going to change down the road, and it's likely I'll be forever bothered if I don't do it.

See, I've already talked myself into doing it. There's no stopping me now :D
$97 isn't too bad considering you'll get all the extra transistors, mem bus, and VRAM, along with high potential for clock increases across the board. That should help your minimums, especially in framebuffer/bandwidth-limited titles (GTA4, Fallout 3, Crysis, etc.).
I think the price delta is actually less than the difference between current 55nm 260s and 285s (~$200 AR and ~$320 AR, respectively).