
9800 GTX+ overclock

chasmanz28

Junior Member
Hi everyone, I have two 9800 GTX+'s in SLI. Can someone tell me what core speeds I can use to get the most out of these cards? Thanks. I'm using EVGA Precision. Core clock, shader clock, and memory clock settings please. And should I have the core clock linked to the shader? Stock speeds are: CC 738, SC 1836, MC 1100.
 
You have to find it yourself. It's dangerous to use clocks that someone else used, because your card(s) might not be stable at what his card was.

Keeping the shaders unlinked will give you a couple of extra MHz on them compared to linked, but it's not a big deal.

Overclock your cards using the slow old-school method: up the clocks a couple of MHz at a time and make sure they're stable, using the ATITool artifact scanner.
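The stepwise method above can be sketched as a small loop, if that helps. This is just a minimal illustration, not a real tuning script: the `is_stable` callback is a hypothetical stand-in for whatever stability check you run (an ATITool artifact scan, a game session, etc.), and the clock-setting is simulated rather than touching actual hardware.

```python
def find_max_stable_clock(stock, step, limit, is_stable):
    """Raise the clock in small steps until the stability test fails,
    then keep the last known-good value."""
    clock = stock
    # Stop at a sane hard limit even if every step passes the test.
    while clock + step <= limit and is_stable(clock + step):
        clock += step
    return clock

# Example: pretend the card starts artifacting above 790 MHz core.
max_core = find_max_stable_clock(stock=738, step=5, limit=850,
                                 is_stable=lambda mhz: mhz <= 790)
print(max_core)  # 788
```

You'd repeat the same loop independently for core, shader, and memory, since each one hits its own wall at a different point.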
 
Played COD at high settings. Couldn't get that test to work, so I just played the game until it didn't crash. On a side note, is it better to let the 3D application decide, or is it better for me to use my own preference?
 
The ATITool artifact scanner sees artifacts that you may not be able to spot in games. You have to make it work somehow; try another ATITool version. If it passes one hour artifact-free, then your card is just about stable and you can test games on it.
 
Originally posted by: chasmanz28
Played COD at high settings. Couldn't get that test to work, so I just played the game until it didn't crash. On a side note, is it better to let the 3D application decide, or is it better for me to use my own preference?

COD4 is not very graphics intensive. Download 3dmark06 and loop it for 30-40 minutes. If it crashes, clock down.
 
Yowza. I think I remember some of the 3870's from the past gen hitting about 2600-2700 with GDDR4. That is a crazy good overclock on GDDR3.
 
Originally posted by: chasmanz28
Did my test. Best I got was 792 CC, 1970 SC, 1351 MC. How does that stack up? Pretty normal?

That's pretty high on the shader and memory. I've tested a bunch (dozens?) of 9800 GTX+ cards and they have a tough time getting more than another notch or two on the shader. Sure, they will seem to work fine in games, but they won't pass ATITool.
 
What can I say... The memory clock is very impressive; the core and shader, not so much. I got my 9800 GTX 65nm to 800/2050/1225, stable. Fan speed at 100%. You are upping the fan speed, right? (though not necessarily to 100)
Try upping the shader clock to around 1900-2050, and the core clock as well, a bit at a time. They should go further, though I am thoroughly surprised at the high MC.
 
Originally posted by: Shmee
I got my 9800 GTX 65nm to 800/2050/1225, stable. Fan speed at 100%.

Yeah, I can get higher clocks by cranking fan speed too, but personally I would not accept that tradeoff. I'm responsible for my own ears as you are for yours. 😛
 
What version of ATITool do you guys use? I've just been using FurMark and doing a visual inspection for a minute or two, manually looking for any glitches, artifacting, etc.
 