GeForce 6800 Overclocking using "Detect Optimal"

CraigJay

Member
Jan 2, 2002
I've run "Detect Optimal Frequency" a number of times on my PNY GeForce 6800. The resulting frequencies always follow this equation almost exactly: Memory Speed = GPU Speed x 2 + 50. I'm using the 66.93 drivers.

Do others get the same results? Here are the test results I've gotten so far:
GPU/Mem
394/837
394/841
359/767
373/795
369/787
360/770
353/755
379/808

Through manual testing with Doom 3 and other software, I've settled on 400/800. Does anyone think I should follow the equation MEM = GPU x 2 + 50 instead?
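For what it's worth, the detected pairs listed above really do track MEM = GPU x 2 + 50, just not to the exact MHz. Here's a quick Python sketch (numbers copied straight from the list above) showing each result lands within a few MHz of the formula, which is about one clock-granularity step:

```python
# GPU/Mem pairs reported by "Detect Optimal Frequency" (from the post above)
pairs = [(394, 837), (394, 841), (359, 767), (373, 795),
         (369, 787), (360, 770), (353, 755), (379, 808)]

# Compare each detected memory clock against MEM = GPU * 2 + 50
for gpu, mem in pairs:
    predicted = gpu * 2 + 50
    print(f"{gpu}/{mem}: formula predicts {predicted}, off by {mem - predicted} MHz")
```

Every pair comes out within about 3 MHz of the prediction, so the driver does seem to be deriving the memory target from the core result rather than testing the two independently.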

Craig
 

Spike

Diamond Member
Aug 27, 2001
That doesn't hold for mine. My BFG 6800 GT has stock clocks of 370/1000, and "Detect Optimal" put it at 432/1110 (the formula would predict 432 x 2 + 50 = 914). I settled on 400/1100, and that seems to work well, though I might take the core up to 420.

-spike
 
Jun 14, 2003
It will be different for the GTs, I presume... mine detects 409 and 1.11 GHz.

409 x 2 + 50 = 868, which doesn't equal 1110 MHz.

It doesn't even reach 900!
 

Avalon

Diamond Member
Jul 16, 2001
On my 6800NU that seemed to be true.
The auto-detect said I could hit a GPU core speed of 363 MHz and a memory speed of 776 MHz, which is exactly GPU x 2 + 50. It's rather funny, since I can't go over 360 MHz stable on the core, but I can get up to 900 MHz stable on the memory. Buh. I'm only going to trust my own testing.