
Overclocking Video: GeForce 6800 GTO

Yoshi911

Senior member
I have a 6800 GTO; it's basically just an overclocked version of a vanilla 6800. I'm using RivaTuner to OC it and have it at about 420/800 atm. Does this equal 1600MHz, or is it a basic 400MHz making it 800MHz? Anyone have some answers? sorry about being so stupid 😛 lol.
 
BUMP, GIVE ME INFORMATION! 😛 Does RivaTuner report the dual DDR speed or just the basic one? i.e. if it says 800MHz, am I running it at 400MHz or 1600MHz?
 
If what you want is a stress test and not a benchmark, give RTHDRIBL a try. It's about the best I've been able to find for this purpose.
 
Originally posted by: Yoshi911
BUMP, GIVE ME INFORMATION! 😛 Does RivaTuner report the dual DDR speed or just the basic one? i.e. if it says 800MHz, am I running it at 400MHz or 1600MHz?

oops.

i didn't address this.

i'm pretty sure that if it says 800, it's running 1600.

with atitool it says 400, it's actually running 1600.
 
OK, so I think I understand..

RivaTuner reports half of the dual-speed DDR rate? i.e. 800MHz is like my RAM running at 217MHz but actually being DDR434?

Does this sound realistic, though, for an overclocked 6800?
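To untangle the arithmetic being argued over here, a small Python sketch may help. The one fact involved is that DDR memory transfers data on both clock edges, so the "effective" rate is exactly twice the real (command) clock; which of the two a given tool displays varies, and that ambiguity is the whole confusion. The function name and the `tool_reports` parameter are illustrative, not part of any tool's API:

```python
def ddr_rates(reported_mhz, tool_reports="effective"):
    """Relate a tool's reported memory clock to the real clock
    and the DDR effective data rate (effective = 2 x real)."""
    if tool_reports == "effective":
        # Tool shows the doubled rate, e.g. a "800 MHz" readout
        real = reported_mhz / 2
        effective = reported_mhz
    else:
        # Tool shows the real command clock, e.g. a "400 MHz" readout
        real = reported_mhz
        effective = reported_mhz * 2
    return real, effective

print(ddr_rates(800, "effective"))  # -> (400.0, 800)
print(ddr_rates(400, "real"))       # -> (400, 800)
```

Either way, a DDR part is never running at four times a reported clock: 800 reported as an effective rate means a 400MHz real clock, and 400 reported as a real clock means DDR800 effective.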
 
sweet, thanks. Sounds like an excellent idea... SO will this stress my GPU/RAM more than BF2? And if it has no artifacts, then that OC is stable?
 
the bigger the window you run RTHDRIBL in, the more heat you'll generate

fullscreen or similar will generate far more heat than any game

example: my card loads @58C in BF2 and @60C in COD2; RTHDRIBL gives me 64-65C

if it's not artifacting after an hour or two of RTHDRIBL, you are definitely going to be stable in games
 
Originally posted by: the cobbler
the bigger the window you run RTHDRIBL in, the more heat you'll generate

fullscreen or similar will generate far more heat than any game

example: my card loads @58C in BF2 and @60C in COD2; RTHDRIBL gives me 64-65C

if it's not artifacting after an hour or two of RTHDRIBL, you are definitely going to be stable in games


Although I think RTHDRIBL does not test the RAM on the gfx card as much as games do. But for testing GPU overclocks and max temp this program is the best; also the skull one is the most stressful. Press "O" while you're running the prog and it will switch objects.
 
Originally posted by: Dark Cupcake
Originally posted by: the cobbler
the bigger the window you run RTHDRIBL in, the more heat you'll generate

fullscreen or similar will generate far more heat than any game

example: my card loads @58C in BF2 and @60C in COD2; RTHDRIBL gives me 64-65C

if it's not artifacting after an hour or two of RTHDRIBL, you are definitely going to be stable in games


Although I think RTHDRIBL does not test the RAM on the gfx card as much as games do. But for testing GPU overclocks and max temp this program is the best; also the skull one is the most stressful. Press "O" while you're running the prog and it will switch objects.



very true... RTHDRIBL lets me hit ~1415MHz on mem... only 1389MHz is 3DMark stable, and I have to drop that another ~25MHz for stability in games...
 
I would get 3DMark06. I don't think anything out there, including Oblivion on max settings, slams the GPU as hard as the "Canyon Run" sequence.
 
Doom 3 has brought out instabilities in video card overclocks that haven't shown up in any other game for me. I'd recommend that if you want to test for stability. Tearing can be difficult to see because it's such a dark game. If you don't have Doom 3 I've found the Dragothic test in 3DMark2001 shows tearing and artifacting related to memory being unstable. The nature test in 3DMark2003 also shows artifacting/sparkling related to the GPU being clocked too high.
 
yeah RTHDRIBL really puts a huge stress on the card. i was testing this on an old computer and I would constantly get blue screens and a really hot power supply after a while.
 
Originally posted by: Jeff7181
Doom 3 has brought out instabilities in video card overclocks that haven't shown up in any other game for me. I'd recommend that if you want to test for stability. Tearing can be difficult to see because it's such a dark game. If you don't have Doom 3 I've found the Dragothic test in 3DMark2001 shows tearing and artifacting related to memory being unstable. The nature test in 3DMark2003 also shows artifacting/sparkling related to the GPU being clocked too high.


For me, it was F.E.A.R. I had to drop the core and mem about 5 mhz each on my overclocked X800 GTO or I'd get visual anomalies.
 
nope, I got this off a guy who got it out of his Dell 😛 I'm hoping it will die someday so I can get it replaced with the latest and greatest 😛 kinda like the 1GB OCZ VX Gold PC3200 RAM I bought from a guy off here for $80 that OCZ replaced with their 2GB Platinum set 🙂

----
EDIT: As far as what's on the GPU die, I have no idea; it still has the heatsink on and I haven't looked around on the actual chip
 
ah. your sig says BFG, so I was wondering.

If the shim says quadro, and you ever want to get rid of it, I'll take it off your hands for a fair price.
 
well... little problem, help someone? Whenever I try to start RTHDRIBL it crashes; it just says "xxx.exe has encountered a problem and needs to close. We are sorry for the inconvenience."

I completely reset the drivers to factory defaults and still nothing
 
Originally posted by: Yoshi911
I am thinking about getting rid of it, why do you ask? wussup with quadro? and whats the shim?

much like how some of the 6800 LEs were actually "failed" ultras - some of the dell gtos are actually quadro 3400s that didn't have enough working pipes to be sold as quadros.

I'd like to get my hands on one more out of curiosity than anything - I've seen reports they can be flashed back to functioning quadros with no other modding, but nothing from definite and reliable sources. even if it doesn't work, I could use a cheap secondary pci-e card anyway when I assemble my woodcrest system in a few months. the cards are probably old (and cheap) enough now that it'd be worthwhile to try it out.

if it's a quadro, it'll be printed on the processor shim:

http://www.rpi.edu/~dengd/OEM68002.jpg

non-quadros will have "geforce 6800" or something along those lines instead.
 
If I take off the heatsink to check this, do you think it would be detectable (void my warranty) by BFG if I ever want to RMA it?
 