Control panel clocks vs. real clocks on BFG 7800GTX

Xentropy

Senior member
Sep 6, 2002
294
0
0
First off, if anyone knows of multiple programs I can use to detect the clockspeeds of my GPU and video RAM, please let me know. For now I'm just using the nVidia control panel (NVTweak/Coolbits) and 3DMark03/05. One issue is that 3DMark will sometimes report the 2D speeds instead of the 3D ones, presumably depending on the exact moment it takes its snapshot, so I can't always rely on its numbers.

The issues I'm getting are:
1) With control panel reading stock (460/1300), 3DMark reports 450/1305.
2) With "stock", I'm getting much lower '05 scores than I would expect.
3) With control panel reading a modest 2.5% overclock of 472/1300, 3DMark reports 513/1305!
4) With that overclock, my 3DMark03 score jumps from 12099 to 17067 (including a whopping jump from 6280 to 10488 in multitexturing fill rate!), and 3DMark05 jumps from 5654 to 7429. That would seem to back up the idea that the clocks are actually changing by much more than the control panel reports (quick arithmetic below).
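Just to put numbers on it, here's the quick arithmetic (assuming fill rate scales roughly linearly with core clock, which is an assumption on my part):

    # Sanity check on the numbers above; all figures are from my own runs.
    stock_panel, oc_panel = 460, 472    # core clocks per the control panel (MHz)
    stock_3dm, oc_3dm = 450, 513        # core clocks per 3DMark (MHz)

    print((oc_panel / stock_panel - 1) * 100)  # ~2.6%: the "modest" overclock
    print((oc_3dm / stock_3dm - 1) * 100)      # ~14%: the jump 3DMark claims
    print((10488 / 6280 - 1) * 100)            # ~67%: the actual fill-rate jump
    # Even 3DMark's 14% can't explain a 67% fill-rate jump, so something more
    # than a 12MHz core bump has to be changing between these runs.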

Now the 472 (or 513) overclock, whichever it really is, is luckily fully stable, and GPU temps never go above 62 degrees or so (if I can even trust THAT readout in my control panel :p). No artifacts of any kind whatsoever, even burning '05 three runs in a row. But I'd really like to know what my actual clocks are rather than shooting in the dark, and I'd also like to know why my stock clocks only score me ~5600. Yes, I have an Intel instead of an AMD processor, but a 3.6GHz chip should give me low-to-mid 6000s at least, based on the few scores I've seen from other Intel users.

Don't get me wrong, I'm not upset about any of these scores. Coming from a PCX5750, even the 5654 in '05 is a nice jump from my old 619. ;) But is the "472/1300" that gets me 7429 really 472, or is it really 513, or is my true clock somewhere between the two? And are my stock speeds actually running at 450, if 3DMark is the one that's correct?

Something's not adding up, and I'd like a third program to check my clocks with and sanity-check the data. Thanks in advance.
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
Wow, NO ONE ELSE is having any issues? There seem to be quite a few people over on the nVNews forums noticing this. It could have huge implications for benchmarks with these cards, since according to the research of some people at nVNews:
A) Clockspeeds can vary wildly from card to card at the same supposed settings.
B) This issue is not limited to just BFG cards, nor just certain BIOS revisions.

To recap: my system scores 25% lower than it should at "stock", and then a "2.5%" overclock makes scores jump 30-40%. Who knows what clocks various sites were actually running when they benched. This would also explain Ronin's benchmarks at 535/1417, which probably really WERE at 535/1417 even though he thought he was at stock.

I'm surprised no one else is concerned about this. I mean, the good news is these cards are apparently perfectly stable even into the low-to-mid 500MHz range on stock cooling. The bad news is there's no way of really knowing what clocks you're running, since the control panel readout (and even the stock settings, if you never touch the clocks) may be quite inaccurate.
 
Jun 14, 2003
10,442
0
0
Lavalys Everest Home Edition is a good one; it'll even tell you how many transistors it's got, lol!

Google it.

And I believe that 7000+ scores in '05 almost certainly mean near-stock speeds; a BFG 7800GTX should, with the latest drivers, be just nudging 8000.
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
Remember that's processor-dependent. But also note that even if 3DMark's speed reports are correct, I score 5650 at 450/1305 and 7550 at 513/1305. That's a wider score variation than even those clocks should account for, and the fill rate alone almost doubles. All from a control panel change of supposedly 12MHz. So I wonder if my stock clock is even the 450 that 3DMark claims it is.
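Putting rough ratios on it (my own arithmetic, assuming the score scales at most linearly with core clock):

    # Even granting 3DMark's clock readings, the '05 score scales well past them.
    print(513 / 450)    # ~1.14: core clock up ~14% by 3DMark's own numbers
    print(7550 / 5650)  # ~1.34: score up ~34%, more than double the clock gain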

Of course, I'm also curious: do people set their control panel quality settings to High Performance to benchmark? I left it on the default Quality setting. That could hurt my score against folks who bench to brag rather than to find out what kind of gaming performance they can expect.

Edit: Well, not surprisingly, Everest doesn't have any idea what a 7800GTX is yet and won't give me a GPU report.
 

Cheesetogo

Diamond Member
Jan 26, 2005
3,824
10
81
Originally posted by: Xentropy

Of course, I'm also curious: do people set their control panel quality settings to High Performance to benchmark? I left it on the default Quality setting. That could hurt my score against folks who bench to brag rather than to find out what kind of gaming performance they can expect.

You should always set your settings to High Performance.
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
Originally posted by: Cheesetogo
You should always set your settings to High Performance.

Thanks, I suspected that was SOP but never messed with it. I'll rebench on "stock" and "472" real quick again.
 

knyghtbyte

Senior member
Oct 20, 2004
918
1
0
Try checking it with RivaTuner. Dunno if support for your card is available yet, but if it is, it's usually reliable, especially the temp monitoring bit.
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
Changing from Quality to High Performance had zero effect.

Not having any troubles with temps. Everest can read them, and they match what the control panel reports. They go up to 65 or so when burning in a lot, but I haven't seen higher than that. Idle/2D temp is around 54.
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
Nothing conclusive. Now I can't get the stock clock issue to come back. Going into Coolbits and restoring defaults goes back to reporting 460/1300 there, but 3DMark reports 501/1305, and my '05 score is in line with the 513/1305 score, in the vicinity of 7500.

Basically, it's looking like there was an issue with the stock clocks on my card as shipped, and even reclocking off stock and back again has fixed it. I still can't tell which program is reporting the correct clocks, though. The 3DMark scores and clocks match up better, but 3DMark could still be reporting 41MHz too high in all cases or something (since 472-460 = 513-501, the clock difference matches up now at least, even if the absolute speed doesn't).

Until we get more reports from more 7800GTX owners, I can't know whether this issue is unique to my card or whether there are more cards out there with something holding back their true power until the clocks are changed, even if they're changed right back to default. I'll continue to research it and see if I can bring the problem back, thereby possibly figuring out what caused it in the first place.

Edit: I set Coolbits to 409/1300 and indeed got 3DMark05 reporting 450/1305 again...but my score was 7216!! How on earth I got only 5600 before I did anything with Coolbits, I have no idea. I ran the bench 5 times before, with a reboot between the third and fourth runs, just to confirm the results, and 5654 was the *best* score I got. Now I get 7216 on my first run at 409/1300?! Every answer just raises more questions. :)
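For what it's worth, every pairing I've now seen from my own card fits a constant offset (just my readings, nothing official):

    # Every (Coolbits, 3DMark) core-clock pairing from my card so far:
    pairs = [(460, 501), (472, 513), (409, 450)]  # (panel MHz, 3DMark MHz)
    for panel, mark in pairs:
        print(panel, mark, mark - panel)  # the difference is 41 every time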
 

BroadbandGamer

Senior member
Sep 13, 2003
976
0
0
Originally posted by: Xentropy
Nothing conclusive. Now I can't get the stock clock issue to come back. Going into Coolbits and restoring defaults goes back to reporting 460/1300 there, but 3DMark reports 501/1305, and my '05 score is in line with the 513/1305 score, in the vicinity of 7500.

[snip]


Which mobo and 7800 GTX do you have? I'm getting ready to put my system together right now. I'll get back with you in a few hours and let you know.
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
BFG, as the thread title states. And I'm on an Intel system, so I expect a bit of a score penalty compared to the AMD folks (just not the >25% I was getting at first! ;))

Mobo is an Asus P5AD2 Premium.

Make sure you do some benches before even installing Coolbits or anything else, then add the Coolbits registry entries and have at the clocks a bit. Also let us know what clocks are reported by 3DMark and by Coolbits. My odd low scores at the start may have just been a fluke, since no one else has mentioned having that issue but me. At least it's stopped now, and my stock scores are where I figured they should have been all along.
 

elsupremo

Junior Member
Jun 30, 2005
17
0
0
Hi there,
I just got my XFX 7800GTX a couple of days ago. The stock clocks on mine are supposed to be 450/1300. However, when I go into 3DMark05, the details show my clocks at 491.4/1300. I have not even enabled Coolbits, and I have no other OC software on my system. These are the stock speeds the card came with. (Perhaps 3DMark does not support the 7800 yet? I don't know.)

Regardless, I got 8,024 on my first 3DMark05 test. I have an FX-55.

I too am curious about the discrepancy between what are supposed to be stock speeds and what 3DMark shows. Which is accurate?
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
It's looking more like 3DMark just reports 41MHz higher than the actual GPU speed. I still haven't been able to duplicate the issue I was having before I did anything with my clocks, though. Even setting everything back to default, uninstalling Coolbits, cleaning out the drivers and reinstalling them from scratch, etc., leaves my stock clocks where they should be now, i.e. 460, or 501 according to 3DMark.
 

stnicralisk

Golden Member
Jan 18, 2004
1,705
1
0
There was a problem like this in the past when a new card came out. It's a fault with 3DMark, or at least it was in the past. Get Coolbits and check your speed.
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
Originally posted by: stnicralisk
There was a problem like this in the past when a new card came out. It's a fault with 3DMark, or at least it was in the past. Get Coolbits and check your speed.
Didn't really read the thread, did you? My original issue was that Coolbits was reporting 460/1300, yet my scores indicated a sub-400 clock, and 3DMark, even after correcting for its offset, reported I was running at about 410.

Now, at least, Coolbits seems to be correct, but for some reason it wasn't before I changed anything the first time.
 

elsupremo

Junior Member
Jun 30, 2005
17
0
0
Question: when I enabled Coolbits, 3DMark started showing my "Standard (2D)" core clock of 275MHz instead of the 450MHz setting. However, when running the benchmark I can see it's clearly at the faster speed. I just wish 3DMark would show the correct "Performance (3D)" setting. It will sometimes show the correct speed later, but after a restart it shows the 2D speeds again. Anyone know why this is?
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
It just depends on when it takes the snapshot of the speeds. I've found that 90% of the time it gets it right and shows the 3D speeds, but now and then it waits too long after the test finishes (or maybe checks too soon before the test starts), and your system is already back at 2D (or isn't at 3D yet) when it checks the speeds.

To have it just check your speeds for you, you can run a test, hit Esc immediately, and then go into the info panel. It should show the speeds it thinks you're at. If you get the 2D ones, just do it again. Faster than running a whole test, anyway.
 

xl80325

Member
Jan 5, 2003
79
0
0
If you use RivaTuner to log the core speed while the card is fully loaded (in 3DMark or a 3D game), you can see it's actually around 500 instead of 460. I tried this with my BFG 7800GTX in 3DMark05 and Half-Life 2. Maybe we should ask BFG about this. Do they secretly set the core speed to 500 in 3D mode?
 

WA261

Diamond Member
Aug 28, 2001
4,631
0
0
Posted this at Overclockers the other day. nVidia is known for this. They say it's one speed when it's actually another. It makes their card look faster than it really is.

:p

I am forcing the card to run at the advertised core clock speed, not the cheat speed nVidia is running the cores at in 3D mode (about 40MHz higher than what the BIOS is programmed for, or than what you set the core clock to manually). nVidia is also defaulting the driver to a "Quality" IQ setting instead of the "High Quality" setting a high-end card should be running at (and what ATI does, BTW).

This BFG OC'ed card is advertised as a 450/650. The BIOS is programmed for a 460 core clock in 3D, and that is what both nVidia's control panel and RivaTuner default to on the manual clock sliders (as they should), BUT when the card goes into 3D mode the core actually runs at 501MHz, 41MHz higher than it is supposed to be, or than what you think the core will be running at.

The 3D core clock is actually set by the drivers. They read the BIOS to see what the 3D core clock is programmed for, or take whatever you manually set the core clock to (which overrides the BIOS-programmed clock), and then set that clock when the card goes into 3D mode.

Through either an error in the driver code, or a deliberate act by nVidia, the drivers are taking the BIOS-programmed 3D clock, or your manual override, adding 40MHz to that number, and then setting the core clock generator to the higher speed without you knowing it. I don't think it is a driver code error, as NV has done this before to make their cards appear faster to reviewers than they really were at the advertised clock speeds. This behind-the-scenes (read: behind your back and without your knowledge) 3D clock cheating appears to be confined to just the 7800s, so that is another reason to think this is deliberate by nVidia and not an accidental driver code error.
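In pseudocode terms, what I'm claiming the driver does is roughly this (purely illustrative on my part, not actual driver code; the names and the offset are my own inference):

    # Illustrative model only: these names and the offset are inferred from the
    # readings in this thread, not taken from nVidia's actual driver.
    BIOS_3D_CLOCK_MHZ = 460   # what the BIOS programs for 3D mode
    HIDDEN_OFFSET_MHZ = 41    # the unadvertised bump being alleged

    def clock_driver_sets_in_3d(manual_override_mhz=None):
        # The driver honors a manual slider setting if there is one,
        # otherwise it falls back to the BIOS 3D clock...
        requested = manual_override_mhz or BIOS_3D_CLOCK_MHZ
        # ...then quietly adds the offset before programming the clock generator.
        return requested + HIDDEN_OFFSET_MHZ

    print(clock_driver_sets_in_3d())     # 501: what tools read at "stock" 460
    print(clock_driver_sets_in_3d(472))  # 513: what tools read at a "472" OC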

The problem with playing this stupid numbers game is that if they correct the drivers to run the core at what the BIOS or the manual core clock is actually set for, the card's speed will drop and users will be going "WTF, these new drivers are ****, they killed my card, they killed my computer, they burnt down my house, they knocked the moon out of orbit", etc., etc.

It will be interesting to see if Futuremark upholds its approval of NV drivers that run the core higher than it is supposed to run. They have pulled their approval in the past for that very reason.

Hmmmmmmm
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
Uh, that's the dumbest thing I've ever heard. What possible reason would nVidia have for saying their cards are SLOWER than they are? If every single 7800GTX is stable 40MHz faster than advertised, they would have just advertised the clocks 40MHz higher.

Please link me to cases in the past where nVidia has understated their specs. It's so ludicrous it'll give me a great laugh.

More likely the bug is in RivaTuner/3DMark. They have been known to misread the speeds on new hardware. The fact that the difference is always exactly 41MHz, and only occurs with the new hardware, pretty much proves that's the case again now.
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
Yes, but a registry tweak is necessary to open up that panel. Do a Google search for "Coolbits" or "NVTweak"; those tools will add the necessary registry entries to see and modify the speed sliders.
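If you'd rather add the entry yourself, this is the commonly circulated tweak; I'm going from memory here, so double-check the key path and value against your driver version:

    # Sketch of the classic Coolbits registry tweak (Windows only, needs admin).
    # The key path and value are the commonly circulated ones; verify them for
    # your driver version before relying on this.
    import winreg

    key = winreg.CreateKey(
        winreg.HKEY_LOCAL_MACHINE,
        r"SOFTWARE\NVIDIA Corporation\Global\NVTweak",
    )
    # 3 is the value usually passed around for unlocking the clock sliders.
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)
    winreg.CloseKey(key)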
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
My BFG version reads 1200 on the RAM too, with Coolbits. It seems my earlier post about BFG and the clock speeds wasn't so wrong after all. Or is it? I'm confused on the subject.

BFG has released a new BIOS that is supposed to fix this problem. .25 is the good one, .22 is the bad one. Sadly, I have .22. And no floppy drive to do a flash, how annoying.