Weird Furmark [UPDATE: + 3DMark] results while playing around with 4770 GPU clocks

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
Hello all,

Continued playing around with my 4770 using CCC. ATI Overdrive in it enables me to set the GPU clock anywhere from 500 to 830 MHz, default clock being 750.

I decided to run the FurMark benchmarking test (using FurMark 1.7) for 60 seconds to see how much performance is gained by OCing from 750 to 830MHz.

I don't know if FurMark is even reliable for measuring OC effectiveness, but the results are consistent, so I hope someone here can enlighten me as to why I got these results:

Settings: Resolution: 1024x768 (W) - MSAA: 2X

@750MHz: (Default clock)
Run 1: FPS: min=44 max=75 avg=52
Run 2: FPS: min=44 max=75 avg=52
Run 3: FPS: min=44 max=75 avg=52
Run 4: FPS: min=44 max=76 avg=52
Run 5: FPS: min=44 max=75 avg=52

@830MHz:
Run 1: FPS: min=43 max=74 avg=51
Run 2: FPS: min=44 max=74 avg=51
Run 3: FPS: min=44 max=74 avg=51
Run 4: FPS: min=43 max=74 avg=51
Run 5: FPS: min=43 max=74 avg=51

Look at that: everything is 1 frame lower when overclocked by 80MHz?

Even weirder, I decided to downclock a bit, and the best scores came from 675MHz (a 75MHz downclock from the stock speed):

Settings: Resolution: 1024x768 (W) - MSAA: 2X
675MHz:
Run 1: FPS: min=46 max=77 avg=54
Run 2: FPS: min=46 max=77 avg=53

I tried the tests again, this time with no MSAA, and got pretty much the same hierarchy: 675MHz faster than 750MHz, which in turn is faster than 830MHz...

Settings: Resolution: 1024x768 (W) - MSAA: 0X

675MHz: FPS: min=68 max=126 avg=83

750MHz: FPS: min=66 max=123 avg=81

830MHz: FPS: min=65 max=123 avg=80

I tried going below 675MHz, but the scores did start to go lower from there.
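For reference, if the benchmark were purely GPU-clock-bound, avg FPS should scale roughly linearly with core clock. A quick sanity check on the no-MSAA averages above (just arithmetic, using 750MHz as the baseline) shows how far off the results are from that:

```python
# Sanity check: if the benchmark were purely GPU-clock-bound, avg FPS
# should scale roughly linearly with core clock. Compare expected vs
# observed, using the 750MHz (stock) run as the baseline.
runs = {675: 83, 750: 81, 830: 80}  # core clock (MHz) -> avg FPS (no MSAA)

base_clock, base_fps = 750, runs[750]
for clock, fps in sorted(runs.items()):
    expected = base_fps * clock / base_clock
    delta = (fps / expected - 1) * 100
    print(f"{clock} MHz: observed {fps} FPS, expected ~{expected:.0f} FPS ({delta:+.1f}%)")
```

The 830MHz run lands about 10% below linear scaling, and the 675MHz run about 14% above it.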

Can anybody tell me why this is the case? It seems rather strange to me.

Thanks.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Thermal throttling?

Some ATI cards downclock under Furmark to limit possible thermal issues, so maybe the HD4770 also does (although I thought it was more a high end card issue).

Try renaming the Furmark .exe?
And monitor your card temps and clock speeds if you can.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Thermal throttling?

Some ATI cards downclock under Furmark to limit possible thermal issues, so maybe the HD4770 also does (although I thought it was more a high end card issue).

Try renaming the Furmark .exe?
And monitor your card temps and clock speeds if you can.

That is the most likely explanation.
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
Thanks for the responses.

I did monitor the temps and clock speeds, with GPU-Z's sensors refreshing in the background while the tests ran. Max temps in each run range from 77C to 82C according to FurMark. GPU-Z's sensors validate that, and according to its Core Clock and Memory Clock readings, the card runs at the specified clocks throughout each test run. GPU Load (also per GPU-Z) sits at 99% almost constantly, occasionally dipping to 98% for an instant.

I've heard about the throttling, but GPU-Z seems to indicate nothing of the sort is happening.
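A rough way to double-check afterwards, assuming GPU-Z's "Log to file" option in the Sensors tab: the log is comma-separated text, so a short script can scan it for the minimum clock recorded during a run. The column header here is a guess (it varies by GPU-Z version), so the sketch matches on a substring:

```python
import csv

def clock_range(log_path, column_hint="Core Clock"):
    """Scan a GPU-Z sensor log (comma-separated text) and return the
    min/max of the first column whose header contains column_hint."""
    with open(log_path, newline="") as f:
        reader = csv.reader(f)
        header = [h.strip() for h in next(reader)]
        idx = next(i for i, h in enumerate(header) if column_hint in h)
        values = []
        for row in reader:
            try:
                values.append(float(row[idx]))
            except (ValueError, IndexError):
                continue  # skip blank or malformed log lines
    return min(values), max(values)

# Example: lo, hi = clock_range("GPU-Z Sensor Log.txt")
# If lo is well below the clock you set, the card throttled mid-run.
```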
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Can you reproduce those strange performance results using another benchmark? See if raising the clocks lowers the fps and lowering the clocks raises it in another benchmark.
 
Dec 30, 2004
12,553
2
76
Thanks for the responses.

I did monitor the temps and clock speeds, with GPU-Z's sensors refreshing in the background while the tests ran. Max temps in each run range from 77C to 82C according to FurMark. GPU-Z's sensors validate that, and according to its Core Clock and Memory Clock readings, the card runs at the specified clocks throughout each test run. GPU Load (also per GPU-Z) sits at 99% almost constantly, occasionally dipping to 98% for an instant.

I've heard about the throttling, but GPU-Z seems to indicate nothing of the sort is happening.

When Intel throttles for thermals (when you hit 95C on a desktop chip), the processor stays at 100% load; it just doesn't do as much processing. So watching GPU load might not tell you anything, if it works the same way.

I think thermal throttling is it. Try another benchmarking program or a game, or the RE5 benchmark (that's free). Furmark kills GPUs if run too long; I don't like it.
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
Can you reproduce those strange performance results using another benchmark? See if raising the clocks lowers the fps and lowering the clocks raises it in another benchmark.
Was planning on doing so... what other free benchmarking apps can I use? The RE5 benchmark seems to be Windows Vista / 7 only, and I'm using XP SP3.

When Intel throttles thermals (when you get to 95C on a desktop chip), the processor stays at 100% it just doesn't do as much processing. So watching GPU load might not tell you, if it works the same way.
Yes, but GPU-Z also reports the clock speeds aside from load, and it showed that the clock speed remained constant throughout the test. About the RE5 benchmark, is there a WinXP version I can try out?
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
I tried 3DMark05, free version so I couldn't change any settings.

It's a bit weird, but less so than the FurMark benchies.

First, I noticed the 3DMark scores aren't as consistent. Probably nothing earth-shattering.

I saved the benchies but they aren't with me right now, left them at home. Will have them up when I get back.

What's cleared up is that 675MHz (what used to be the "best" clock setting) is now clearly a loser. I ran several tests for each clock setting, and its average 3DMark score is about 16,020, not really even near 16,100.

But when it comes to 750MHz (default) and 830MHz (max OC allowed in CCC), the results are hazy. While the 675MHz results are almost constant with only very little variation between tests, the 750MHz and 830MHz scores seem to vary a lot.

For 750MHz, one test scored as high as 16.8K; most tests scored about 16.3 to 16.4K, and some got 16.2K.

For 830MHz, it never got the 16.8K score. It had lows of 16.2K as well, and averaged 16.4K. The highest it got was about 16.5K, maybe near 16.6K.

Far from the decisive 10% overclock scenario I was expecting... I can't even justify saying it's really faster... it looks more like a wash, with the default clock having a slight advantage...
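To put a number on that expectation: an 80MHz bump is a 830/750 = ~10.7% clock increase, so if 3DMark scores scaled with core clock, a ~16.35K stock baseline should land around 18.1K at 830MHz. Quick arithmetic (the baseline is just a rough midpoint of my 16.3-16.4K stock runs):

```python
# How big should the 3DMark gain be if scores scaled with core clock?
base_clock, oc_clock = 750, 830
base_score = 16350  # rough midpoint of the 16.3-16.4K runs at stock

clock_gain = oc_clock / base_clock - 1
ideal_score = base_score * (1 + clock_gain)
print(f"Clock gain: {clock_gain * 100:.1f}%")        # 10.7%
print(f"Ideal 830MHz score: ~{ideal_score:.0f}")     # ~18094
print(f"Observed 830MHz average: ~16400 ({16400 / base_score - 1:+.2%})")  # +0.31%
```

Even allowing for run-to-run noise, the observed average is nowhere near what pure clock scaling would predict, which suggests something other than the core clock is the limiting factor.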

Any insights? Maybe the OC needs to go beyond just 80MHz for solid gains to be seen consistently?

Thanks.
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
Quite possibly; it didn't even cross my mind. I'm just using an Athlon X2 7750. Would that be enough of a bottleneck for just a 4770 in 3DMark? I know a 7750 is no C2D, but the GPU isn't quite a monster itself; I thought they were a pretty good match at my 1280x1024 resolution.

Thanks for the feedback, very much appreciated.