CPU Scaling - 4.0Ghz vs. 4.5Ghz Gaming Benchmarks


Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
So what you're saying is, with the exception of 2 games, it makes no difference at all since it's drawing faster than the screen can refresh :p
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: RussianSensation
Originally posted by: jaredpace
Many people tend to think a 3.4-3.6Ghz cpu is just fine for gaming - Dual or Quad. This is true if you're running a single 8800GT or slower video card, or if you're bound to resolutions below 1280x1024. The faster the CPU the better - and here are the benches to prove it (using the two fastest Gfx cards to test the CPU).

The games tested are all at 1680x1050 resolution...a resolution no one uses with GTX 280 and especially 4870 X2.



:eek:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RussianSensation
Originally posted by: jaredpace
Many people tend to think a 3.4-3.6Ghz cpu is just fine for gaming - Dual or Quad. This is true if you're running a single 8800GT or slower video card, or if you're bound to resolutions below 1280x1024. The faster the CPU the better - and here are the benches to prove it (using the two fastest Gfx cards to test the CPU).

The games tested are all at 1680x1050 resolution...a resolution no one uses with GTX 280 and especially 4870 X2. On top of that, in situations where 4.0ghz was sufficient it was already getting really high frames, so to say 4.5ghz "improves" gaming is NOT true. In situations where 4.0ghz was not sufficient (i.e. below 60 frames), 4.5ghz did NOT make the game any more playable either.

As far as I am concerned these benchmarks are not really meaningful for testing the impact of a CPU on high end graphics cards, because he chose a pointless resolution (he should at least have used 8x/16xAA throughout) and did not provide numbers for minimum framerates. Secondly, they fail to show that a faster cpu provides any more real world playability. So I am not sure you can show that 3.4ghz isn't sufficient.

OK .. a higher resolution just for you and a lower CPU clock - a HD4870x2 is used with e8600@stock 3.33Ghz and then OC'd to [almost 4.0Ghz] :p
- missed it by 10Mhz - a MB issue, no doubt - it is at stock vcore and temps never rise over 57C!! [my x48 better do 4.3, or else!!]


CoJ @ 19x12 Fully maxed DX10 bench inc 4xAA/16xAF

x2/e8600@3.33Ghz - 20.2/116.6/53.8
x2/e8600@3.99Ghz - 28.8/128.2/65.4


LP @ 19x12 Fully maxed DX10 bench inc 4xAA/16xAF

Snow
x2/e8600@3.33Ghz - 45.3/23/64
x2/e8600@3.99Ghz - 46.6/24/64

Cave

x2/e8600@3.33Ghz - 57.6/35/71
x2/e8600@3.99Ghz - 61.6/41/71

HL2/LC @ 19x12 - maxed DX9 in 4xAA/16xAF
x2/e8600@3.33Ghz - 217.69/105/301
x2/e8600@3.99Ghz - 236.50/109/301


FEAR: Perseus Mandate @ 19x12 - maxed DX9 inc 16xAF
with SS on - no AA
x2/e8600@3.33Ghz - 72/140/436
x2/e8600@3.99Ghz - 84/145/498

SS off/4xAA

x2/e8600@3.33Ghz - 36/183/659
x2/e8600@3.99Ghz - 41/192/667

Any more questions about CPU scaling at 19x12?
- CoJ blows your argument to hell :p
-- as do the rest of them .. 3.33Ghz is NOT enough for 4870X2 .. the minimum FPS are evidently very dependent on the CPU speed

i have lots more benches to come - they show the same thing


 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: apoppin
Originally posted by: RussianSensation
Originally posted by: jaredpace
Many people tend to think a 3.4-3.6Ghz cpu is just fine for gaming - Dual or Quad. This is true if you're running a single 8800GT or slower video card, or if you're bound to resolutions below 1280x1024. The faster the CPU the better - and here are the benches to prove it (using the two fastest Gfx cards to test the CPU).

The games tested are all at 1680x1050 resolution...a resolution no one uses with GTX 280 and especially 4870 X2. On top of that, in situations where 4.0ghz was sufficient it was already getting really high frames, so to say 4.5ghz "improves" gaming is NOT true. In situations where 4.0ghz was not sufficient (i.e. below 60 frames), 4.5ghz did NOT make the game any more playable either.

As far as I am concerned these benchmarks are not really meaningful for testing the impact of a CPU on high end graphics cards, because he chose a pointless resolution (he should at least have used 8x/16xAA throughout) and did not provide numbers for minimum framerates. Secondly, they fail to show that a faster cpu provides any more real world playability. So I am not sure you can show that 3.4ghz isn't sufficient.

OK .. a higher resolution just for you and a lower CPU clock - a HD4870x2 is used with e8600@stock 3.33Ghz and then OC'd to [almost 4.0Ghz] :p
- missed it by 10Mhz - a MB issue, no doubt - it is at stock vcore and temps never rise over 57C!! [my x48 better do 4.3, or else!!]


CoJ @ 19x12 Fully maxed DX10 bench inc 4xAA/16xAF

x2/e8600@3.33Ghz - 20.2/116.6/53.8
x2/e8600@3.99Ghz - 28.8/128.2/65.4


LP @ 19x12 Fully maxed DX10 bench inc 4xAA/16xAF

Snow
x2/e8600@3.33Ghz - 45.3/23/64
x2/e8600@3.99Ghz - 46.6/24/64

Cave

x2/e8600@3.33Ghz - 57.6/35/71
x2/e8600@3.99Ghz - 61.6/41/71

HL2/LC @ 19x12 - maxed DX9 in 4xAA/16xAF
x2/e8600@3.33Ghz - 217.69/105/301
x2/e8600@3.99Ghz - 236.50/109/301


FEAR: Perseus Mandate @ 19x12 - maxed DX9 inc 16xAF
with SS on - no AA
x2/e8600@3.33Ghz - 72/140/436
x2/e8600@3.99Ghz - 84/145/498

SS off/4xAA

x2/e8600@3.33Ghz - 36/183/659
x2/e8600@3.99Ghz - 41/192/667

Any more questions about CPU scaling at 19x12?
- CoJ blows your argument to hell :p
-- as do the rest of them .. 3.33Ghz is NOT enough for 4870X2 .. the minimum FPS are evidently very dependent on the CPU speed

i have lots more benches to come - they show the same thing

Well, FEAR has very obvious engine issues, with the max 20x as fast as the min and the min being so far below the average...

In HL2 you're way beyond the refresh rate of any LCD display. You could easily crank up the AA to 8x.

LP's gains are tiny.

CoJ seems to be the only game with real... tangible gains. FEAR does see a marginal gain as well.
 

Shimmishim

Elite Member
Feb 19, 2001
7,504
0
76
Great! I was going to say something about the minimum framerate, but it's good to see you showing those numbers in this second round of tests.

The bottleneck will be the minimum framerate. I remember the days of Counter-Strike where someone would throw a smoke grenade, your screen would start to lag and jump, and then you'd be dead.

But for most games where you're already getting 50+ minimum... does it really matter if you run at 3.33 or 4.0 ghz? Maybe I'm the wrong person to talk to since I don't game anymore.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Zstream

This is one of the worst arguments I have ever heard. Hell, even the most red/green fanboys provide a better argument.

The testing shows that a faster CPU improves gameplay whether you like to admit it or not. It means that the faster the CPU, the greater the FPS you will achieve.

He also used 4xAA, which is one of the most common AA and benchmark settings. Last but not least, it means we are finally at a point where the CPU is not fast enough to keep up.

Faster framerate doesn't mean better playability in every scenario. There is no difference in playing COD4 at 146 frames or 169 frames. If I put 2 identical systems in front of you, you wouldn't be able to tell me which is running faster without a frame rate counter. Furthermore, unless you are into tearing, you'd be Vsyncing that to start with. Lastly your LCD monitor can't refresh at those rates. Therefore, unless the minimum framerates are below 60 in the former scenario, you have no argument. Hence why I have a problem with him showing average frames but not minimum frames, if the purpose is to try to show improved playability at such high framerates.

On the other hand, when performance improves from 38 frames to 42 frames in Crysis you are eager to imply that 4.5ghz cpu is sufficient or somehow 'better'? Your attempt to dispute my argument provides no logical invalidation of my conclusions since you fail to show how an end user will actually benefit in either scenario. Unless a significant improvement in minimum framerates has occurred, the gaming hasn't improved in any of these benchmarks.

As to my reasoning for higher AA usage, using 8aa/16aa is for the purpose of inducing a significant workload on the graphics cards to see if in "real world" (i.e. with these cards, in situations such as 1920x1200 or 2560x1600) the cpu will provide a difference or not. Since he ran 1680x1050, another way to show increased workload is through increased AA at that level. If you play at 1680x1050 4AA on your HD 4870 X2, then we should just end this discussion right here because there is no point in trying to test this resolution for such a card. Even a single 4870 will experience "improved benchmark numbers" by any C2D CPU with faster clock speed at that resolution.
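To put the playability point in concrete terms, here is a rough back-of-the-envelope sketch (plain Python; the 60Hz refresh figure is an assumption, and the sample numbers are simply reused from this thread rather than new measurements):

# With vsync on, anything rendered above the monitor's refresh rate is never
# displayed, so a clock bump only "feels" faster when it lifts framerates
# that are still below the refresh rate.
REFRESH_HZ = 60  # assumed typical LCD refresh rate

def displayed_fps(rendered_fps, refresh_hz=REFRESH_HZ):
    # frames the monitor can actually show
    return min(rendered_fps, refresh_hz)

# COD4 example from above: 146 vs 169 average fps - both land on the 60Hz cap
print(displayed_fps(146), displayed_fps(169))    # 60 60 -> indistinguishable

# CoJ minimums from apoppin's post: 20.2 vs 28.8 - the whole gain is visible
print(displayed_fps(20.2), displayed_fps(28.8))  # 20.2 28.8 -> a change you can feel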
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: apoppin


CoJ @ 19x12 Fully maxed DX10 bench inc 4xAA/16xAF

x2/e8600@3.33Ghz - 20.2/116.6/53.8
x2/e8600@3.99Ghz - 28.8/128.2/65.4

This is the only benchmark that shows a serious improvement on the bottom end and an average frame rate of > 60 frames. The other ones already show that 3.33ghz is sufficient to provide great playability. The FEAR and HL2 benchmarks are particularly pointless. And Lost Planet is not an FPS game, so there is no perceptible difference between 57 and 60 frames in character movement.

Any more questions about CPU scaling at 19x12?
- CoJ blows your argument to hell :p
-- as do the rest of them .. 3.33Ghz is NOT enough for 4870X2 .. the minimum FPS are evidently very dependent on the CPU speed

i have lots more benches to come - they show the same thing

Of course minimum framerates are dependent on CPU speed. Not denying that. Just don't say you *need* a 4.0-4.5ghz CPU for a 4870 X2. Say you *want* the fastest cpu for the 4870 X2 to extract its maximum performance. Right now your 3.33ghz setup is able to provide 95% of the 4870 X2's performance (except in Call of Juarez) compared to the 4.0ghz system. Therefore there is no *need*.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RussianSensation
Originally posted by: apoppin


CoJ @ 19x12 Fully maxed DX10 bench inc 4xAA/16xAF

x2/e8600@3.33Ghz - 20.2/116.6/53.8
x2/e8600@3.99Ghz - 28.8/128.2/65.4

This is the only benchmark that shows a serious improvement on the bottom end and an average frame rate of > 60 frames. The other ones already show that 3.33ghz is sufficient to provide great playability. The FEAR and HL2 benchmarks are particularly pointless. And Lost Planet is not an FPS game, so there is no perceptible difference between 57 and 60 frames in character movement.

Any more questions about CPU scaling at 19x12?
- CoJ blows your argument to hell :p
-- as do the rest of them .. 3.33Ghz is NOT enough for 4870X2 .. the minimum FPS are evidently very dependent on the CPU speed

i have lots more benches to come - they show the same thing

Of course minimum framerates are dependent on CPU speed. Not denying that. Just don't say you *need* a 4.0-4.5ghz CPU for a 4870 X2. Say you *want* the fastest cpu for the 4870 X2 to extract its maximum performance. Right now your 3.33ghz setup is able to provide 95% of the 4870 X2's performance (except in Call of Juarez) compared to the 4.0ghz system. Therefore there is no *need*.

You sure hold fast to your own dogma, don't you? :p

i *need* a 4.0Ghz for 4870x2 - at 19x12; 3.33Ghz is a big waste of the X2's potential

Let's look at other games .. and focus on playability .. i will post more for your viewing pleasure tonight
 

harpoon84

Golden Member
Jul 16, 2006
1,084
0
0
I look forward to more benches, apoppin. Based on your results, the only game where gameplay would improve noticeably is CoJ; the rest are already at insane framerates that are indiscernible to most people, with the slight exception of the min framerates in FEAR, perhaps.

I understand your point that a 4GHz+ Core 2 is 'optimal' for a 4870X2; it's a very powerful GPU that requires the fastest CPUs currently available to achieve its best performance.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
Originally posted by: RussianSensation
Hence why I have a problem with him showing average frames but not minimum frames

CPU scaling of the Geforce GTX 280
At this rate the min FPS using a 4.5G CPU should be 15 or 16.

Originally posted by: Zstream
The testing shows that a faster CPU improves gameplay whether you like to admit it or not. It means that the faster the CPU, the greater the FPS you will achieve.

 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: jaredpace
Originally posted by: RussianSensation
Hence why I have a problem with him showing average frames but not minimum frames

CPU scaling of the Geforce GTX 280
At this rate the min FPS using a 4.5G CPU should be 15 or 16.

Crysis is the most GPU-bound game ever released. I'm pretty sure you don't even need a CPU to play it, as long as you have 4 or more GPUs. What does that have to do with every other game ever released, though?
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81

Unreal Engine 3 is a pretty standard engine. It scales nicely with a faster CPU.

3.0Ghz
Min:63
Avg:105

4.0Ghz
Min:73
Avg:115

Also a summary of Apoppin's minimum FPS benchmarks:

CoJ
3.33Ghz - 20.2
3.99Ghz - 28.8
LP Snow
3.33Ghz - 23
3.99Ghz - 24
Cave
3.33Ghz - 35
3.99Ghz - 41
HL2/LC
3.33Ghz - 105
3.99Ghz - 109
FEAR
3.33Ghz - 72
3.99Ghz - 84
SS off
3.33Ghz - 36
3.99Ghz - 41
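
For anyone who prefers percentages, here is a quick sketch (plain Python, using only the minimum-FPS numbers summarized above) of how much each minimum actually moved going from 3.33Ghz to 3.99Ghz:

# Minimum FPS at 3.33GHz vs 3.99GHz, copied from the summary above.
min_fps = {
    "CoJ":           (20.2, 28.8),
    "LP Snow":       (23, 24),
    "LP Cave":       (35, 41),
    "HL2/LC":        (105, 109),
    "FEAR (SS on)":  (72, 84),
    "FEAR (SS off)": (36, 41),
}

for game, (slow, fast) in min_fps.items():
    gain = (fast - slow) / slow * 100
    print(f"{game:14} {slow:>6} -> {fast:>6}  (+{gain:.0f}%)")

# CoJ gains roughly 43% on the bottom end, FEAR and LP Cave land in the
# 14-17% range, and LP Snow and HL2/LC barely move.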
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: jaredpace

3.0Ghz
Min:63
Avg:105

4.0Ghz
Min:73
Avg:115

This all reminds me of today's 4870 1GB vs. 512MB comparison at Anandtech. The numbers do not match the writeup:

Race Driver GRID Performance - "The 512MB 4870 already leads the GTX 280 in this test, but the additional RAM made a huge difference for the 1GB version as well."

Before scrolling down to the graph I was expecting 50-80% performance increase which I consider "huge". To my surprise I saw a not-so-impressive 19% performance boost and still not hitting 60 frames in a racing game. In comparison the HD 4870 X2 is getting a 58% performance advantage over the 1GB 4870 and an 88% performance increase over the 512mb version. THAT is what I consider huge.
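
As an aside, those percentages at least hang together: if the X2 leads the 512MB card by 88% and the 1GB card by 58%, the 1GB card's edge over the 512MB card falls out at roughly 19%. A two-line sanity check (plain Python, using only the percentages quoted above):

# X2 is +88% over the 4870 512MB and +58% over the 4870 1GB,
# so the 1GB card's implied lead over the 512MB card is:
x2_over_512mb, x2_over_1gb = 1.88, 1.58
print((x2_over_512mb / x2_over_1gb - 1) * 100)   # ~19.0 -> matches the "19% boost"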

I guess to each his own. To me a performance improvement isn't impressive unless there is at least a 50% speed difference. Therefore, to me the performance difference of 5-10% between a 4.0ghz or a 4.5ghz is immaterial.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Well, the 4870X2 is almost twice as expensive, and the 4870 1GB only slightly more expensive...
But yes, it wasn't exactly huge.
 

Mr Fox

Senior member
Sep 24, 2006
876
0
76
Originally posted by: MarcVenice
Look, all that these benches show is that a faster CPU gets you even more ridiculous framerates. To get everything out of a GTX 280 or HD4870X2 an e8400 still suffices, because there's no difference between 80 and 100 fps. If the minimum framerates would go up from 25 to 35, however, then you'd have a good point. But I don't think they do.



The sad fact is that while the graphics card will render frames as fast as it can, the monitor will only display them at its refresh rate... so the whole exercise is, in essence, "Jerkin Yur Gherkin".

It is a fact the benchmarkers won't admit... because it becomes hard to sell overkill.

 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Thanks OP, I agree with the thread, even if most don't. In fact, I can't believe so many fail to realise that if a CPU cannot provide enough data to the GPU, then the GPU won't reach its performance level. I saw this way back in the days of the NV GeForce 256 DDR.
I certainly don't know what else you can do to help those that can't see it; however, it's their loss, don't sweat it....
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RussianSensation
Originally posted by: jaredpace

3.0Ghz
Min:63
Avg:105

4.0Ghz
Min:73
Avg:115

This all reminds me of today's 4870 1GB vs. 512MB comparison at Anandtech. The numbers do not match the writeup:

Race Driver GRID Performance - "The 512MB 4870 already leads the GTX 280 in this test, but the additional RAM made a huge difference for the 1GB version as well."

Before scrolling down to the graph I was expecting 50-80% performance increase which I consider "huge". To my surprise I saw a not-so-impressive 19% performance boost and still not hitting 60 frames in a racing game. In comparison the HD 4870 X2 is getting a 58% performance advantage over the 1GB 4870 and an 88% performance increase over the 512mb version. THAT is what I consider huge.

I guess to each his own. To me a performance improvement isn't impressive unless there is at least a 50% speed difference. Therefore, to me the performance difference of 5-10% between a 4.0ghz or a 4.5ghz is immaterial.

agreed; something doesn't match - i get the same FPS in Grid with my 4870x2 at 19x12 as they do at 25x16 [not to mention i have the later drivers and the faster CPU] :p

Here are a few more HD4870x2/e8600 benches for you to chew on:
:cookie:

Just remember, all benches at 19x12 - with everything as maxed in-game as it can be and with 4xAA/16xAF ... the faster numbers are with my e8600@3.99Ghz over the stock 3.33Ghz; everything else is the same

Ut3: Containment
89/45/233
81/32/188

ET-QW: Salvage

98/65/165
98/64/159

Stalker: Buildings
195.62/17.06/1937.77
182.22/14.02/1631.42

Stalker: Short

173.67/35.28/1648.42
165.71/29.03/1433.11

i got stuck pla .. i mean benching Grid .. for hours .. and i did not get much of my regular benching done

More later .. but so far, i *much* prefer my e8600 at 4.0 Ghz compared to stock, and when my x48 MB allows for a further CPU OC, i expect still further performance improvement; and i am wondering how CF X3 will be affected by CPU speed .. i guess i will find out

 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: RussianSensation
To me a performance improvement isn't impressive unless there is at least a 50% speed difference. Therefore, to me the performance difference of 5-10% between a 4.0ghz or a 4.5ghz is immaterial.

I'm willing to bet that no hardware upgrade you've made in your life has given you a 50% performance boost, unless of course you use multi-threaded software, and upgraded to your highly overclocked Q6600 from a non overclocked, slower A64 X2.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: myocardia
Originally posted by: RussianSensation
To me a performance improvement isn't impressive unless there is at least a 50% speed difference. Therefore, to me the performance difference of 5-10% between a 4.0ghz or a 4.5ghz is immaterial.

I'm willing to bet that no hardware upgrade you've made in your life has given you a 50% performance boost, unless of course you use multi-threaded software, and upgraded to your highly overclocked Q6600 from a non overclocked, slower A64 X2.

Heck, everyone is neglecting the main point:

it is a FREE performance increase


 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: myocardia

I'm willing to bet that no hardware upgrade you've made in your life has given you a 50% performance boost, unless of course you use multi-threaded software, and upgraded to your highly overclocked Q6600 from a non overclocked, slower A64 X2.

1. AXP 1600+ ATI Rage 128 -> P4 2.6ghz @ 3.2ghz with HT Radeon 8500 >>100%

2. P4 2.6@ 3.2ghz Radeon 8500 -> E6400 2.13@ 3.4ghz GeForce 6600 >>100%

3. E6400 2.13@ 3.4ghz GeForce 6600 -> Q6600 @ 3.4ghz GeForce 8800GTS 320 >> 75% in encoding (of course in other tasks almost immaterial), >>100% in gaming

I've only upgraded 3x since 2001.

Obviously if you are upgrading from a dual core at 3.4 to a quad at 3.4, the difference will only be realizable in multi-threaded tasks. It only cost me $200 in August 2007 to both upgrade the cpu to a quad and replace the 6600 with the 8800GTS. So to me it was worth it.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I've only upgraded 3x since 2001.

well duh :p

no wonder you are so easily impressed/unimpressed

i've upgraded 3x since 06


as far as i can see it is a downgrade going from 2 cores to 4 - for gaming - if you OC your CPU

since you don't, you *attempt to minimize* the CPU's role in Frames per second

we just come from completely different PoVs about HW

i can't agree with you anymore




 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Originally posted by: RussianSensation
Originally posted by: myocardia

I'm willing to bet that no hardware upgrade you've made in your life has given you a 50% performance boost, unless of course you use multi-threaded software, and upgraded to your highly overclocked Q6600 from a non overclocked, slower A64 X2.

1. AXP 1600+ ATI Rage 128 -> P4 2.6ghz @ 3.2ghz with HT Radeon 8500 >>100%

2. P4 2.6@ 3.2ghz Radeon 8500 -> E6400 2.13@ 3.4ghz GeForce 6600 >>100%

3. E6400 2.13@ 3.4ghz GeForce 6600 -> Q6600 @ 3.4ghz GeForce 8800GTS 320 >> 75% in encoding, >>100% in gaming

I've only upgraded 3x since 2001.

In case you hadn't realized it, those were complete system rebuilds. So like I said to begin with, no hardware upgrade has gotten you a 50+% performance boost. edit: My point here is those are multiple upgrades, not a single upgrade. BTW, you should upgrade more often.;)
 

n7

Elite Member
Jan 4, 2004
21,281
4
81
Originally posted by: apoppin
Originally posted by: RussianSensation
Originally posted by: jaredpace

3.0Ghz
Min:63
Avg:105

4.0Ghz
Min:73
Avg:115

This all reminds me of today's 4870 1GB vs. 512MB comparison at Anandtech. The numbers do not match the writeup:

Race Driver GRID Performance - "The 512MB 4870 already leads the GTX 280 in this test, but the additional RAM made a huge difference for the 1GB version as well."

Before scrolling down to the graph I was expecting 50-80% performance increase which I consider "huge". To my surprise I saw a not-so-impressive 19% performance boost and still not hitting 60 frames in a racing game. In comparison the HD 4870 X2 is getting a 58% performance advantage over the 1GB 4870 and an 88% performance increase over the 512mb version. THAT is what I consider huge.

I guess to each his own. To me a performance improvement isn't impressive unless there is at least a 50% speed difference. Therefore, to me the performance difference of 5-10% between a 4.0ghz or a 4.5ghz is immaterial.

agreed; something doesn't match - i get the same FPS in Grid with my 4870x2 at 19x12 as they do at 25x16 [not to mention i have the later drivers and the faster CPU] :p

Here are a few more HD4870x2/e8600 benches for you to chew on:
:cookie:

Just remember, all benches at 19x12 - with everything as maxed in-game as it can be and with 4xAA/16xAF ... the faster numbers are with my e8600@3.99Ghz over the stock 3.33Ghz; everything else is the same

Ut3: Containment
89/45/233
81/32/188

ET-QW: Salvage

98/65/165
98/64/159

Stalker: Buildings
195.62/17.06/1937.77
182.22/14.02/1631.42

Stalker: Short

173.67/35.28/1648.42
165.71/29.03/1433.11

i got stuck pla .. i mean benching Grid .. for hours .. and i did not get much of my regular benching done

More later .. but so far, i *much* prefer my e8600 at 4.0 Ghz compared to stock, and when my x48 MB allows for a further CPU OC, i expect still further performance improvement; and i am wondering how CF X3 will be affected by CPU speed .. i guess i will find out


GRID rox :D

I discovered UT3 loves quads; the higher clocked, the better.

Your numbers show a huge difference in minimum fps, which is exactly why OCing can make a difference.
Of course, not all games will see a big difference, but for those that do, it gives us reason to OC :)