Originally posted by: RussianSensation
Originally posted by: jaredpace
Many people tend to think a 3.4-3.6GHz CPU is just fine for gaming, dual- or quad-core. This is true if you're running a single 8800GT or slower video card, or if you're bound to resolutions below 1280x1024. The faster the CPU, the better - and here are the benches to prove it (using the two fastest graphics cards to test the CPU).
The games tested are all at 1680x1050 resolution...a resolution no one uses with a GTX 280 and especially a 4870 X2. On top of that, in situations where 4.0GHz was sufficient it was already getting really high frames, so to say 4.5GHz "improves" gaming is NOT true. In situations where 4.0GHz was not sufficient (i.e. below 60 frames), 4.5GHz did NOT make the game any more playable either.
As far as I am concerned these benchmarks are not really meaningful for testing the impact of a CPU on high-end graphics cards, because he chose a pointless resolution (at the very least he should have used 8x/16xAA throughout) and did not provide numbers for minimum framerates. Secondly, they fail to show that a faster CPU provides any more real-world playability. So I am not sure you have shown that 3.4GHz isn't sufficient.
Originally posted by: apoppin
Originally posted by: RussianSensation
Originally posted by: jaredpace
Many people tend to think a 3.4-3.6GHz CPU is just fine for gaming, dual- or quad-core. This is true if you're running a single 8800GT or slower video card, or if you're bound to resolutions below 1280x1024. The faster the CPU, the better - and here are the benches to prove it (using the two fastest graphics cards to test the CPU).
The games tested are all at 1680x1050 resolution...a resolution no one uses with a GTX 280 and especially a 4870 X2. On top of that, in situations where 4.0GHz was sufficient it was already getting really high frames, so to say 4.5GHz "improves" gaming is NOT true. In situations where 4.0GHz was not sufficient (i.e. below 60 frames), 4.5GHz did NOT make the game any more playable either.
As far as I am concerned these benchmarks are not really meaningful for testing the impact of a CPU on high-end graphics cards, because he chose a pointless resolution (at the very least he should have used 8x/16xAA throughout) and did not provide numbers for minimum framerates. Secondly, they fail to show that a faster CPU provides any more real-world playability. So I am not sure you have shown that 3.4GHz isn't sufficient.
OK .. a higher resolution just for you and a lower CPU clock - a HD4870x2 is used with the e8600 @ stock 3.33GHz and then OC'd to [almost 4.0GHz]
- missed it by 10MHz - a MB issue, no doubt - it is at stock vcore and temps never rise over 57C!! [my x48 better do 4.3!; or else!!]
CoJ @ 19x12 Fully maxed DX10 bench inc 4xAA/16xAF
x2/e8600@3.33Ghz - 20.2/116.6/53.8
x2/e8600@3.99Ghz - 28.8/128.2/65.4
LP @ 19x12 Fully maxed DX10 bench inc 4xAA/16xAF
Snow
x2/e8600@3.33Ghz - 45.3/23/64
x2/e8600@3.99Ghz - 46.6/24/64
Cave
x2/e8600@3.33Ghz - 57.6/35/71
x2/e8600@3.99Ghz - 61.6/41/71
HL2/LC @ 19x12 - maxed DX9 in 4xAA/16xAF
x2/e8600@3.33Ghz - 217.69/105/301
x2/e8600@3.99Ghz - 236.50/109/301
FEAR: Perseus Mandate @ 19x12 - maxed DX9 inc 16xAF
with SS on - no/AA
x2/e8600@3.33Ghz - 72/140/436
x2/e8600@3.99Ghz - 84/145/498
SS off/4xAA
x2/e8600@3.33Ghz - 36/183/659
x2/e8600@3.99Ghz - 41/192/667
Any more questions about CPU scaling at 19x12?
- CoJ blows your argument to hell
-- as do the rest of them .. 3.33Ghz is NOT enough for 4870X2 .. the minimum FPS are evidently very dependent on the CPU speed
i have lots more benches to come - they show the same thing
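The CoJ numbers above can be turned into percent gains with a quick sketch (the helper below is purely illustrative, not from the thread; the three values per run are taken exactly as posted):

```python
# Percent gains from the CoJ run above: 3.33GHz -> 3.99GHz on the e8600,
# values copied as posted (20.2/116.6/53.8 vs 28.8/128.2/65.4)
def pct_gain(slow, fast):
    """Percent improvement going from the slow-CPU result to the fast-CPU result."""
    return round((fast - slow) / slow * 100, 1)

coj_333 = (20.2, 116.6, 53.8)
coj_399 = (28.8, 128.2, 65.4)
gains = [pct_gain(s, f) for s, f in zip(coj_333, coj_399)]
print(gains)  # -> [42.6, 9.9, 21.6]
```

The first figure (20.2 -> 28.8) is the one doing the arguing here: a roughly 43% jump on the low end just from the CPU overclock.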
Originally posted by: Zstream
This is one of the worst arguments I have ever heard. Hell, even the most red/green fanboys provide a better argument.
The testing shows that the faster CPU improves gameplay, whether you like to admit it or not. It means that the faster the CPU, the greater the FPS you will achieve.
He also used 4xAA, which is one of the most common AA and benchmark settings. Finally, it means we are at a point where the CPU is not fast enough to keep up.
Originally posted by: RussianSensation
Originally posted by: apoppin
CoJ @ 19x12 Fully maxed DX10 bench inc 4xAA/16xAF
x2/e8600@3.33Ghz - 20.2/116.6/53.8
x2/e8600@3.99Ghz - 28.8/128.2/65.4
This is the only benchmark that shows a serious improvement on the bottom end and an average frame rate of >60 frames. The other ones already show that 3.33GHz is sufficient to provide great playability. The FEAR and HL2 benchmarks are particularly pointless. And Lost Planet is not an FPS game, in which case there is no perceptible difference between 57 and 60 frames in character movement.
Any more questions about CPU scaling at 19x12?
- CoJ blows your argument to hell
-- as do the rest of them .. 3.33Ghz is NOT enough for 4870X2 .. the minimum FPS are evidently very dependent on the CPU speed
i have lots more benches to come - they show the same thing
Of course minimum framerates are dependent on CPU speed. Not denying that. Just don't say you *need* a 4.0-4.5GHz CPU for a 4870 X2. Say you *want* the fastest CPU for the 4870 X2 to extract its maximum performance. Right now your 3.33GHz setup is able to provide 95% (except in Call of Juarez) of the performance of the 4870 X2 compared to the 4.0GHz system. Therefore there is no *need*.
Originally posted by: jaredpace
Originally posted by: RussianSensation
Hence why I have a problem with him showing average frames but not minimum frames
CPU scaling of the Geforce GTX 280
At this rate, the min FPS gain using a 4.5GHz CPU should be 15 or 16 frames.
Originally posted by: jaredpace
3.0Ghz
Min:63
Avg:105
4.0Ghz
Min:73
Avg:115
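A rough linear extrapolation of those minimums (an illustrative sketch, not a benchmark; real scaling flattens once the GPU becomes the bottleneck) lands close to the "15 or 16" figure mentioned above:

```python
# Linear extrapolation of minimum FPS vs. CPU clock, using the two data
# points posted above: 3.0GHz -> min 63, 4.0GHz -> min 73
def extrapolate(clk1, fps1, clk2, fps2, target):
    """Project an FPS value at `target` GHz from two measured (clock, fps) points."""
    slope = (fps2 - fps1) / (clk2 - clk1)
    return fps1 + slope * (target - clk1)

print(extrapolate(3.0, 63, 4.0, 73, 4.5))  # -> 78.0, i.e. ~15 frames over the 3.0GHz minimum
```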
Originally posted by: MarcVenice
Look, all these benches show is that a faster CPU gets you even more ridiculous framerates. To get everything out of a GTX 280 or HD4870X2 an e8400 still suffices, because there's no difference between 80 and 100 fps. If the minimum framerates went up from 25 to 35, however, you'd have a good point. But I don't think they do.
Originally posted by: RussianSensation
Originally posted by: jaredpace
3.0Ghz
Min:63
Avg:105
4.0Ghz
Min:73
Avg:115
This all reminds me of today's 4870 1GB vs. 512MB comparison at Anandtech. The numbers do not match the writeup:
[i]Race Driver GRID Performance[/i] - "The 512MB 4870 already leads the GTX 280 in this test, but the additional RAM made a huge difference for the 1GB version as well."
Before scrolling down to the graph I was expecting a 50-80% performance increase, which I consider "huge". To my surprise I saw a not-so-impressive 19% performance boost, still not hitting 60 frames in a racing game. In comparison, the HD 4870 X2 gets a 58% performance advantage over the 1GB 4870 and an 88% performance increase over the 512MB version. THAT is what I consider huge.
I guess to each his own. To me a performance improvement isn't impressive unless there is at least a 50% speed difference. Therefore, to me the performance difference of 5-10% between 4.0GHz and 4.5GHz is immaterial.
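Those three quoted percentages do at least chain consistently; a quick sanity check (the helper and the baseline value are hypothetical, not from the Anandtech article):

```python
def speedup(base, new):
    """Percent speedup of `new` over `base` (e.g. average FPS)."""
    return (new / base - 1) * 100

# chain the two quoted steps: 512MB -> 1GB (+19%), then 1GB -> X2 (+58%)
fps_512 = 100.0              # arbitrary baseline for illustration
fps_1gb = fps_512 * 1.19
fps_x2 = fps_1gb * 1.58
print(round(speedup(fps_512, fps_x2)))  # -> 88, matching the quoted X2-vs-512MB figure
```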
Originally posted by: myocardia
Originally posted by: RussianSensation
To me a performance improvement isn't impressive unless there is at least a 50% speed difference. Therefore, to me the performance difference of 5-10% between a 4.0ghz or a 4.5ghz is immaterial.
I'm willing to bet that no hardware upgrade you've made in your life has given you a 50% performance boost - unless, of course, you use multi-threaded software and upgraded to your highly overclocked Q6600 from a non-overclocked, slower A64 X2.
Originally posted by: RussianSensation
Originally posted by: myocardia
I'm willing to bet that no hardware upgrade you've made in your life has given you a 50% performance boost - unless, of course, you use multi-threaded software and upgraded to your highly overclocked Q6600 from a non-overclocked, slower A64 X2.
1. AXP 1600+ / ATI Rage 128 -> P4 2.6GHz @ 3.2GHz with HT / Radeon 8500: >>100%
2. P4 2.6GHz @ 3.2GHz / Radeon 8500 -> E6400 2.13GHz @ 3.4GHz / GeForce 6600: >>100%
3. E6400 2.13GHz @ 3.4GHz / GeForce 6600 -> Q6600 @ 3.4GHz / GeForce 8800GTS 320: >>75% in encoding, >>100% in gaming
I've only upgraded 3x since 2001.
Originally posted by: apoppin
Originally posted by: RussianSensation
Originally posted by: jaredpace
3.0Ghz
Min:63
Avg:105
4.0Ghz
Min:73
Avg:115
This all reminds me of today's 4870 1GB vs. 512MB comparison at Anandtech. The numbers do not match the writeup:
[i]Race Driver GRID Performance[/i] - "The 512MB 4870 already leads the GTX 280 in this test, but the additional RAM made a huge difference for the 1GB version as well."
Before scrolling down to the graph I was expecting a 50-80% performance increase, which I consider "huge". To my surprise I saw a not-so-impressive 19% performance boost, still not hitting 60 frames in a racing game. In comparison, the HD 4870 X2 gets a 58% performance advantage over the 1GB 4870 and an 88% performance increase over the 512MB version. THAT is what I consider huge.
I guess to each his own. To me a performance improvement isn't impressive unless there is at least a 50% speed difference. Therefore, to me the performance difference of 5-10% between 4.0GHz and 4.5GHz is immaterial.
agreed; something doesn't match - i get the same FPS in GRID with my 4870x2 at 19x12 as they do at 25x16 [not to mention i have newer drivers and a faster CPU]
Here are a few more HD4870x2/e8600 benches for you to chew on:
Just remember, all benches are at 19x12 - with everything maxed in-game as far as it will go and with 4xAA/16xAF ... the faster numbers are with my e8600@3.99GHz over the stock 3.33GHz; everything else is the same
Ut3: Containment
x2/e8600@3.99Ghz - 89/45/233
x2/e8600@3.33Ghz - 81/32/188
ET-QW: Salvage
x2/e8600@3.99Ghz - 98/65/165
x2/e8600@3.33Ghz - 98/64/159
Stalker: Buildings
x2/e8600@3.99Ghz - 195.62/17.06/1937.77
x2/e8600@3.33Ghz - 182.22/14.02/1631.42
Stalker: Short
x2/e8600@3.99Ghz - 173.67/35.28/1648.42
x2/e8600@3.33Ghz - 165.71/29.03/1433.11
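The Stalker minimums above scale noticeably with the overclock; a small sketch of the deltas (assuming avg/min/max ordering and, per the note above, that the faster figures are the 3.99GHz run; the helper itself is hypothetical):

```python
# Minimum-FPS gain from the Stalker runs above (stock 3.33GHz vs 3.99GHz)
def min_fps_gain(min_stock, min_oc):
    """Percent improvement in minimum FPS from the overclock."""
    return round((min_oc - min_stock) / min_stock * 100, 1)

print(min_fps_gain(14.02, 17.06))  # Buildings: -> 21.7 (% higher minimum at 3.99GHz)
print(min_fps_gain(29.03, 35.28))  # Short: -> 21.5
```

In both scenes the minimums move by over 20% while the averages move by under 10%, which is the crux of apoppin's argument.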
i got stuck pla .. i mean benching GRID .. for hours .. and i did not get much of my regular benching done
More later .. but so far, i *much* prefer my e8600 at 4.0GHz compared to stock, and when my x48 MB allows a further CPU OC i expect still more performance improvement; and i am wondering how CF X3 will be affected by CPU speed .. i guess i will find out