Ok, I had the opportunity to read the entire thread. After going through it thoroughly, I have a few concerns.
1) Your numbers do not match what this web site indicates.
Your numbers show drastic differences - the web review does not show anything close to what yours do, percentage-wise, of course. Now, I am aware that they are using a quad core, so I decided to look into the impact of a quad core versus a dual core for 'Far Cry 2' specifically. So here is what I came up with as a defense for using quad-core numbers.
A) The review used a Core i7. Some might say that it can do 8 threads. True, but with Far Cry 2, enabling HT causes a 5% performance hit, as found here.
In Far Cry, at 640x480, it was demonstrated here that moving from one to two cores resulted in an 80% performance increase, while adding two more cores only gave another 22% increase. So a quad core at 1392 MHz would be roughly equivalent to a 1700 MHz Core 2, which is only 100 MHz more than your test setup. So why do your tests show such drastic differences when theirs do not? Food for thought.
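To sanity-check that equivalence, here is the arithmetic sketched out in a few lines (just the percentages above plugged in, assuming performance scales roughly linearly with clock speed):

```python
# Core-scaling figures from the linked benchmark:
# one -> two cores: +80% performance; two -> four cores: a further +22%.
dual_scaling = 1.80          # dual core relative to a single core
quad_scaling = 1.80 * 1.22   # quad core relative to a single core

# Assuming performance scales linearly with clock speed, a quad core
# at 1392 MHz matches a dual core running at:
equivalent_dual_clock = 1392 * (quad_scaling / dual_scaling)
print(round(equivalent_dual_clock))  # ~1698 MHz, i.e. roughly 1700 MHz
```

The quad's extra 22% over a dual is exactly the clock headroom the dual would need to catch up, which is where the "1700 MHz Core 2" figure comes from.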
2) There were many people who posted information contrary to your results. That alone does not make you wrong, but the fact that you did not really address their posts makes me wonder...
So I thought about it a while and pondered whether it was worth my time to do any of this testing. I decided, you know, perhaps I am wrong. I have been wrong before, and I have no problem coming to terms with it. So I figured I would run my own tests.
The first game I wanted to fire up was F.E.A.R. It is on my HTPC box, and unfortunately my plasma is only 1360x768. But, if anything, that favors your position, not mine. I was basically too lazy to move over the higher-resolution display I tested with last time, 1.5 years ago. So, again, 1360x768 is the resolution I used, which is less stressful on the video card than even 1280x1024. Keep that in mind.
F.E.A.R.
All Settings set to Maximum Values in-game.
Drivers 182.50
Resolution 1360x768
nVidia CP is forcing 4X AA, 16X AF, TSAA
GTS250 Clocked at 750/1811/2200
E5200 @ 1.8 GHz
59 Min
113 Avg
248 Max
E5200 @ 3.16 GHz
59 Min
129 Avg
337 Max
Nothing was gained on the minimum frames per second, despite increasing the CPU clock speed by 75%. We did gain slightly on the average, but not that much, and as for the maximum, yes, the faster CPU clearly dominates. Another thing to understand is that this is an E5200, meaning its performance has been heavily butchered compared to a full Core 2; an X2 at around 2.0 GHz would perform about the same as this chip does at 1.8 GHz.
So, I decided to try another game. I happened to have my Far Cry 2 CD waiting for me to install it. So I installed it and ran my tests. I was expecting the results you posted, but came up with something quite different.
Far Cry 2
DX10 - Ultra Settings
Drivers 182.50
Resolution 1360x768
nVidia CP is forcing 4X AA, 16X AF, TSAA
GTS250 Clocked at 750/1811/2200
E5200 @ 1.8 GHz
16 Min
24 Avg
38 Max
E5200 @ 3.16 GHz
14 Min
27.5 Avg
46 Max
These results had me stumped. I had to rerun the test at both clock speeds, but I came up with the same averages. This is rather strange, and I'm not entirely sure why. But after running it several times back and forth, I have to conclude that a crippled Pentium E5200 at 1.8 GHz is 'ok' for the most part.
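For what it's worth, here is a quick sketch of the scaling math from my two runs above (just the percentage arithmetic on the averages, nothing more):

```python
def pct_gain(slow, fast):
    """Percentage increase from the 1.8 GHz run to the 3.16 GHz run."""
    return (fast - slow) / slow * 100

cpu_gain = pct_gain(1.8, 3.16)        # CPU clock increase
fear_avg_gain = pct_gain(113, 129)    # F.E.A.R. average FPS
fc2_avg_gain = pct_gain(24, 27.5)     # Far Cry 2 average FPS

print(f"CPU clock:     +{cpu_gain:.1f}%")   # +75.6%
print(f"F.E.A.R. avg:  +{fear_avg_gain:.1f}%")  # +14.2%
print(f"Far Cry 2 avg: +{fc2_avg_gain:.1f}%")   # +14.6%
```

In both games, a roughly 76% higher clock bought only about a 14-15% higher average, which is exactly why I call the 1.8 GHz chip 'ok' at these settings.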
A few more comments I want to make.
1) There are in fact games out there where CPU speed matters. I won't deny this. Crysis, FSX and Lost Planet are three of them, off the top of my head.
2) I do not want to invalidate the results that toyota has posted, because I don't know him. It would be pretty crazy for me to call him a liar, or assassinate his character. The only thing I can say is that his results did not line up with mine, or VR-Zone's. Why is that? No clue.
3) This is with a GTS250. I will also say that if I put a beefier card in here (a GTX 280), I would likely see a difference between these processors. So I will concede that as well.
4) If I take off AA, AF, and all the other eye candy, I will also agree that the CPU will hinder the graphics card. But why do we buy these cards if we're willing to put up with jagged edges and foliage that shimmers? I buy these cards so I can run extreme AA, AF, TSAA and anything else I can throw at them.
But after all of this, I feel that Spike said it best in post #2 (go back and read it if you want to know), because even if the CPU is not allowing the GPU to fully stretch its legs, that is not a bad thing if overall performance increases with the insertion of a new video card.
I don't really argue on internet forums much anymore, if at all, because I found it to be fruitless. So if you are interested in exchanging blows, I am going to leave the ring right now.
Edit ** For some reason the third link does not go through. I may decide to TinyURL it.
Edit 2 ** Ok, I fixed the link using TinyURL