Originally posted by: jaredpace
http://www.legionhardware.com/document.php?id=717&p=7
F.E.A.R.
E8400 3.0GHz stock - 106 fps
E8400 4.2GHz overclock - 130.4 fps
So you are telling me you can tell the difference between 130 and 106 average frames per second in F.E.A.R.? That's interesting, because I know of no monitor that can actually display 106 or 130 frames per second at 1920x1200...
Not to mention that Medium settings were enabled in the game, not High/Ultra, and with 0xAA. This isn't a realistic gaming scenario: if a gamer already gets 100+ fps in F.E.A.R., he or she would raise the settings as high as possible while still sustaining playability.
Crysis
3.0GHz - 44 fps
4.2GHz - 51.3 fps
I'd say you're incorrect.
What does that tell us about minimum framerates? The 3.0GHz processor could have had a minimum of 15 fps with an average of 44 and a high of 55. The 4.2GHz processor could have had a minimum of 16 fps with an average of 51, thanks to a larger high of, say, 70. In both cases the game would be choppy. Even though the average frames are higher, like you said, does that really make Crysis more playable?
In any event, no AA was used and the settings were on Medium...
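To put rough numbers on that, here is a quick Python sketch. The per-second fps samples are entirely made up to match the hypothetical minimums and averages above (they are not from the Legion Hardware review); the point is simply that a run can average 51.3 fps and still dip to 16, just as a 44 fps run dips to 15.

def summarize(label, fps_samples):
    # Print min / average / max fps for a run of per-second fps samples.
    avg = sum(fps_samples) / len(fps_samples)
    print(f"{label}: min={min(fps_samples)} avg={avg:.1f} max={max(fps_samples)}")

# Made-up Crysis runs matching the hypothetical figures in the post above.
stock_3_0ghz = [15, 40, 44, 48, 50, 55, 44, 42, 47, 55]   # min 15, avg 44.0
oc_4_2ghz    = [16, 45, 50, 55, 60, 70, 50, 48, 52, 67]   # min 16, avg 51.3

summarize("E8400 @ 3.0GHz", stock_3_0ghz)
summarize("E8400 @ 4.2GHz", oc_4_2ghz)
# Both runs still dip into the mid-teens, so both would feel choppy in the same
# spots, even though the overclocked average looks ~7 fps better on paper.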
To me, CPU gaming benchmarks are almost always inherently biased, since they don't show real-world conditions. They simply show how fast a CPU would be if you removed every other bottleneck (such as the videocard). Very few websites do proper CPU benchmarking, where they test CPUs at the settings we actually play games at and compare minimum framerates as well. That is far more important. However, if you can find me benchmarks with everything on High, at a resolution of at least 1600x1200 with 4xAA/16xAF, in the latest games (Crysis, BioShock, World in Conflict), where a C2D at 4.0GHz makes a game more "playable" than a C2D at 3.0GHz, then I'll believe you.