Does the CPU matter in gaming anymore?

The forums here have been full of derp for a while now, and so has every other forum on the internet. Trolling and derping (and nerd rage and shilling) are at an all-time high. It's an epidemic that mods and admins aren't immune to either.

Unfortunately I have to agree with you. At one time I could rely on these forums for useful, unbiased advice. Now the CPU forum especially seems to have been taken over by a few posters who are determined to show their favorite company has the best product, no matter what.
 

dastral

Member
May 22, 2012
@dastral
Ohhhh, you are so wrong, you're looking at it from the wrong angle. BD is such a huge fail in gaming...
Games need 2-4 cores (some can use more), but they benefit more from IPC than from castrated BD modules.

Well, BD is better for heavily MT games if you only have $150 to spend on the CPU (BF3 comes to mind).

No, I'm not saying the FX is better than an i3-2120 for StarCraft, but it might be.
Then again, this makes me think it won't be: http://www.xbitlabs.com/articles/cpu/display/amd-fx-8120-6100-4100_6.html#sect0
An FX-4170 goes to 4.2/4.3GHz and is unlocked, unlike the i3-2120...
I mean, if you manage to OC even a little bit, you'll end up with 4.6GHz vs 3.1GHz.
That's a pretty big difference, which could make up for the lower IPC.
My biggest concern would be the TDP...

Now, Sandy (and even more so Ivy) is much better than BD, but under a few specific conditions I could see BD being tied or even ahead.
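As a back-of-envelope check of that clocks-vs-IPC tradeoff (assuming performance scales simply as clock × IPC, which is a simplification, and using the hypothetical 4.6GHz OC above):

```python
# Rough break-even estimate: how much lower can the FX's per-clock
# performance (IPC) be before a 4.6GHz FX-4170 falls behind a stock
# 3.1GHz i3-2120? Assumes performance ~ clock * IPC (a simplification).
fx_clock = 4.6  # GHz, overclocked FX-4170 (hypothetical OC from the post)
i3_clock = 3.1  # GHz, stock i3-2120

# FX IPC relative to the i3's at which both chips tie
break_even_ipc = i3_clock / fx_clock
print(f"FX ties the i3 at {break_even_ipc:.0%} of its IPC")  # ~67%
```

In other words, on this simple model the FX could have roughly a third lower IPC and still match the i3 once overclocked, which is why the TDP, not raw speed, ends up being the concern.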
 

Yuriman

Diamond Member
Jun 25, 2004
Just did a few tests with my Ivy Bridge in Guild Wars 2.

System:
3570K
HD4870 512MB Crossfire
8GB DDR3 1600 9-9-9-24
Intel X25m G2 (80gb)

I'm running max settings aside from shadows (which are on high), with no AA enabled. These are the settings I play with every day, and they're very close to what autodetect picked (I raised shadow detail from medium to high).

For the first test, I picked a relatively quiet part of Lion's Arch with a lot of graphical detail and very few players running across the screen (Lion's Arch is normally a busy district, though).

4.4GHz: 43fps
3.6GHz: 43fps
3.0GHz: 43fps
2.6GHz: 42fps
2.2GHz: 37fps

The second test was at Vigil Keep looking out over the plains. Graphical detail was moderate. There were no players around here.

4.4GHz: 65fps
3.6GHz: 65fps
3.0GHz: 65fps
2.6GHz: 65fps
2.2GHz: 56fps

In the third test I went into the Heart of the Mists (the PvP staging area) and found a nice spot to stare at the ground. There were around 25 players, mostly standing around within viewing distance (had I not been staring at the ground). My framerate was fairly variable, +/- 5fps or so, but I did my best to pick a good average over about 30 seconds. My results:

4.4GHz: 82fps
3.6GHz: 82fps
3.0GHz: 72fps
2.6GHz: 62fps
2.2GHz: 54fps


Interestingly, a 3570K still bottlenecks a pair of old HD4870s in a player-busy area until it reaches 3.6GHz. I'm willing to bet there are places in this game where going from 3.6GHz to 4.4GHz would provide an improvement. Now, you might say, "well, they're all above 60fps except the 2.2GHz result," but what about large "raid" groups? I've had 50+ players on screen at once beating on a monster. How about PvP, where 10 players are onscreen lobbing fireballs at each other and jumping around? I'm sure those are more demanding scenarios, which suggests that when graphics aren't your limiting factor (and you can always drop visual detail to make sure they're not), even the best gaming chip you can buy won't keep you above 60fps in all scenarios unless it's overclocked.

In my last test, framerate scaled almost linearly with clockspeed. Where does that put a Bulldozer chip, which might have as little as 2/3 the single-threaded performance?

Theoretically, if a Bulldozer chip is 2/3 as fast in games per clock as an Ivy Bridge CPU (a hypothetical assumption, I don't know how much slower it is), it would need about 5.4GHz to match the 82fps plateau a 3.6GHz Ivy reaches in Heart of the Mists, and even holding a steady 60fps would take roughly 3.8GHz, regardless of GPU.
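Plugging the measured Heart of the Mists numbers into that reasoning (both assumptions are hypothetical: fps scales linearly with clock below the GPU cap, and Bulldozer has 2/3 of Ivy Bridge's per-clock speed):

```python
# Estimate the clock a hypothetical Bulldozer chip would need to match
# the Heart of the Mists results, assuming fps scales linearly with
# clock below the GPU cap and BD has 2/3 of Ivy Bridge's per-clock speed.
ivy_results = {2.2: 54, 2.6: 62, 3.0: 72, 3.6: 82}  # GHz -> fps (measured)
bd_relative_ipc = 2 / 3  # hypothetical assumption from the post

# fps per GHz in the clearly CPU-limited region (use the 2.6GHz point)
fps_per_ghz_ivy = ivy_results[2.6] / 2.6
fps_per_ghz_bd = fps_per_ghz_ivy * bd_relative_ipc

clock_for_60fps = 60 / fps_per_ghz_bd         # hold a steady 60fps
clock_for_plateau = 3.6 / bd_relative_ipc     # match the 82fps ceiling

print(f"BD clock for 60fps: ~{clock_for_60fps:.1f}GHz")      # ~3.8GHz
print(f"BD clock to match 3.6GHz Ivy: ~{clock_for_plateau:.1f}GHz")  # ~5.4GHz
```

On these assumptions, a Bulldozer part could scrape 60fps around 3.8GHz, but matching the fully CPU-unconstrained 82fps would indeed take about 5.4GHz.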
 