What's the CPU bottleneck for game performance?

migo

Junior Member
Nov 23, 2007
21
0
0
I've found with GPUs, when comparing cards from ATI and nVidia and between various generations, that the performance difference seems to come from the memory interface. 256-bit seems to be the minimum (I suppose it could just be that the 256-bit interface correlates with another feature that's more important, but at least it seems to be a good predictor of GPU performance).

I haven't figured out something similar for CPUs - back in the Celeron/P2 days it was easy - on-die cache was the best, followed by off-die and no L2 cache. Core count isn't really a useful indicator either, as there are hardly any single-core CPUs anyway. Is there something I can look for in CPU specs that is a good indicator of CPU gaming performance?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I've found with GPUs, when comparing cards from ATI and nVidia and between various generations, that the performance difference seems to come from the memory interface. 256-bit seems to be the minimum (I suppose it could just be that the 256-bit interface correlates with another feature that's more important, but at least it seems to be a good predictor of GPU performance).

I haven't figured out something similar for CPUs - back in the Celeron/P2 days it was easy - on-die cache was the best, followed by off-die and no L2 cache. Core count isn't really a useful indicator either, as there are hardly any single-core CPUs anyway. Is there something I can look for in CPU specs that is a good indicator of CPU gaming performance?
Clock speed is the greatest single indicator. I just did a review on it comparing the Core i7-920 vs. Phenom II X4 980 BE; it continues a series that has compared many variables using the Athlon II, Phenom II X2/X3/X4, E8600, Q9550S and Core i7, all the way back to the E4300.
 
Last edited:

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Clock speed is the greatest single indicator.

Yeah, and we know that all too well from the NetBurst days. As for your review, I find it somewhat biased. You always seem to diminish Nehalem's advantage.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Yeah, and we know that all too well from the NetBurst days. As for your review, I find it somewhat biased. You always seem to diminish Nehalem's advantage.
In what way?

My evaluations are focused narrowly on *practical* PC gaming performance with about 30 games and at resolutions that gamers use - 1680x1050 and above with high details and AA/AF.

Most tech sites do very general reviews with few games, low resolution and medium details to over-emphasize the i7's advantage.

My evaluations conclude that Phenom II - or Penryn - is fine for gaming; as long as you have a capable CPU, the graphics card makes the most difference in PC gaming.
 
Last edited:

migo

Junior Member
Nov 23, 2007
21
0
0
So at the same price, would a Phenom II X2 BE that can be easily overclocked outperform a Phenom II X4 running at the same base clock speed without overclocking?
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
no, clock rate does not trump all, not if the game can make proper use of more than 2 threads, which describes just about any modern game, whether it be a shooter like Bad Company 2 or an RTS like Civ V or SC2. Such games will run like a dog on a dual core at the same settings where a quad will run like butter.

as a base you really want at least a quad core unless you know you're only going to be playing older games. There are some games that can make use of more than 4 threads, legitimizing 6-core CPUs or even dual-CPU motherboards; however, these games are still very much in the minority, so right now it's still better overall to go for a fast quad than to worry about more than 4 cores.

The next most important factor is clock rate, followed closely by L2/L3 cache size.




Also, you really can't take the memory interface at face value as a measuring stick for video card performance. During the Radeon HD 4800 vs. GeForce GTX 200 days we saw a Radeon's 256-bit bus compete with a GeForce's 512-bit bus, because the Radeon used memory that provided nearly twice the bandwidth per memory chip, so its interface didn't need to be as wide to deliver comparable total bandwidth.
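The arithmetic behind that comparison is easy to check. Here's a rough Python sketch; the effective transfer rates below are approximate launch specs and are my assumptions for illustration, not quoted figures:

```python
# Total memory bandwidth = (bus width in bytes) x (effective transfer rate).
# Clock figures are approximate launch specs, assumed for illustration.

def bandwidth_gbps(bus_width_bits, effective_mtps):
    """Total memory bandwidth in GB/s from bus width and effective MT/s."""
    return bus_width_bits / 8 * effective_mtps / 1000

# Radeon HD 4870: 256-bit bus, GDDR5 at ~3600 MT/s effective
hd4870 = bandwidth_gbps(256, 3600)   # ~115 GB/s

# GeForce GTX 280: 512-bit bus, GDDR3 at ~2214 MT/s effective
gtx280 = bandwidth_gbps(512, 2214)   # ~142 GB/s
```

Despite half the bus width, the 4870's faster GDDR5 keeps it in the same bandwidth class as the GTX 280.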
 
Last edited:

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
no, clock rate does not trump all, not if the game can make proper use of more than 2 threads, which describes just about any modern game, whether it be a shooter like Bad Company 2 or an RTS like Civ V or SC2. Such games will run like a dog on a dual core at the same settings where a quad will run like butter.
Unfortunately, there still aren't many more games than the ones you mentioned that take practical advantage of quad-core. After years of quad-core availability, a highly clocked dual core is still sufficient for most games. And in some cases, overclocking makes up the difference.

Of course, this is changing and eventually quad-core will be a necessity; but for now, clock speed is king - with fast graphics, of course.

So at the same price, would a Phenom II X2 BE that can be easily overclocked outperform a Phenom II X4 running at the same base clock speed without overclocking?
An overclocked Ph II X2 would outperform a slower clocked Ph II X4 where the game does not take advantage of the quad's extra cores.
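A toy Amdahl-style model makes that tradeoff concrete. Everything in this sketch - the split between serial and parallel work, the clock speeds - is an invented assumption purely for illustration, not measured data:

```python
# Toy model: per-frame CPU time = serial work + parallel work split across
# cores, with both scaling inversely with clock speed. All numbers are
# invented assumptions for illustration, not benchmarks.

def frame_time_ms(serial_ms, parallel_ms, cores, clock_ghz, base_clock_ghz=3.0):
    """CPU frame time for work measured at base_clock_ghz on one core."""
    return (base_clock_ghz / clock_ghz) * (serial_ms + parallel_ms / cores)

# Mostly single-threaded game: the overclocked X2 wins (lower is better)
x2_oc = frame_time_ms(serial_ms=12, parallel_ms=4, cores=2, clock_ghz=3.8)
x4    = frame_time_ms(serial_ms=12, parallel_ms=4, cores=4, clock_ghz=3.2)

# Heavily threaded game: the slower-clocked X4 pulls ahead
x2_oc_mt = frame_time_ms(serial_ms=2, parallel_ms=20, cores=2, clock_ghz=3.8)
x4_mt    = frame_time_ms(serial_ms=2, parallel_ms=20, cores=4, clock_ghz=3.2)
```

The crossover depends entirely on how much of the game's per-frame work actually parallelizes, which is exactly the point of contention in this thread.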
 

Joseph F

Diamond Member
Jul 12, 2010
3,522
2
0
Also, you really can't take the memory interface at face value as a measuring stick for video card performance. During the Radeon HD 4800 vs. GeForce GTX 200 days we saw a Radeon's 256-bit bus compete with a GeForce's 512-bit bus, because the Radeon used memory that provided nearly twice the bandwidth per memory chip, so its interface didn't need to be as wide to deliver comparable total bandwidth.

Don't forget about the Radeon HD 2900 XT, which had a 512-bit bus and GDDR4 and was a total turd.
 

Arg Clin

Senior member
Oct 24, 2010
416
0
76
In what way?

My evaluations are focused narrowly on *practical* PC gaming performance with about 30 games and at resolutions that gamers use - 1680x1050 and above with high details and AA/AF.

Most tech sites do very general reviews with few games, low resolution and medium details to over-emphasize the i7's advantage.

My evaluations conclude that Phenom II - or Penryn - is fine for gaming; as long as you have a capable CPU, the graphics card makes the most difference in PC gaming.
Thank you for a great review with a practical approach. I fairly often see people make the mistake of getting an overpowered CPU and pairing it with a lesser GPU. Old habits die hard, I suppose.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Don't forget about the Radeon HD 2900XT that had a 512-bit bus, GDDR4 and was a total turd.

It wasn't like that; the Radeon HD 2900 had plenty of bandwidth it didn't need. 115 GB/s was a lot at the time, probably twice as much as it needed.
 

migo

Junior Member
Nov 23, 2007
21
0
0
So best bang for buck would be to get an easily overclockable dual core that's pretty cheap, and then upgrade when an overclockable quad core comes down in price?
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,660
2,517
136
So best bang for buck would be to get an easily overclockable dual core that's pretty cheap, and then upgrade when an overclockable quad core comes down in price?

Yeah, except that that is not actually possible.

From Intel, the only platform currently available that will have upgrades is LGA1155, and even that is not certain. On it, the only way to overclock is by raising the multiplier, so you need an unlocked chip to OC, and there are no unlocked dual cores.

AMD is currently really hurting. This might change when BD comes out in June (volume in July?), but there is no guarantee of that, especially for gamers.

Right now, the best bang for the buck by far is the i5 2500K -- for low-threaded tasks it is very nearly the best CPU available for purchase (2MB less L3 cache than the top chips, usually within a few percent or less), and it's a little more than $200.

If you cannot afford that (and perhaps could afford an upgrade later), the next best choice is probably an AM3+ motherboard and a very cheap Athlon II dual core (which you should OC a lot), and hoping that when BD comes out it's good enough to compete against SNB. But I honestly think this is the crappier choice.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,660
2,517
136
I've found with GPUs, when comparing cards from ATI and nVidia and between various generations, that the performance difference seems to come from the memory interface. 256-bit seems to be the minimum (I suppose it could just be that the 256-bit interface correlates with another feature that's more important, but at least it seems to be a good predictor of GPU performance).

That's because a wider memory interface considerably increases the per-card cost for the manufacturer. This means they'd rather not make it any wider than they have to, so it can be used as a baseline for evaluating the rest of the card's features. There are of course exceptions, notably the 2900XT, which frankly had much more memory bandwidth than it had any reason to have, probably because ATi rather overestimated the performance the GPU would end up having.

Note that the width of the memory interface is not what matters; the total bandwidth is. A 256-bit interface with 1000MHz GDDR3 (2000MHz effective data rate) is, for most intents and purposes, equivalent to a 128-bit interface with 1000MHz GDDR5 (4000MHz effective data rate). That's 64 GB/s of bandwidth either way.
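That equivalence is straightforward to verify; a quick sketch:

```python
# Bandwidth = (bus width in bytes) x (effective transfer rate).
def bandwidth_gbps(bus_width_bits, effective_mtps):
    return bus_width_bits / 8 * effective_mtps / 1000

wide_gddr3   = bandwidth_gbps(256, 2000)  # 256-bit @ 2000 MT/s effective
narrow_gddr5 = bandwidth_gbps(128, 4000)  # 128-bit @ 4000 MT/s effective
# Both come out to 64.0 GB/s
```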

In any case, using spec-sheet features to evaluate video card or CPU performance today is a fool's errand -- you cannot hope to be very good at it, and it's much harder than just opening up AnandTech Bench (or Tom's Hardware's Charts), which tell you all you need to know about the performance differentials of various parts.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
So best bang for buck would be to get an easily overclockable dual core that's pretty cheap, and then upgrade when an overclockable quad core comes down in price?

Best bang for the buck now? The 2500K. Second would be the Phenom II 945/955. Both when paired with ~$100 boards.

For gaming it would be foolish to buy a dual core brand new these days, considering how cheap the 2500K is. Although to be honest, even CPU-intensive, AMD-unfriendly games like SC2 felt nowhere near slow on my 3.6GHz dual-core X2 555 BE.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Unfortunately, there still aren't many more games than you mentioned that take practical advantage of Quad-core. After years of quad-core availability, a highly clocked dual core is still sufficient for most games. And in some cases, overclocking makes up the difference.

oh I'm sorry, I thought I was crystal clear about the stipulation of "as a base you really want at least a quad core unless you know you're only going to be playing older games"

I've seen far too many modern games where people complain about poor performance on their dual-core rigs relative to otherwise identical quad-core systems to perpetuate the myth that dual-core rigs are "good enough" without a disclaimer along the lines of "older games only".
 

smartpatrol

Senior member
Mar 8, 2006
870
0
0
Clock speed is by no means a good indicator of relative performance. You really need to check out reviews and benchmarks to get a good comparison. E.g., the Phenom II 980 at 3.7 GHz is the highest-clocked quad-core ever released, and yet it gets trounced by the i5 2500K at 3.3 GHz.

That said, if you're gaming at 1920x1080 or higher with any decent CPU, you are going to be GPU-limited anyway.
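One way to see why: to a first approximation, single-thread performance is IPC times clock. The relative IPC ratio below is a rough assumption for illustration only, not a measured number:

```python
# Performance ~ IPC x clock. The IPC ratio here is a rough assumption for
# illustration (Sandy Bridge doing noticeably more work per clock than K10),
# not a benchmark result.

def relative_perf(ipc, clock_ghz):
    return ipc * clock_ghz

phenom_ii_980 = relative_perf(ipc=1.0, clock_ghz=3.7)  # baseline IPC
i5_2500k      = relative_perf(ipc=1.4, clock_ghz=3.3)  # higher despite lower clock
```

With any plausible IPC advantage for Sandy Bridge, the lower-clocked chip still comes out ahead, which is why clock speed alone misleads across architectures.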
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
oh I'm sorry, I thought I was crystal clear about the stipulation of "as a base you really want at least a quad core unless you know you're only going to be playing older games"

I've seen far too many modern games where people complain about poor performance on their dual-core rigs relative to otherwise identical quad-core systems to perpetuate the myth that dual-core rigs are "good enough" without a disclaimer along the lines of "older games only".
If you define "older" as games older than this year, I will agree. Otherwise you are perpetuating your own myth. And please *list* the games that play practically better on a quad than on a dual-core CPU, everything else being equal.

You most often see "people complaining" about poor performance for reasons other than having a dual core instead of a quad.
:colbert:
Clock speed is by no means a good indicator of relative performance. You really need to check out reviews and benchmarks to get a good comparison. E.g., the Phenom II 980 at 3.7 GHz is the highest-clocked quad-core ever released, and yet it gets trounced by the i5 2500K at 3.3 GHz.

That said, if you're gaming at 1920x1080 or higher with any decent CPU, you are going to be GPU-limited anyway.
I just proved otherwise using 30 games with a Core i7-920 vs. Phenom II 980 BE, both running at the same 3.7GHz with a GTX 590, from 1920x1200 up to 5760x1080. What real gamer would run at a lower resolution just to show the i5/i7 is faster at 1280x720?
:p
 
Last edited:

Joseph F

Diamond Member
Jul 12, 2010
3,522
2
0
It wasn't like that; the Radeon HD 2900 had plenty of bandwidth it didn't need. 115 GB/s was a lot at the time, probably twice as much as it needed.

You missed the point of my post. What I was trying to say is that you cannot judge GPU performance solely by the width of the memory bus.
 
Last edited: