Does the CPU matter in gaming anymore?


2is

Diamond Member
Apr 8, 2012
Originally Posted by poohbear
OK, there's just no discussion with you & it's obviously a waste of time.

Go ahead and discuss the benefits of a CPU @ 800x600 resolution with all the quality settings turned off. I'm sure that's what most people reading this care about.

If you have a good GPU & you're using quality in-game settings (which I'm assuming most here are doing, since it's an enthusiast site), then the CPU is simply not important. That's all I'm saying, and that's a massive part of the discussion.

I guess the idea of native resolution with reduced or no AA and advanced lighting like AO disabled is too difficult a concept for you to understand. These are settings that MANY people use in BF3, for example, specifically so their GPU is NOT the major bottleneck. So you're right, there is no discussion. Come back when you can grasp this concept and maybe then you'll be at a level to have an actual discussion.
 

AtenRa

Lifer
Feb 2, 2009
Originally Posted by 2is
I guess the idea of native resolution with reduced or no AA and advanced lighting like AO disabled is too difficult a concept for you to understand. These are settings that MANY people use in BF3, for example, specifically so their GPU is NOT the major bottleneck. So you're right, there is no discussion. Come back when you can grasp this concept and maybe then you'll be at a level to have an actual discussion.

I'm sorry to say it, but you buy the HD7950 so you can enable high IQ settings and AA filters; otherwise you could buy the HD7870 and save money.

They used a high-end GPU (HD7950) for the review, so we expected them to bench at the highest IQ settings and AA filters.
 

KingFatty

Diamond Member
Dec 29, 2010
Nature of CPU bottlenecks: could it be that the CPU bottleneck is usually not too bad, so people are more OK with it? Or can a CPU bottleneck be so bad that it causes stuttering etc., where it becomes really irritating?

My point is that if a CPU bottleneck only causes you to go from 80 FPS to 65 FPS because you have a nice GPU, people won't be as vocal in complaining about it, whereas a GPU bottleneck might have more of an impact and, say, make you drop from 60 FPS to 5 FPS.
 

poohbear

Platinum Member
Mar 11, 2003
Originally Posted by 2is
I guess the idea of native resolution with reduced or no AA and advanced lighting like AO disabled is too difficult a concept for you to understand. These are settings that MANY people use in BF3, for example, specifically so their GPU is NOT the major bottleneck. So you're right, there is no discussion. Come back when you can grasp this concept and maybe then you'll be at a level to have an actual discussion.

Come back to what discussion? If you play on low settings then sure, the CPU is a factor, but once quality settings & AA are turned on it DOES NOT MATTER. I said this from my FIRST POST HERE, & that's the most important question right there! Even @ high settings with AA turned off, BF3 relies on the GPU; it's a very GPU-bound game.

Anyway, there's no point in repeating that over & over; it's what I said from the beginning, and by your own admission you don't deny it, you're just trolling and saying "it's a limited experience". Yeah, limited experience, who actually buys a graphics card for gaming, right?!? lol, get a clue, man.
 

poohbear

Platinum Member
Mar 11, 2003
To sum up:
You need both - dedicated GPU and CPU benchmarks. And the latter one should be done with as little GPU influence as possible.

I never understood why some reviewers do the latter test (a CPU test with minimal GPU interference). Is it representative of real-world performance? I play @ 1080p with settings maxed or near max, so I'd look at the numbers for tests that reflect that. If you have a low-resolution test where the GPU's influence is minimal, how many people actually play like that? It's neat, but not very practical.
 

boxleitnerb

Platinum Member
Nov 1, 2011
No one plays like that. The point of tests like these is to get repeatable results without having to go into the game, look for a good scene and do the benching by hand - possibly many times over, since CPU-intensive scenes may be dynamic, and unforeseen interference from game events may make it difficult, if not impossible, to get consistent results. It makes things easier for the reviewer.

I agree, though, that this is not the best way. The best way would be as described above, but that takes expertise and time - things that most reviewers don't have in the CPU benchmarking department.

Edit:
And what toyota said. It may not occur to most people, but reducing settings to get more fps (if one needs them) is a valid point.
 

toyota

Lifer
Apr 15, 2001
Originally Posted by poohbear
I never understood why some reviewers do the latter test (a CPU test with minimal GPU interference). Is it representative of real-world performance? I play @ 1080p with settings maxed or near max, so I'd look at the numbers for tests that reflect that. If you have a low-resolution test where the GPU's influence is minimal, how many people actually play like that? It's neat, but not very practical.

The best way to test CPUs is to run all high settings except for AA, at 1280 or 1680. If one CPU can only muster 50 fps and another can get 70 fps, that clearly tells you that if your GPU is weaker and you have to reduce some graphical settings, you can still get 60 fps. The CPU that can only get 50 fps means no chance of averaging 60 fps. Running both CPUs at 1920x1080 with lots of AA would GPU-limit you to where you were only getting 50 fps. That tells you NOTHING, as many people want to know if they can get 60 fps or maybe an even higher framerate. Testing at 1920x1080 with full AA would not let them know whether getting over 50 fps would be possible with a faster GPU or reduced settings.
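To make toyota's reasoning concrete, here is a minimal sketch (Python, with made-up numbers; not data from any review) of a toy model in which the delivered frame rate is capped by whichever of the CPU or GPU is slower:

Code:
# Toy bottleneck model: the frame rate you see is capped by the slower side.
# All figures below are invented for illustration only.

def delivered_fps(cpu_fps, gpu_fps):
    """Delivered fps is limited by the first bottleneck it hits."""
    return min(cpu_fps, gpu_fps)

cpu_a, cpu_b = 50, 70       # frames per second each CPU can prepare

# Low-resolution test (e.g. 1280x1024, no AA): the GPU has plenty of headroom,
# so the CPU ceilings of 50 and 70 fps actually show up in the results.
gpu_low_res = 200
print(delivered_fps(cpu_a, gpu_low_res), delivered_fps(cpu_b, gpu_low_res))   # 50 70

# 1920x1080 with heavy AA: the GPU caps both systems at 50 fps, hiding the
# difference that matters once settings are reduced or the GPU is upgraded.
gpu_1080p_aa = 50
print(delivered_fps(cpu_a, gpu_1080p_aa), delivered_fps(cpu_b, gpu_1080p_aa)) # 50 50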
 

AtenRa

Lifer
Feb 2, 2009
Originally Posted by toyota
The best way to test CPUs is to run all high settings except for AA, at 1280 or 1680. If one CPU can only muster 50 fps and another can get 70 fps, that clearly tells you that if your GPU is weaker and you have to reduce some graphical settings, you can still get 60 fps. The CPU that can only get 50 fps means no chance of averaging 60 fps. Running both CPUs at 1920x1080 with lots of AA would GPU-limit you to where you were only getting 50 fps. That tells you NOTHING, as many people want to know if they can get 60 fps or maybe an even higher framerate. Testing at 1920x1080 with full AA would not let them know whether getting over 50 fps would be possible with a faster GPU or reduced settings.

Incorrect. If you are GPU-limited at 50 fps at 1080p, then lowering your IQ settings will get you 60 fps at 1080p, no matter if your CPU can or cannot produce 60 fps at 1280x1024.
 

Yuriman

Diamond Member
Jun 25, 2004
Originally Posted by AtenRa
Incorrect. If you are GPU-limited at 50 fps at 1080p, then lowering your IQ settings will get you 60 fps at 1080p, no matter if your CPU can or cannot produce 60 fps at 1280x1024.

I'm not certain about this. I know for a fact it isn't the case in many RTSes. On my Q6600, I might drop to 15 fps in StarCraft II on certain maps regardless of my graphical settings; I tried changing them in-game to improve performance and didn't gain a single frame per second, even though I'm using a lowly HD4870. With my overclocked Ivy I am genuinely GPU-limited in most situations in SC2 now. In places where I had 15 fps on a Q6600 + 4870, I might get 45 fps on a 3570K + HD4870.
 

AtenRa

Lifer
Feb 2, 2009
Originally Posted by Yuriman
I'm not certain about this. I know for a fact it isn't the case in many RTSes. On my Q6600, I might drop to 15 fps in StarCraft II on certain maps regardless of my graphical settings; I tried changing them in-game to improve performance and didn't gain a single frame per second, even though I'm using a lowly HD4870. With my overclocked Ivy I am genuinely GPU-limited in most situations in SC2 now. In places where I had 15 fps on a Q6600 + 4870, I might get 45 fps on a 3570K + HD4870.

There are games where Intel CPUs are clearly better, like SC2 and Skyrim, but those are not the only games people play. There are other games where you will be GPU-limited, and games where you could be CPU-limited. As I have said so many times before, it depends on the game and settings.
 

Yuriman

Diamond Member
Jun 25, 2004
Guild Wars 2 is a good example of another game (that I play) that can be CPU-limited at times, even with my older HD4870. I expect there are places where a stock Ivy can only get 40-50 fps regardless of GPU or settings. Wandering around in the wilderness, my wife's Q6600 is just as smooth as my 3570K, but in town or in WvW PvP they're quite different, even with the same (old) video card.

Civ 5 and Sins of a Solar Empire are also games I play that are quite CPU-limited.

The only other game I find myself playing lately is Minecraft, which, strangely, is more GPU-limited, even with a 4870.
 

toyota

Lifer
Apr 15, 2001
Originally Posted by AtenRa
Incorrect. If you are GPU-limited at 50 fps at 1080p, then lowering your IQ settings will get you 60 fps at 1080p, no matter if your CPU can or cannot produce 60 fps at 1280x1024.

What? Please read everything I said more carefully. If your CPU cannot deliver any more than 50 fps, then that is all you get. Of course, if the GPU is the only limitation causing you to get 50 fps at 1080p, then lowering the resolution or settings will give you more fps. That was the whole point of what I was saying.
 

KingFatty

Diamond Member
Dec 29, 2010
Originally Posted by AtenRa
Incorrect. If you are GPU-limited at 50 fps at 1080p, then lowering your IQ settings will get you 60 fps at 1080p, no matter if your CPU can or cannot produce 60 fps at 1280x1024.

Originally Posted by AtenRa
As I have said so many times before, it depends on the game and settings.

The first quote above does not seem to have the qualifier that it depends on the game and settings. Instead, that quote seems to be an absolute statement about all situations.

It's almost as if the two statements contradict each other. Maybe what you mean to say makes sense, but perhaps it came out unclearly, or you expected the reader to make assumptions that weren't specifically stated?
 

2is

Diamond Member
Apr 8, 2012
Originally Posted by AtenRa
I'm sorry to say it, but you buy the HD7950 so you can enable high IQ settings and AA filters; otherwise you could buy the HD7870 and save money.

They used a high-end GPU (HD7950) for the review, so we expected them to bench at the highest IQ settings and AA filters.

I'm not sure what you're trying to say. All I'm saying is that not everyone is going to play at settings that make their GPU such a severe bottleneck that the CPU doesn't make a difference. Poohbear seems to think there are only two options for PC games: fully maxed and absolute lowest. You, being the AMD sponsor you are, go out of your way to find GPU-bottlenecked benchmarks and pretend AMD is just as good. It's not.
 

SlowSpyder

Lifer
Jan 12, 2005
I read these types of articles and threads from time to time and think to myself, "Geez, my PhII sucks, I need to upgrade..." But then I launch whatever game I'm into at the time and things run just fine. :/ I don't discount the importance of the CPU, nor am I under the delusion that my PhII is as fast as an Intel chip, yet it never seems to hamper my gaming. Maybe when I finally do upgrade I'll notice the difference, but things seem to run just fine for me as-is.
 

AtenRa

Lifer
Feb 2, 2009
Originally Posted by KingFatty
The first quote above does not seem to have the qualifier that it depends on the game and settings. Instead, that quote seems to be an absolute statement about all situations.

It's almost as if the two statements contradict each other. Maybe what you mean to say makes sense, but perhaps it came out unclearly, or you expected the reader to make assumptions that weren't specifically stated?

Originally Posted by AtenRa
Incorrect. If you are GPU-limited at 50 fps at 1080p, then lowering your IQ settings will get you 60 fps at 1080p, no matter if your CPU can or cannot produce 60 fps at 1280x1024.

This is only if you are GPU limited.

Originally Posted by AtenRa
As I have said so many times before, it depends on the game and settings.

This is in general.
 

AtenRa

Lifer
Feb 2, 2009
Originally Posted by 2is
I'm not sure what you're trying to say. All I'm saying is that not everyone is going to play at settings that make their GPU such a severe bottleneck that the CPU doesn't make a difference. Poohbear seems to think there are only two options for PC games: fully maxed and absolute lowest. You, being the AMD sponsor you are, go out of your way to find GPU-bottlenecked benchmarks and pretend AMD is just as good. It's not.

How many times have I said that there are games where Intel is better??? But there are other games where you are GPU-limited.

You people never acknowledge the fact that there are games that are GPU-limited, where having a faster CPU will grant you nothing more.
 

2is

Diamond Member
Apr 8, 2012
I actually have acknowledged that if you read my previous posts in this thread alone.
 

Phynaz

Lifer
Mar 13, 2006
Originally Posted by AtenRa
How many times have I said that there are games where Intel is better??? But there are other games where you are GPU-limited.

You people never acknowledge the fact that there are games that are GPU-limited, where having a faster CPU will grant you nothing more.

A faster CPU always gets you more. It may not be much, but it will be more.

Especially in reducing some of those long frame times that cause stutter.
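On the frame-time point: average fps can look identical while occasional long frames still cause visible stutter. Below is a minimal sketch of how a captured frame-time log might be summarized; the numbers are invented, and logging frame times at all (with whatever capture tool you prefer) is assumed rather than taken from this thread:

Code:
# Hypothetical frame-time log in milliseconds (all numbers invented).
frame_times_ms = [14, 15, 16, 15, 48, 15, 16, 14, 52, 15, 16, 15]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))

# A high percentile of frame time is a rough proxy for the hitches that cause
# stutter; a faster CPU can trim these even when the average fps barely moves.
p99_ms = sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]

print(f"average fps: {avg_fps:.1f}")               # ~47.8 - looks fine on paper
print(f"longest frame: {max(frame_times_ms)} ms")  # 52 ms spike = a visible hitch
print(f"~99th percentile frame time: {p99_ms} ms") # 48 ms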
 

boxleitnerb

Platinum Member
Nov 1, 2011
Originally Posted by AtenRa
Incorrect. If you are GPU-limited at 50 fps at 1080p, then lowering your IQ settings will get you 60 fps at 1080p, no matter if your CPU can or cannot produce 60 fps at 1280x1024.

What? No, that is absolutely wrong! You think increasing resolution makes your CPU faster? Because that is exactly what you are saying in the second half of that sentence.
If your CPU cannot produce 60 fps at 1280x1024, it will certainly not produce 60 fps at 1080p. If anything, the contrary: as 1080p has a wider field of view, the CPU load is slightly higher, so you would get even fewer fps than at a 5:4 resolution. What you say could only be true if the IQ settings affected the CPU. That is often not the case; IQ settings mainly affect GPU performance.

Example:
CPU A can produce 53 fps, CPU B 68 fps if the GPU bottleneck is removed (1280x1024). Now you set 1080p and both systems give 45 fps (GPU limit). If you only see this result, it does not tell you what happens when you turn off AA or lower the resolution a bit. If the GPU limit were removed due to no AA/AF or a faster GPU, it would look like this at 1080p:

CPU A: 48 fps (ca. 10% lower than at 5:4)
CPU B: 62 fps (ca. 10% lower than at 5:4)

With system A you gained 3 fps and still cannot maintain 60.
With system B you gained 17 fps and can barely maintain 60.

I hope it is clear now.
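boxleitnerb's numbers also fall out of the toy min() model sketched earlier, once you add his stated assumption that the wider 16:9 field of view costs the CPU roughly 10% (again, purely illustrative):

Code:
# Reproducing boxleitnerb's example with the toy bottleneck model.
cpu_a_5x4, cpu_b_5x4 = 53, 68       # CPU ceilings measured at 1280x1024

# Assumption from his post: the wider 1080p field of view adds ~10% CPU load.
cpu_a_1080 = cpu_a_5x4 * 0.9        # ~48 fps
cpu_b_1080 = cpu_b_5x4 * 0.9        # ~61 fps

gpu_1080_aa = 45      # with AA/AF both systems are GPU-limited to 45 fps
gpu_1080_free = 999   # GPU limit removed (no AA/AF, or a faster card)

for name, cpu in (("CPU A", cpu_a_1080), ("CPU B", cpu_b_1080)):
    print(f"{name}: {min(cpu, gpu_1080_aa):.0f} fps with AA -> "
          f"{min(cpu, gpu_1080_free):.0f} fps without")
# CPU A: 45 -> 48 fps, still short of 60; CPU B: 45 -> 61 fps, roughly at 60.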
 

2is

Diamond Member
Apr 8, 2012
Originally Posted by AtenRa
This is only if you are GPU limited.

No, you're wrong. If you're GPU-limited to 50 fps and you lower IQ, your FPS will increase until it hits the FIRST bottleneck. If, after the reduced settings, that FIRST bottleneck is the CPU and the CPU cannot do 60 fps, you will NOT get 60 fps even if the GPU is capable of more.

No wonder you think CPUs don't matter. lol
 

reb0rn

Senior member
Dec 31, 2009
Anyone who has tried an Intel E3300 @ 4.4GHz with a GTX 460 will know what a CPU bottleneck is... and it's a bit hard to notice if you pump the game settings up, but for fluid gameplay you need a good CPU, and AMD is not that for "many" new games worth playing (I don't count crappy game clones).
 

dastral

Member
May 22, 2012
You have more or less 3 scenarios (for the same budget):
GPU-bound: the CPU doesn't matter since it isn't the limiting factor.
Great multithreading: AMD is usually better since they offer more cores.
Regular game: Intel has more performance per core.

Even if you do consider Aten's benches "unbiased", they do prove this:
BF3 64-man = AMD, Batman/AVP = neither, Civilization = Intel.

I mean, look here; this becomes even more complicated once you include PRICE:
http://www.legionhardware.com/articles_pages/starcraft_ii_wings_of_liberty_beta_performance,6.html
http://www.bit-tech.net/hardware/cpus/2010/08/18/how-many-cpu-cores-does-starcraft-2-use/2
http://www.techspot.com/review/305-starcraft2-performance/page13.html
http://www.anandtech.com/show/5626/ivy-bridge-preview-core-i7-3770k/9

Obviously you want 2-3 cores "AS FAST AS POSSIBLE" for StarCraft 2.
Under $130: FX-4170 (4.3GHz) or i3-2120 (3.3GHz)?
The i3 architecture has better IPC, but the FX is clocked higher and can OC... hard choice?
Under $230: FX-8150 (3.6GHz), i5-3570K (3.4GHz) or i5-2500K (3.3GHz)?
The FX is clocked higher, yet Intel's IPC will just humiliate the FX's IPC at similar clocks, but it's more expensive.

So the big question: what CPU for StarCraft 2?
If you are on a budget, as crazy as it might seem, the FX-4170 or FX-6200 might be the best choice.
If you are not on a budget? The i5 is much, much better.

Does the CPU matter in gaming anymore?
I'm tempted to say "NO": any CPU around $140 will be enough in "most cases" for everyone.
64-man BF3 or 1080p "everything maxed" Metro/Crysis is not "most cases".
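One way to make the budget question concrete is simple fps-per-dollar arithmetic. The chips, prices and frame rates below are placeholders, not figures from the linked reviews:

Code:
# Toy fps-per-dollar comparison; every figure is a placeholder, not review data.
candidates = {
    "cheaper chip": {"price_usd": 130, "cpu_limited_fps": 45},
    "faster chip":  {"price_usd": 220, "cpu_limited_fps": 70},
}

target_fps = 60  # the floor that matters in a CPU-bound game like SC2

for name, c in candidates.items():
    value = c["cpu_limited_fps"] / c["price_usd"]
    verdict = "clears" if c["cpu_limited_fps"] >= target_fps else "misses"
    print(f"{name}: {c['cpu_limited_fps']} fps, {value:.2f} fps per dollar, "
          f"{verdict} the {target_fps} fps floor")
# Raw fps-per-dollar favors the cheap part, but only the faster chip clears 60 fps.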
 

reb0rn

Senior member
Dec 31, 2009
@dastral
Ohhhh, you are so wrong; you're looking at it from the wrong angle. BD is such a huge fail in gaming....
Games need 2-4 cores, and some can use more, but they benefit more from IPC than from castrated BD modules.
 

Dark_Archonis

Member
Sep 19, 2010
forum derped again...

The forums here have been full of derp for a while now, as have all other forums on the internet. Trolling and derping (and nerd rage and shilling) are at an all-time high. It's an epidemic that mods and admins are not immune to, either.