Does the CPU matter in gaming anymore?


JustMe21

Senior member
Sep 8, 2011
324
49
91
I've seen my i7-860 beat my i5-2500 in the Fritz Chess Benchmark, despite the 2500's 500 MHz clock advantage. The i7-860 does have Hyper-Threading, though.

Intel Core i7-860 @ 2.8 GHz with 8 GB DDR3
Relative Speed: 22.6
knps: 10634

Intel Core i5-2500 @ 3.3 GHz with 4 GB DDR3
Relative Speed: 21.17
knps: 10161
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
You simply cannot compare two different benchmarks with one another. Every benchmark is different, especially a game versus a synthetic test.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
I would argue that since 3D graphics cards have become prevalent, this is about as much as a CPU has ever mattered for gaming.

You have cases where CPUs are critically important, but for gaming, the actually noticeable difference between a "good" CPU and a "great!" CPU is a little more latency in less than 5% of total frames.

Yes, the extreme budget CPUs are not good, but it has always been the case that the actually noticeable difference between a moderate CPU and a high-end CPU is pretty negligible for gaming, since the graphics card is usually the primary bottleneck.

I think we are seeing a bit of a resurgence of focus on CPU importance, for three reasons:

1) Game graphics stagnation, 1080p resolution stagnation, GPU advancements, and post-process AA have combined to make an "entry level gaming" GPU capable of displaying good quality images at 60+ FPS, revealing more CPU bottlenecks than an entry-level GPU could in the past. Once you move to enthusiast-quality cards, it's becoming nearly standard that a $200 card runs 4xAA at the popular resolution. That never happened before. The CPU bottlenecks are being revealed because the GPUs, relative to what's necessary for 30 FPS, are just way better... and thus more often expose CPU issues in reviews.

These bottlenecks have probably always been there, and they probably aren't large enough for the average enthusiast gamer to even notice, but once one shows up in a review, even a review of two cards in CrossFire or SLI that are better than what this person owns, the psychology has that person knowing there is a constraint at some point. Then the overkill "more is always better" nature of gaming takes over and people buy more than they need.

2) Poorly ported console games tend to require significant CPU power.
Example: Skyrim before whatever patch it was. Then that patch came out and the CPU issues completely disappeared... because they weren't CPU issues, they were software issues.
Also GTA 4, the worst-ported (most CPU-dependent) AAA title I know of.

3) 120 Hz monitors have gained some traction, and this is a case where the CPU bottlenecks in #1 potentially become real issues. This is one case where most CPUs are actually inadequate for current software: few games can run at a full 120 Hz, regardless of the CPU/GPU combo, largely because of CPU bottlenecks.

I think it will be a LONG time before 120 Hz becomes reasonable on mainstream CPUs, simply because the work a game gives the CPU can't really be scaled. On the graphics card you can always have features that can be enabled and disabled that affect only visual quality, not the core gameplay: AA, resolution, depth effects, texture quality, etc. What the CPU does, however, is AI, motion/direction/physics, and object tracking: things that affect the core of the game. On the GPU side you can scale visual quality, so you get the same core gaming experience on hardware with a tenth or less of the capability of the top tier, but on the CPU side there are lots of things you can't afford to scale at all without dramatically changing the core gaming experience. You have to develop a game for the average CPU or shut out a huge portion of your potential market.
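To put rough numbers on that, here is a toy Python sketch of the frame budget problem. All the millisecond costs are made up for illustration, not taken from any real engine:

```python
# Toy model: render cost scales with quality settings, while simulation
# cost (AI, physics, object tracking) is fixed by the game design.
# All millisecond figures below are hypothetical.

SIM_COST_MS = 9.0  # assumed per-frame CPU simulation work
RENDER_COST_MS = {"ultra": 20.0, "high": 12.0, "medium": 7.0, "low": 3.5}

for hz in (60, 120):
    budget = 1000.0 / hz  # per-frame time budget in ms
    # Assume CPU and GPU work overlap, so the slower side sets the pace.
    reachable = [q for q, cost in RENDER_COST_MS.items()
                 if max(SIM_COST_MS, cost) <= budget]
    print(f"{hz} Hz (budget {budget:.2f} ms): presets that fit = {reachable}")

# 60 Hz gives a 16.67 ms budget, so turning settings down always helps.
# 120 Hz gives 8.33 ms; the fixed 9 ms of simulation blows the budget
# at every preset, and no graphics slider can buy it back.
```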
 
Last edited:

Haserath

Senior member
Sep 12, 2010
793
1
81

BF3 is just weird. That a 920 @ 4 GHz is so much faster than a 2500K, and that overclocking the 2500K gives that large a benefit, doesn't make sense at all.

Hyperthreading is beneficial where it is used.

Something seems wrong with the stock results for the 2500k though.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
It always amuses me that AtenRa loves GPU bound benchmarks for BD, but for some reason, he is reporting framerates higher than a monitor can display. Obviously, if one has a 60Hz monitor, nothing over 60fps matters (or 120 for a 120Hz monitor), and thus should not be reported. Presenting an FPS value higher than you will ever see on your monitor is not an accurate picture if we follow his line of thinking.

Perhaps, just perhaps, the goal is to bolster a subpar product, and all this talk of "real world" is just a means to that end, because, real world, any time an FPS is higher than your monitor can display (just as any time your CPU is capable of producing a higher framerate than your GPU), must be meaningless if we buy in to his argument.
 

peonyu

Platinum Member
Mar 12, 2003
2,038
23
81
I don't think it's news to anyone that the GPU provides the biggest FPS differences in gaming... I mean, duh.

However, CPUs are the next priority in line. When I upgraded from my older Intel C2D E6400 @ 3 GHz to my current i5-2500K, I noticed a decent FPS/performance bump for sure. It does make a difference.

It all depends on the game... Play an RTS game and field it with 1,000 units and it is your CPU that will cry, not the GPU. Of course most benchmarks feature FPS shooters, so it's no surprise that GPUs dominate there.
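As a toy illustration (not any real engine's code), naive unit-versus-unit work grows quadratically, which is why big battles hammer the CPU:

```python
# Naive targeting/collision checks compare every pair of units once
# per simulation tick -- classic O(n^2) growth.
def pair_checks(units: int) -> int:
    return units * (units - 1) // 2

for n in (100, 500, 1000):
    print(f"{n:>5} units -> {pair_checks(n):>8,} pair checks per tick")

# 100 units  ->   4,950 checks
# 1000 units -> 499,500 checks: 10x the units, ~100x the CPU work,
# while the GPU's per-unit draw cost only grows ~10x.
```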
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
It always amuses me that AtenRa loves GPU bound benchmarks for BD, but for some reason, he is reporting framerates higher than a monitor can display. Obviously, if one has a 60Hz monitor, nothing over 60fps matters (or 120 for a 120Hz monitor), and thus should not be reported. Presenting an FPS value higher than you will ever see on your monitor is not an accurate picture if we follow his line of thinking.

Perhaps, just perhaps, the goal is to bolster a subpar product, and all this talk of "real world" is just a means to that end, because, real world, any time an FPS is higher than your monitor can display (just as any time your CPU is capable of producing a higher framerate than your GPU), must be meaningless if we buy in to his argument.

Hm, not really. Higher fps don't give better gameplay in terms of graphical smoothness, but they can give more accurate gameplay in terms of input and the timeliness of the graphical representation, because the difference between the input time, the current game (engine) time, and the timestamp of the rendered frame is smaller. Unreal Engine 3 games are a good example: the input feels much more direct at 120 fps than at 60 fps, no matter what refresh rate the display is running at.
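A back-of-the-envelope calculation makes the point. This is a deliberately simplified model (one frame in flight, no render-ahead queue, scan-out ignored):

```python
REFRESH_HZ = 60  # what the panel can actually show

for fps in (60, 120):
    frame_ms = 1000.0 / fps
    # Worst case: input lands just after a frame starts, so it is
    # sampled one frame later and takes another frame to render.
    worst_case_ms = 2 * frame_ms
    print(f"{fps} fps on a {REFRESH_HZ} Hz panel: displayed frame is "
          f"built from input up to ~{worst_case_ms:.0f} ms old")

# 60 fps  -> input up to ~33 ms old
# 120 fps -> input up to ~17 ms old, even though the panel still
# shows only 60 of those frames per second.
```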
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
It always amuses me that AtenRa loves GPU bound benchmarks for BD, but for some reason, he is reporting framerates higher than a monitor can display. Obviously, if one has a 60Hz monitor, nothing over 60fps matters (or 120 for a 120Hz monitor), and thus should not be reported. Presenting an FPS value higher than you will ever see on your monitor is not an accurate picture if we follow his line of thinking.

Perhaps, just perhaps, the goal is to bolster a subpar product, and all this talk of "real world" is just a means to that end, because, real world, any time an FPS is higher than your monitor can display (just as any time your CPU is capable of producing a higher framerate than your GPU), must be meaningless if we buy in to his argument.

Well... it matters, for the reasons posted above, and because when we say 120 fps, is that a minimum? An average? A maximum? There can be a huge delta between any of those. If you're averaging 60 fps, your minimums will be quite a bit lower.
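For example (a minimal sketch; the frame-time lists are hypothetical stand-ins for a Fraps frametimes dump):

```python
# Summarize per-frame render times (in ms) the way frame-latency
# reviews do: average fps, minimum fps, and 99th-percentile frame time.
def summarize(label, frame_times_ms):
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    min_fps = 1000.0 / max(frame_times_ms)
    p99 = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99) - 1]
    print(f"{label}: avg {avg_fps:5.1f} fps, min {min_fps:5.1f} fps, "
          f"99th-percentile frame time {p99:5.1f} ms")

# Two runs with the same ~60 fps average can feel completely different:
summarize("smooth", [16.7] * 100)               # steady 60 fps
summarize("spiky",  [12.0] * 90 + [60.0] * 10)  # hitches down to ~17 fps
```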

That said, yes, he does like to spin things to make AMD look not as bad as it really is.
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
Except that it isn't. Not everyone is going to completely saturate their GPU to make it the end all be all limiting factor like you are. I noticed a huge jump from my Q6600 to 3770k in BF3 MP.

Anyone making a blanket statement like this shouldn't be listened to.

Then there are the people with multi-GPUs. But I guess since that's not you, it shouldn't be considered right?

OK, you have some serious reading comprehension issues. Did I say anything about multi-GPU configurations? I said:

Once you turn on AA and maximize all the quality settings at 1080p+ resolution, ANY quad core will suffice, as the workload is now entirely on the GPU.

Why on EARTH would you need a multi-GPU setup in the situation I'm referring to above? Troll much?

And I said that's my experience with MY 60 Hz monitor and my monster Gigabyte GTX 670 OC, which totally makes my CPU insignificant at the above settings. I, DID, NOT, NOTICE, ANY, DIFFERENCE. It was the WORST upgrade I ever did, and it came from listening to tits like you thinking I'd notice a difference. Benchmarks don't mean crap to me; if you sat 2 people at those 2 different computers, NOBODY would notice a difference. Period.

Okay. I am happy that you have proven every single benchmark and review on the planet incorrect. You have proven that AMD's processors are awesome and clearly superior to Intel's. Anyone who disagrees is an Intel shareholder, shill, employee, and fanboy. The Intel CPU cartel will no longer be able to manipulate the public. Instead, the sleeping dragon will awaken, and everyone will rush to eBay to buy these awesome quad-core processors since they are so cheap. http://www.newegg.com/Product/Produc...82E16819103244

Oh, you like sarcasm? Me too!!! OK, you are obviously a very intelligent guy. And yes, I really DO think AMD's CPUs are the best, and I fornicate with them every night in front of my Intel 2500K that's sitting right beside me, because I'm the epitome of fanboys! I do the foreplay with my Nvidia GTX 670 and fantasize it's an AMD card, because that's what really gets me off. Obviously nothing gets by you!

Anyways, the game I was referring to was BF3 MP. No difference at all; an X6 1100T @ 3.8 GHz performed the same as, and in some instances better than, my 2500K IN MULTIPLAYER, so we don't get the same muppets saying "that's only in SP, MP is different". I already said in my post that RTS games are different, so if you missed that, please learn to read. But in FPS games, once the in-game quality settings are maxed at any 1080p+ resolution, the CPU doesn't make any noticeable difference at all, and your precious benchmarks have already proved that.
 
Last edited:

2is

Diamond Member
Apr 8, 2012
4,281
131
106
No, you didn't; I did, because you made a blanket statement. Like I said, not everyone is going to saturate their GPU enough to make it that big of a bottleneck, be it single or multi GPU.

Furthermore, you originally said not to listen to anyone else, and now you're trying to play it off as "my experience." Well, my experience tells me you don't know what you're talking about. Go ahead and nerd rage all you want.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Well... it matters, for the reasons posted above, and because when we say 120 fps, is that a minimum? An average? A maximum? There can be a huge delta between any of those. If you're averaging 60 fps, your minimums will be quite a bit lower.

That said, yes, he does like to spin things to make AMD look not as bad as it really is.


That's kind of my point. Creating an arbitrary "it doesn't matter past this point and everything above it is really the same" line is pretty shady. ;)
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
You've still not updated your Shogun 2 results. In a real battle, when watching the action and zooming in, you will never, ever see such high fps. The replay doesn't include camera movements; you have to position the camera yourself where the action will be, as you would when actually playing the game. Also, I found out recently that while fps may be high, animations will be choppy if the CPU has too much work. Do a quick 4v4 battle and watch the units walk/run at the beginning. I get 40-50ish fps, but the soldiers skip some animation steps because it is simply too much with this many units. Animation and fps seem to be somehow decoupled. You don't learn that from running a graphics benchmark.

The in-game benchmark does include different camera movements. Where you see an fps drop in the graph below, it is because of a camera close-up on the infantry/cavalry units. The in-game benchmark is a useful tool for benching your system's performance in Shogun 2, and it is very repeatable.

[Shogun 2 in-game benchmark graph]


As for Civ 5, every review that uses actual savegames gets a different result than you did; HardOCP and ComputerBase, look them up. Also, I would think it is self-evident to test turn times in that game as well.

AnandTech uses the same benchmark to measure Civ 5; I haven't seen anyone complaining about that so far. AnandTech's results are the same as mine in that test.

BF3 is just weird. That a 920 @ 4 GHz is so much faster than a 2500K, and that overclocking the 2500K gives that large a benefit, doesn't make sense at all. I told you, repeatability is important, especially for analyzing the outcome when something goes wrong, like here. It calls into question all the BF3 results. Analyzing and questioning results is also part of a test. Don't just run your Fraps and post the numbers without thinking about whether they make sense.

There isn't anything wrong with the BF3 run in the review. I played the game with the Core i5 2500K for a couple of hours and the results were the same as in the benchmark runs. I actually ran 3-4 more benchmark runs because of the lower performance I saw with the 2500K; each time the results were the same. In smaller, more closed maps like Metro the performance was higher; it was only on the Caspian Border map that the 2500K (at stock) was slower than the other CPUs in the review.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
That's kind of my point. Creating an arbitrary "it doesn't matter past this point and everything above it is really the same" line is pretty shady. ;)


Well in that case I'm in full agreement. ;)
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
No, it doesn't. Intel is better at gaming. Unless you can prove Techreport wrong by doing your own frame latency testing?

They have only benched 4 games; I, on the other hand, have benched 9 games, all of them in DX11 mode.

You can't just say Intel is better in gaming in general; not all games are the same.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
It always amuses me that AtenRa loves GPU bound benchmarks for BD, but for some reason, he is reporting framerates higher than a monitor can display. Obviously, if one has a 60Hz monitor, nothing over 60fps matters (or 120 for a 120Hz monitor), and thus should not be reported. Presenting an FPS value higher than you will ever see on your monitor is not an accurate picture if we follow his line of thinking.

Perhaps, just perhaps, the goal is to bolster a subpar product, and all this talk of "real world" is just a means to that end, because, real world, any time an FPS is higher than your monitor can display (just as any time your CPU is capable of producing a higher framerate than your GPU), must be meaningless if we buy in to his argument.

If I only wanted to show the best side of the FX CPUs, I wouldn't have included the following benchmark charts in the review.

[benchmark chart]

[benchmark chart]


You have been grasping at straws, trying to find something wrong with my review ever since it was made public.
The review was made to investigate what performance you should expect in DX11 games at high IQ settings across different CPUs/platforms (when paired with a high-end GPU), and whether you gain any significant performance by overclocking the CPU.
 

Hypertag

Member
Oct 12, 2011
148
0
0
OK, you have some serious reading comprehension issues. Did I say anything about multi-GPU configurations? I said:



Why on EARTH would you need a multi-GPU setup in the situation I'm referring to above? Troll much?

And I said that's my experience with MY 60 Hz monitor and my monster Gigabyte GTX 670 OC, which totally makes my CPU insignificant at the above settings. I, DID, NOT, NOTICE, ANY, DIFFERENCE. It was the WORST upgrade I ever did, and it came from listening to tits like you thinking I'd notice a difference. Benchmarks don't mean crap to me; if you sat 2 people at those 2 different computers, NOBODY would notice a difference. Period.

Oh, you like sarcasm? Me too!!! OK, you are obviously a very intelligent guy. And yes, I really DO think AMD's CPUs are the best, and I fornicate with them every night in front of my Intel 2500K that's sitting right beside me, because I'm the epitome of fanboys! I do the foreplay with my Nvidia GTX 670 and fantasize it's an AMD card, because that's what really gets me off. Obviously nothing gets by you!

Anyways, the game I was referring to was BF3 MP. No difference at all; an X6 1100T @ 3.8 GHz performed the same as, and in some instances better than, my 2500K IN MULTIPLAYER, so we don't get the same muppets saying "that's only in SP, MP is different". I already said in my post that RTS games are different, so if you missed that, please learn to read. But in FPS games, once the in-game quality settings are maxed at any 1080p+ resolution, the CPU doesn't make any noticeable difference at all, and your precious benchmarks have already proved that.

Why did you keep any of the parts, then?

Newegg lets you return all parts with just a 15% restocking fee. Clearly the AMD system you have is vastly superior in this one game (in the sense that it gives fewer FPS than the 2500K, but both give over 60 FPS). If you are so angered, irritated, and infuriated by the Intel fanboys smearing your AMD system, then why keep it?

You had a 30-day window to return it with a restocking fee, and you can still put it right on eBay/Craigslist. You can even use the money to buy an FX-8150 and an H100 and enjoy the goodness of a 4.5 GHz Bulldozer (which is worse than the Phenom II X6, but whatever).

Edit: Okay, the CPU is the only part of a modern system that Newegg doesn't allow an 85% return on. The point still remains: it doesn't take much effort at all to sell a 2500K on eBay.
 
Last edited:

Mallibu

Senior member
Jun 20, 2011
243
0
0
They have only benched 4 games; I, on the other hand, have benched 9 games, all of them in DX11 mode.

You can't just say Intel is better in gaming in general; not all games are the same.

Intel is better in 99% of games, and equal with AMD in BF3 MP.
And sorry, but their methods of testing and data evaluation > your AMD-supporting blog. :whistle:
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
They have only benched 4 games; I, on the other hand, have benched 9 games, all of them in DX11 mode.

You can't just say Intel is better in gaming in general; not all games are the same.

Where are the frame latency measurements in your "benchmarks"? Did you ever redo them when members here pointed out the problems with your benchmarking methodology?

Yes, I can say Intel is better in gaming in general.

You also know that Techreport has benched more than four games in the past because I provided you the link the last time you were posting your blog as fact.

Edit:
Here, I'll give to you again. http://techreport.com/articles.x/22835/6


Let me put it another way that you may understand... If the CPU doesn't matter, how come professional websites (not blogs) always use Intel CPUs to test video cards? If the CPU really didn't matter, then testing with an AMD CPU would make just as much sense.
 
Last edited:

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
Why did you keep any of the parts, then?

Newegg lets you return all parts with just a 15% restocking fee. Clearly the AMD system you have is vastly superior in this one game (in the sense that it gives fewer FPS than the 2500K, but both give over 60 FPS). If you are so angered, irritated, and infuriated by the Intel fanboys smearing your AMD system, then why keep it?

You had a 30-day window to return it with a restocking fee, and you can still put it right on eBay/Craigslist. You can even use the money to buy an FX-8150 and an H100 and enjoy the goodness of a 4.5 GHz Bulldozer (which is worse than the Phenom II X6, but whatever).

Edit: Okay, the CPU is the only part of a modern system that Newegg doesn't allow an 85% return on. The point still remains: it doesn't take much effort at all to sell a 2500K on eBay.

I bought it used, so none of the stuff you're talking about applies, and even if it did, why would I go through all the trouble to go back to a similar system??? Stop being ridiculous.

It was at best a side grade, and the 2500K uses a lot less power, so in that regard it's better. Was it worth the time of finding all the used parts and putting the system together? Absolutely not. Hopefully someone contemplating such an "upgrade" will learn from my experience.

For me, I don't care about Nvidia, AMD, or Intel; I go with whoever is best. The 2500K simply didn't justify the time to buy it and put it together compared to my X6 1100T, or any Phenom II for that matter.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Where are the frame latency measurements in your "benchmarks"? Did you ever redo them when members here pointed out the problems with your benchmarking methodology?

Yes, I can say Intel is better in gaming in general.

You also know that Techreport has benched more than four games in the past because I provided you the link the last time you were posting your blog as fact.

Edit:
Here, I'll give to you again. http://techreport.com/articles.x/22835/6

The link you gave me only has 5 games, and 4 of them are the same as in the review we are talking about.

They ran Batman: AC in DX9 mode; they ran Crysis 2 at Very High, not the Ultra setting that I used, and we don't even know if they used the high-resolution textures. In Civ 5 they didn't enable any MSAA filters.

So, no matter what you say, their benchmark tests used different (lower) IQ settings than mine. You would not play those games at low IQ settings if you had an HD 7950 or the ASUS HD 7970 1GHz model that I used in my review.

There are games, like F1 2011, that need a faster CPU even at 1080p with AA filters enabled, but there are also games that are GPU-limited at those resolutions and IQ settings. If you use a lower-end GPU, it will only become more evident.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Let me put it another way that you may understand... If the CPU doesn't matter, how come professional websites (not blogs) always use Intel CPUs to test video cards? If the CPU really didn't matter, then testing with an AMD CPU would make just as much sense.

I have never said that the CPU doesn't play a part; I have always said that each game is different, and that some games need a faster CPU while others don't. And that is exactly what was shown in my review.

I could do a GPU review comparing two different GPUs, like the HD 7970 vs the GTX 670 or lower models, using the FX-8150 at 4.6 GHz, and reach exactly the same conclusion I would have reached with an Intel CPU. The faster card will produce more fps no matter what CPU you use at 1080p with high IQ settings and AA filters on.