How much does a fast CPU really help in games? pclab.pl 2012-2016 article

Page 2

MrTeal

Diamond Member
Dec 7, 2003
Too bad, that is the most obvious observation. Hmmm....I guess it all comes down to phrasing. Let me put it this way:

If I had an AMD CPU I would much rather own an Nvidia GPU. AMD+AMD is obviously a bad combo.

While a comparison between AMD and Nvidia belongs in VC&G, a comparison of the effect of CPU scaling between them certainly seems in place here. It's pretty well known that AMD's drivers are much more dependent on single-thread speed, hence AMD GPUs doing proportionally worse on the FX CPU than Nvidia's. It would have been interesting to see two comparably priced low-end systems: an overclocked FX-8320E (essentially the 9590 result) and an i3-6100.
 

Papa Hogan

Senior member
Feb 1, 2011
Very interesting read!

So, would a Sandy Bridge i7 (K model, of course) be a considerable upgrade over an i5-2500K @ 4.5 GHz? There's not a huge difference in performance at this point between different i7s from bridge to bridge, right? $438 for a Sandy i7 on Amazon isn't cheap, though it's probably less than a whole new Skylake setup. Also, I don't ever want to delid a CPU to overclock it. Looks ridiculously risky to me.

If I went Skylake, it'd be:

motherboard
cpu
16 GB ddr4
(oh, and windows 10 :()

vs.

new cpu only?


Thanks!
 

StrangerGuy

Diamond Member
May 9, 2004
What this discussion seems to leave out for some reason is that what *kind* of game (as well as the underlying code behind the game) you are playing greatly influences how involved the CPU is. As we all know, all games are not created equal.

All four of the games in that chart are primarily single-player games. Games with large-scale multiplayer components and MMOs especially are *much* more heavily reliant on the CPU due to all the netcode involved.

In those games an i5 is still more than adequate when paired with a solid GPU, but the performance gap between, say, an i3 and an i5 is one that's large enough to sway purchase decisions and price/performance metrics.

If I paid $100 more new for a 2600K over the 2500K, I still get $80 more back when I sell it used, all other things being equal. A $20 "loss" in exchange for 4+ years of Hyper-Threading. Big F-ing deal, says the "lolol i5 for gaming or GTFO" crowd.

This is even more lopsided in favor of the much higher stock-clocked 4790K, which works on any Haswell board without overclocking.
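The resale arithmetic above can be written out explicitly. The figures are the poster's own anecdotal numbers, not market data:

```python
# Worked version of the 2600K-vs-2500K cost argument above.
# All dollar figures come from the post itself.
premium_paid = 100       # extra paid new for the 2600K over the 2500K
premium_recovered = 80   # extra recovered when selling it used
net_cost = premium_paid - premium_recovered
years_owned = 4
print(f"net cost: ${net_cost} (~${net_cost / years_owned:.0f}/year for HT)")
```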
 

Dasa2

Senior member
Nov 22, 2014
In my experience, disabling HT results in at most a ~5% drop in FPS in a game with decent multithreading.

The exception was Crysis, which has a much larger drop in some CPU-limited levels but not others, leading me to think it tries to feed a fixed number of threads depending on the level. When programs try to dispatch more threads of work than the CPU can handle, they seem to take a hit, which can make HT look better than it is.

So no, I don't think swapping from an i5 to an i7 is worth it for gaming. You'll see larger gains moving from 1600C9 to 2133C9 RAM, which is probably still not worth the cost of replacing the existing hardware.
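The oversubscription point above can be sketched in code. This is a generic illustration, not anything from the games discussed: the worker counts and task sizes are made up, and on CPython the GIL limits true parallelism for CPU-bound threads, so treat any timing difference as indicative only.

```python
import os
import time
from concurrent.futures import ThreadPoolExecutor

def spin(n):
    # CPU-bound busy work: sum of squares 0..n-1.
    total = 0
    for i in range(n):
        total += i * i
    return total

def run(workers, tasks=16, n=100_000):
    # Run `tasks` identical chunks of work on a pool of `workers` threads;
    # return (elapsed seconds, results).
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(spin, [n] * tasks))
    return time.perf_counter() - start, results

cores = os.cpu_count() or 1
t_matched, res_matched = run(cores)      # one worker per core
t_oversub, res_oversub = run(cores * 8)  # heavily oversubscribed
print(f"{cores} threads: {t_matched:.3f}s  {cores * 8} threads: {t_oversub:.3f}s")
```

The results are identical either way; the extra threads only add scheduling and context-switch overhead, which is the "hit" described above.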
 

superstition

Platinum Member
Feb 2, 2008
One always gets the i7 when budget is not the issue. If not for the increase in average FPS, then at least for smoothing out the minimum-FPS dips.
I haven't seen any evidence that this applies to the 5675C because of its EDRAM.

Does anyone have some?

It even beat the 5775C in some gaming tests, presumably because of less thermal throttling.