Article "How much does the CPU count for gaming?" - The answer is suprising.


AnitaPeterson

Diamond Member
Apr 24, 2001
6,022
561
126
Engadget actually went off the beaten path and pitted a Ryzen 3 3300X against an i9-10900.
Lo and behold, a quad-core budget CPU holds up quite well against a 10-core beast, running on similar specs (motherboard, RAM, GPU).
The conclusion? "If you’re building a gaming PC, unless you’re aiming for ultra-high framerates over everything else, you may be better off putting that money towards a better GPU."

 

Gideon

Platinum Member
Nov 27, 2007
2,030
5,035
136
One thing I did really like about the Eurogamer review is the box plots.

This plot has almost everything: 1% lows, 5% lows, mean FPS, 5% highs, 1% highs, all with nice on-hover annotations. Only min/max FPS are missing (which could be done with outlier/extremum dots; see here for more on box plots).

Compare that to most sites, where you have to look at a different bar chart for each data point (average, 95% low, etc.), which gets confusing quickly once you're trying to remember more than 2 CPUs at a time.

This chart gives a much better indication of where each CPU stands (and it also covers the 5% and 1% highs, which most reviews ignore but I personally still find useful). 95% of the frames are within the rectangle, the average is the line in the middle, and the lines to the sides are the 1% lows/highs.

I really hope more review sites use something like that.

[Image: box-plot chart from the Eurogamer review (HwYSSeq.png)]
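
For anyone curious, a chart like that is easy to knock together. Here's a minimal sketch in Python/matplotlib with made-up FPS numbers; the mapping of percentiles to box and whiskers is my guess at what the Eurogamer chart does, not their actual methodology:

Code:
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical frame-rate samples for two CPUs (random data, illustration only).
rng = np.random.default_rng(0)
fps = {"CPU A": rng.normal(140, 15, 5000), "CPU B": rng.normal(120, 8, 5000)}

# One box per CPU: the box spans the 5%..95% percentiles, the whiskers the
# 1%..99%, and the centre line is the mean, mirroring the chart described above.
stats = []
for name, samples in fps.items():
    p1, p5, p95, p99 = np.percentile(samples, [1, 5, 95, 99])
    stats.append({"label": name, "whislo": p1, "q1": p5,
                  "med": samples.mean(), "q3": p95, "whishi": p99})

fig, ax = plt.subplots()
ax.bxp(stats, showfliers=False, vert=False)  # no outlier dots, horizontal boxes
ax.set_xlabel("FPS")
plt.show()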
 
  • Like
Reactions: amrnuke

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
Looking at the graph above and the 9700K outperforming the 9900K: is there any review of the 10900K with HT disabled? That should improve gaming results even more, and 10 cores are plenty for everything else that isn't handled by that juicy 3950X or Threadripper.
 

DrMrLordX

Lifer
Apr 27, 2000
22,921
12,993
136
Looking at the graph above and the 9700K outperforming the 9900K: is there any review of the 10900K with HT disabled? That should improve gaming results even more, and 10 cores are plenty for everything else that isn't handled by that juicy 3950X or Threadripper.

Haven't seen one yet but it would be nice.
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
In the last few years, games like Battlefield and Assassin's Creed have been the ones pushing the 4C limits. Yet the exact same game and map runs on a console with 8 joke cores, some of which are not even available to the game. This is called being lazy, guys, not a technical improvement.

Having 16T used below 20% is not that much different from having 8T at 50%, and the latter has less thread-concurrency overhead.
A game like Assassin's Creed pegging a 4T CPU at 100% probably does so because it launches the exact same number of async/MT tasks it would launch on an 8T CPU, and pays the extra concurrency overhead. This is called being lazy. CDPR did an excellent job with W3: it runs perfectly on 4T CPUs, and does well even on 2C/4T ones, and it's not that different from an AC game.
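
To make the point concrete, here's a minimal sketch in Python (not how a real engine schedules work, just the idea): size the worker pool to the machine instead of hardcoding a task count, and leave a couple of threads free for the OS and background tasks.

Code:
import os
from concurrent.futures import ThreadPoolExecutor

def update_entity(entity_id):
    # Stand-in for per-entity game work (AI, pathfinding, physics, etc.).
    return entity_id * 2

entities = range(10_000)

# The "lazy" approach: always spawn the same number of workers (e.g. 8),
# paying concurrency overhead on CPUs with fewer hardware threads than that.
# Sizing to the hardware instead, minus ~2 threads for OS/background work:
workers = max(1, (os.cpu_count() or 4) - 2)

with ThreadPoolExecutor(max_workers=workers) as pool:
    results = list(pool.map(update_entity, entities))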

Now devs are going to do the same, but with something I can actually call a proper 8C CPU. Don't be surprised, though, if games end up only being able to use about 6 of those cores, as 1 or 2 are always reserved for the OS and background tasks. Still, I would really not want to be on 4C/8T from 2021 onwards.
 

beginner99

Diamond Member
Jun 2, 2009
5,318
1,763
136
The exact same game and map runs on a console with 8 joke cores, some of which are not even available to the game. This is called being lazy, guys, not a technical improvement.

BF games on console allow fewer players. I think it's 32 vs 64 on PC. And that has a huge effect on CPU usage.
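
As a rough back-of-the-envelope (assuming player-vs-player work like hit detection and replication scales roughly with the square of the player count), 64 players means 64²/32² = 4× the per-tick cost of 32 players, so halving the count buys the console CPUs a lot of headroom.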
 

jpiniero

Lifer
Oct 1, 2010
16,816
7,258
136
In the last few years, games like Battlefield and Assassin's Creed have been the ones pushing the 4C limits. Yet the exact same game and map runs on a console with 8 joke cores, some of which are not even available to the game. This is called being lazy, guys, not a technical improvement.

IIRC, the cross-CCX penalty on the console APUs is extremely high. There are probably some games which don't bother to use more than 4 threads because of that.
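
For what it's worth, that kind of workaround is visible even from user space. A minimal Linux-only sketch (the core-to-cluster mapping is hardware-specific, so cores 0-3 being one CCX/module is an assumption; a real engine would query the topology):

Code:
import os

# Pin this process to the first four logical cores so all of its threads
# stay on one CCX/module and never pay the cross-cluster latency penalty.
# os.sched_setaffinity is Linux-only, hence the guard.
if hasattr(os, "sched_setaffinity"):
    os.sched_setaffinity(0, {0, 1, 2, 3})
    print("Running on cores:", sorted(os.sched_getaffinity(0)))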
 

ondma

Diamond Member
Mar 18, 2018
3,310
1,697
136
In the last few years, games like Battlefield and Assassin's Creed have been the ones pushing the 4C limits. Yet the exact same game and map runs on a console with 8 joke cores, some of which are not even available to the game. This is called being lazy, guys, not a technical improvement.

Having 16T used below 20% is not that much different from having 8T at 50%, and the latter has less thread-concurrency overhead.
A game like Assassin's Creed pegging a 4T CPU at 100% probably does so because it launches the exact same number of async/MT tasks it would launch on an 8T CPU, and pays the extra concurrency overhead. This is called being lazy. CDPR did an excellent job with W3: it runs perfectly on 4T CPUs, and does well even on 2C/4T ones, and it's not that different from an AC game.

Now devs are going to do the same, but with something I can actually call a proper 8C CPU. Don't be surprised, though, if games end up only being able to use about 6 of those cores, as 1 or 2 are always reserved for the OS and background tasks. Still, I would really not want to be on 4C/8T from 2021 onwards.

This was not my experience with Witcher 3. I played it with a 1060 6GB and an i5-2320, and it stuttered horribly, to the point of being almost unplayable.
With the same GPU, it ran perfectly with an 8700K. Of course, that is not a direct comparison, because it was a new platform with more and faster RAM, but I would think the stuttering on the old system was caused mainly by the CPU.
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
IIRC, the cross-CCX penalty on the console APUs is extremely high. There are probably some games which don't bother to use more than 4 threads because of that.

Well, the consoles don't use CCXs like Ryzen does, at least not in the same way. Pretty sure it's just two 4-core Jaguar modules fused together, with no IF, even if they are point-to-point. So I'm pretty sure latency is way up across the board, even within one of the modules. It's not really an architecture built from the ground up to be interlinked like Zen. It's been pretty evident that the consoles shy away from any real CPU usage whenever possible; they just aren't very good cores, no matter how many of them you stack.

One of the big things I am looking forward to is consoles allowing for better AI with the new design. That means we can get back to the days of awesome games like FEAR and getting smart opponents.
 
  • Like
Reactions: Elfear

jpiniero

Lifer
Oct 1, 2010
16,816
7,258
136
Looking at MS, they sure are pushing BC hard. Sony is more willing to do PS5-exclusive games, but I can't say how many will actually make it to PC.

It's kinda weird: MS has the far more powerful console but no games to take advantage of it. But it affects PC, since games are going to continue to use the XBO as the baseline.
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
This was not my experience with Witcher 3. I played it with a 1060 6GB and an i5-2320, and it stuttered horribly, to the point of being almost unplayable.
With the same GPU, it ran perfectly with an 8700K. Of course, that is not a direct comparison, because it was a new platform with more and faster RAM, but I would think the stuttering on the old system was caused mainly by the CPU.

That was definitely some other issue there. My first run of W3 was on a 2500K + 750 Ti with no issues. Some years later I played W3 twice on a 3200G, once with the IGP and once with an RX 570, without a single issue. And I even did some test runs on a G5400 and a 200GE with dGPUs, no problems.
 
  • Like
Reactions: dvsv