Question - Significant gains with extra threads/cores (1080p): myth or reality?


jana519

Senior member
Jul 12, 2014
771
100
106
There's a popular myth that extra threads/cores will result in significant performance gains for gaming at 1080p.

I present to you exhibit A:

[Benchmark charts: Grand Theft Auto V, Grid Autosport, and Shadow of Mordor at 1080p]


Here we see from none other than our own Ian Cutress that in 3 games benchmarked at 1080p, there is less than a 2% performance gain realized from 4 additional threads.

Based on this data, it's ridiculous to recommend extra threads/cores for the average PC gamer. We, as a community, need to stop propagating this myth.

I look forward to your vigorous and data-based rebuttals.
 

aleader

Senior member
Oct 28, 2013
502
150
116
Here's a good video from Tech Deals looking at 4c/4t CPUs. It's titled "4 Core Gaming — Dead or Alive ???"

Spoiler Alert: DEAD.

So I should throw out my 4c i5 4670K even though I play modern games (DCS World, Squad, Insurgency Sandstorm, Post Scriptum, Anno 1800, The Division 2, etc.) at 1440p on high settings without any stuttering? To say it's 'dead' is ridiculous and dramatic, not to mention smug and elitist. Now, if all you play is FPS games where you think you need 144 fps...there are some games that do benefit from more cores...maybe?
 

Hitman928

Diamond Member
Apr 15, 2012
5,243
7,791
136
So I should throw out my 4c i5 4670K even though I play modern games (DCS World, Squad, Insurgency Sandstorm, Post Scriptum, Anno 1800, The Division 2, etc.) at 1440p on high settings without any stuttering? To say it's 'dead' is ridiculous and dramatic, not to mention smug and elitist. Now, if all you play is FPS games where you think you need 144 fps...there are some games that do benefit from more cores...maybe?

I agree that dead is too extreme. But to deny that you can see significant, sometimes even massive, gains by moving up from 4 cores with many modern games is absurd.
 

aleader

Senior member
Oct 28, 2013
502
150
116
I agree that dead is too extreme. But to deny that you can see significant, sometimes even massive, gains by moving up from 4 cores with many modern games is absurd.

Maybe, but it's also true (from the benchmarks) that it isn't going to make any difference unless you own a monitor above 75Hz. For the games I play at 1440p, 75 fps and smooth gameplay is plenty good enough.
 

Hitman928

Diamond Member
Apr 15, 2012
5,243
7,791
136
Maybe, but it's also true (from the benchmarks) that it isn't going to make any difference unless you own a monitor above 75Hz. For the games I play at 1440p, 75 fps and smooth gameplay is plenty good enough.

I don't know, those BF5 and AC:O results are pretty brutal in the 4-core tests. Lots of rough frame delivery. Is it playable? Sure, but it isn't smooth by any means.

Edit: Yes, the games you play will be a factor. There are still lots of games made today that play just fine on 4 cores. But in general terms it's hard to say that 4 cores are a good or safe bet in 2020 and beyond.
 
Last edited:
  • Like
Reactions: Elfear

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,654
136
Maybe, but it's also true (from the benchmarks) that it isn't going to make any difference unless you own a monitor above 75Hz. For the games I play at 1440p, 75 fps and smooth gameplay is plenty good enough.

Well, you should check out Linus's two tests dealing with response time. In multiplayer, even with a 60Hz monitor, high FPS can really help a player in regards to frame freshness and the human response time aspect. In terms of general play I am still not sold on the "smoothness" of going well beyond 60Hz. But there is something to minimizing the frame lag between refreshes in frantic multiplayer.
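
To put rough numbers on the frame-freshness point, here's a back-of-the-envelope sketch (my own illustration, not Linus's methodology): if the game's render loop isn't synced to the display, the frame a 60Hz panel scans out is anywhere from brand new to one full render interval old, so about half an interval on average.

```python
# Back-of-the-envelope sketch (my own illustration, not Linus's methodology):
# with a render clock that isn't synced to the display, the frame shown at each
# 60 Hz scanout is between 0 and one render interval old - about half on average.
for render_fps in (60, 120, 240):
    interval_ms = 1000.0 / render_fps  # time between finished frames
    print(f"{render_fps:>3} FPS on a 60 Hz panel: displayed frame is "
          f"~{interval_ms / 2:.1f} ms old on average "
          f"(worst case ~{interval_ms:.1f} ms)")
```

So going from 60 to 240 FPS cuts the average staleness of what's on screen from roughly 8 ms to roughly 2 ms, even though the refresh rate never changes.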
 

jana519

Senior member
Jul 12, 2014
771
100
106



[Embedded YouTube video from the channel "Testing Games"]

Food for thought.

The YouTube video description states user "Testing Games" is using a Ryzen 9 3900X/16GB 3600MHz system with an RTX 2080 Ti. It shows a 5-segment, horizontal split-screen of a handful of PC games running the exact same sequence side-by-side. Each segment contains a real-time core utilization, FPS average, and 0.1% minimum FPS counter. The segments are labeled "4, 6, 8, 10, 12" to indicate cores.

I note the video has no introduction or explanation of methodology on how results were achieved. It is pure, raw gaming footage with minimal production. The software used for FPS/utilization counters isn't named, nor is it clear how 5 segments of the same game sequence shot independently could be so in sync. Additionally, no footage or testing of any processor other than the 3900X is shown in the video. So, at the outset the video lacks any methodology or comparisons that would lend it additional credence and/or gravity. In my mind at least, it appears highly questionable.

I must be very outdated but I remember years ago content creators and reviewers were held to much higher standards. I guess times have changed though. That this video has 400k views and 6k likes is beyond baffling to me.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,542
14,496
136
The YouTube video description states user "Testing Games" is using a Ryzen 9 3900X/16GB 3600MHz system with an RTX 2080 Ti. It shows a 5-segment, horizontal split-screen of a handful of PC games running the exact same sequence side-by-side. Each segment contains a real-time core utilization, FPS average, and 0.1% minimum FPS counter. The segments are labeled "4, 6, 8, 10, 12" to indicate cores.

I note the video has no introduction or explanation of methodology on how results were achieved. It is pure, raw gaming footage with minimal production. The software used for FPS/utilization counters isn't named, nor is it clear how 5 segments of the same game sequence shot independently could be so in sync. Additionally, no footage or testing of any processor other than the 3900X is shown in the video. So, at the outset the video lacks any methodology or comparisons that would lend it additional credence and/or gravity. In my mind at least, it appears highly questionable.

I must be very outdated but I remember years ago content creators and reviewers were held to much higher standards. I guess times have changed though. That this video has 400k views and 6k likes is beyond baffling to me.
Well, if you had looked at the 500 zillion reviews in the past 5 years, instead of your very outdated one, you would see. 4 cores in 2020 only work well at lower resolutions in older games. If that's all you play, then fine. But modern-day games can use at least 6-8 cores, better video cards, higher resolutions, higher refresh, etc...
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,131
1,088
136
On my 3570K, BF5 was painful to play. Smoothness and dropped frames, regardless of the FPS number, are not captured by benchmarks. With the same graphics card and a 3600, BF5 is buttery smooth with no frame drops. I don't think the 7700K has problems with games as much as an old 3570K does. CPU utilization never hit 100% with the 3570K, and the 3600 barely goes above 50% during BF5 games.

The other thing that crushes 4C processors is multi-monitor streaming, YouTube, and web browsing while gaming. At some point the GPU is what limits you, not the CPU, once you have 6C/12T CPUs or more.
 

jpiniero

Lifer
Oct 1, 2010
14,584
5,206
136
Well, if you had looked at the 500 zillion reviews in the past 5 years, instead of your very outdated one, you would see. 4 cores in 2020 only work well at lower resolutions in older games. If that's all you play, then fine. But modern-day games can use at least 6-8 cores, better video cards, higher resolutions, higher refresh, etc...

What? That's not how it works. 4C4T has big problems, but 6C6T and 4C8T are very much fine... at least on Skylake and friends. Now, the stock memory bandwidth does limit performance.
 

lyssword

Diamond Member
Dec 15, 2005
5,761
25
91
Well, if you had looked at the 500 zillion reviews in the past 5 years, instead of your very outdated one, you would see. 4 cores in 2020 only work well at lower resolutions in older games. If that's all you play, then fine. But modern-day games can use at least 6-8 cores, better video cards, higher resolutions, higher refresh, etc...
The YouTube video description states user "Testing Games" is using a Ryzen 9 3900X/16GB 3600MHz system with an RTX 2080 Ti. It shows a 5-segment, horizontal split-screen of a handful of PC games running the exact same sequence side-by-side. Each segment contains a real-time core utilization, FPS average, and 0.1% minimum FPS counter. The segments are labeled "4, 6, 8, 10, 12" to indicate cores.

I note the video has no introduction or explanation of methodology on how results were achieved. It is pure, raw gaming footage with minimal production. The software used for FPS/utilization counters isn't named, nor is it clear how 5 segments of the same game sequence shot independently could be so in sync. Additionally, no footage or testing of any processor other than the 3900X is shown in the video. So, at the outset the video lacks any methodology or comparisons that would lend it additional credence and/or gravity. In my mind at least, it appears highly questionable.

I must be very outdated but I remember years ago content creators and reviewers were held to much higher standards. I guess times have changed though. That this video has 400k views and 6k likes is beyond baffling to me.

The video should have 800k views. Your dumb trolling should get 0 replies. Done.
 
  • Haha
  • Like
Reactions: Elfear and Zstream

ondma

Platinum Member
Mar 18, 2018
2,720
1,280
136
Right. I'm sure it's just a total coincidence that the end result of this would most likely be people recommending Core i7/i9 CPUs over Zen 2.
You mean just like dismissing 1080p gaming as a "corner case" or a "niche" to make Ryzen look better?
 

guachi

Senior member
Nov 16, 2010
761
415
136
To say it's 'dead' is ridiculous and dramatic, not to mention smug and elitist.

You can get a 6c/12t Ryzen 2600 for $120. I don't think $120 is elitist. I mean, the YouTube channel is called "Tech Deals". The guy who runs it is all about getting the best for cheap.

And I admit I overstated DEAD. It's been a while since I watched it, so I rewatched his conclusion. He said "rough" about a dozen times about the 4c/4t CPU. But he also said the games actually ran, that keeping your current CPU costs nothing, that gaming on something better was much more enjoyable, and that if you can afford something better (like a $120 Ryzen 2600) you'd really notice the difference.
 

UsandThem

Elite Member
May 4, 2000
16,068
7,380
146
Honestly, I'm not even really sure what the argument is here.

If someone already has a quad-core CPU in their current build, and it performs everything well enough for the user's needs/uses, then great; keep using it until it doesn't.

Outside of that, for people who need to build a new PC and use it going forward, it would make little sense to buy a quad-core CPU today (and honestly, outside of the low-end products, it's not even an option). Intel's and AMD's mid-range products are 6 cores and above. I can't see people building a new PC primarily used for gaming and suggesting the person go with a quad-core CPU like the i3-9100 / Pentium Gold or AMD Ryzen 3.

Anyways, I'll leave this 7700K "revisit" article here, and anyone can feel free to argue/discuss it to death if they want to.

https://www.gamersnexus.net/guides/3423-intel-i7-7700k-revisit-benchmark-vs-9700k-2700-9900k
I must be very outdated but I remember years ago content creators and reviewers were held to much higher standards.
That's a two-way street, ya' know? I remember back when posters who made such bold claims were held to higher standards as well. :cool:
 

Iron Woode

Elite Member
Super Moderator
Oct 10, 1999
30,876
12,383
136
The YouTube video description states user "Testing Games" is using a Ryzen 9 3900X/16GB 3600MHz system with an RTX 2080 Ti. It shows a 5-segment, horizontal split-screen of a handful of PC games running the exact same sequence side-by-side. Each segment contains a real-time core utilization, FPS average, and 0.1% minimum FPS counter. The segments are labeled "4, 6, 8, 10, 12" to indicate cores.

I note the video has no introduction or explanation of methodology on how results were achieved. It is pure, raw gaming footage with minimal production. The software used for FPS/utilization counters isn't named, nor is it clear how 5 segments of the same game sequence shot independently could be so in sync. Additionally, no footage or testing of any processor other than the 3900X is shown in the video. So, at the outset the video lacks any methodology or comparisons that would lend it additional credence and/or gravity. In my mind at least, it appears highly questionable.

I must be very outdated but I remember years ago content creators and reviewers were held to much higher standards. I guess times have changed though. That this video has 400k views and 6k likes is beyond baffling to me.
Let's take this one whiny point at a time:

1. I note the video has no introduction or explanation of methodology on how results were achieved. - It states that in the description

a Ryzen 9 3900X/16GB 3600MHz system with an RTX 2080 Ti. It shows a 5-segment, horizontal split-screen of a handful of PC games running the exact same sequence side-by-side. Each segment contains a real-time core utilization, FPS average, and 0.1% minimum FPS counter. The segments are labeled "4, 6, 8, 10, 12" to indicate cores.

2. It is pure, raw gaming footage with minimal production. - It's a video about testing CPUs in games, not a Hollywood production. People want the stats, not fluff about what type of shirt the guy is wearing.

3. The software used for FPS/utilization counters isn't named - MSI Afterburner

4. nor is it clear how 5 segments of the same game sequence shot independently could be so in sync. - That's called video editing.

5. Additionally, no footage or testing of any processor other than the 3900X is shown in the video - Your original post was

'Question - Significant gains with extra threads/cores (1080p): myth or reality?'

There's a popular myth that extra threads/cores will result in significant performance gains for gaming at 1080p.

This thread is about core count versus game performance at 1080P. The video destroys your hypothesis.

6. I must be very outdated but I remember years ago content creators and reviewers were held to much higher standards. - It must be sad to have to resort to criticizing "standards" rather than admitting you are wrong.
 
  • Like
Reactions: Mopetar and f2bnp

jana519

Senior member
Jul 12, 2014
771
100
106
Let's take this one whiny point at a time:

1. I note the video has no introduction or explanation of methodology on how results were achieved. - It states that in the description


2. It is pure, raw gaming footage with minimal production. - It's a video about testing CPUs in games, not a Hollywood production. People want the stats, not fluff about what type of shirt the guy is wearing.


3. The software used for FPS/utilization counters isn't named - MSI Afterburner


4. nor is it clear how 5 segments of the same game sequence shot independently could be so in sync. - That's called video editing.

I'll rephrase: the segments show the exact same sequence played, not shot. I don't know how you would play through the exact same sequence 5 times. Most reviewers use benchmarking tools or rough areas of a game.

5. Additionally, no footage or testing of any processor other than the 3900X is shown in the video - Your original post was

This thread is about core count versus game performance at 1080P. The video destroys your hypothesis.

My hypothesis stands if you're using a 1080p 60Hz monitor, which most people do. But yes, you're right that I was wrong about gaming on a 144Hz monitor. :D
 

Gideon

Golden Member
Nov 27, 2007
1,619
3,645
136
My hypothesis stands if you're using a 1080p 60Hz monitor, which most people do. But yes, you're right that I was wrong about gaming on a 144Hz monitor. :D

Even on a 60Hz monitor it's not quite that simple. Tech Report has a classic article that provides a very thorough explanation of why FPS isn't a good metric and why frame times are much better. I think someone already posted it, but here's the link again. Here is another, more recent one, explaining why even using min-FPS is not good enough.

The thing is that your average FPS might be above 60, but if some frames take much longer than others, you'll still notice very distracting "stutters" and "janks".

With a constant frame rate pegged at 60 FPS, each frame would take exactly 16.7 ms; at 30 FPS it would be 33.3 ms. But the framerate is not constant. You can easily be getting over 60 FPS on average but have tens or hundreds of frames taking longer than 33 ms or even 50 ms. It will be noticeable and it will be distracting.

For instance, look at the Far Cry 5 results from the Tech Report 3700X review.

All the CPUs are comfortably above 60 FPS:
[Chart: Far Cry 5 average FPS]


yet they are not running perfectly smoothly above 60 FPS all the time:

[Chart: Far Cry 5 time spent beyond 33.3 ms]


This is the amount of time in the benchmark spent below 30 FPS, per CPU. Bear in mind this is in milliseconds, meaning that even the 9900K had one spike during the benchmark (oddly, the 8700K did not). The spike looks like this:

[Chart: Far Cry 5 frame-time plot, 9900K]


and on slower CPUs there are multiple spikes that are usually longer:
[Chart: Far Cry 5 frame-time plot, 1800X]


Each of them would definitely be noticeable on-screen. Bear in mind that this is despite the fact that even the 1800X ran at over 100 FPS on average.

Overall you are more or less correct in saying that at 1080p and 60Hz, 4 really fast cores (e.g. i3-8350K and above) are still usually enough, but even then some games are smoother with 6+ cores (see the linked Civ 6 results, where even the 8350K can't hold 60 FPS all the time, yet the i5-8400 can). For 144Hz, there is no question that a lot of cores are needed.
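
To make that concrete, here's a minimal sketch (my own illustration, not Tech Report's actual tooling) of how two runs with the same average FPS can produce wildly different "time spent beyond 33.3 ms" figures like the chart above:

```python
# Minimal sketch (my own illustration, not Tech Report's tooling): two runs
# with identical average FPS but very different "time spent beyond 33.3 ms"
# (i.e. time spent below 30 FPS).
def summarize(frame_times_ms):
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    # total time accumulated past the 33.3 ms threshold on slow frames
    beyond_33 = sum(t - 33.3 for t in frame_times_ms if t > 33.3)
    return avg_fps, beyond_33

smooth = [10.0] * 1000              # 10 s of rock-steady 100 FPS
spiky = [9.0] * 990 + [109.0] * 10  # also 10 s total, but with ten ~109 ms hitches

for name, run in (("smooth", smooth), ("spiky", spiky)):
    fps, beyond = summarize(run)
    print(f"{name}: {fps:.0f} FPS average, {beyond:.0f} ms spent beyond 33.3 ms")
```

Both runs average exactly 100 FPS, but the spiky one spends about three quarters of a second below 30 FPS, and every one of those hitches is visible on screen.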
 

krumme

Diamond Member
Oct 9, 2009
5,952
1,585
136
Of course it's game-dependent, and you don't upgrade before you feel it - typically in the 0.1% to 1% mins. The question then is: what do you upgrade to?
When my OC'd 3570K hit 15 fps in BF1 MP64 intense clustermess maps, I knew it was time.
At that time none of the big guys tested BF1 MP64. The problem was that single-player and the usual benchmarks were in no way representative. For reviewers, the problem is that testing MP in BF1 and getting consistent results is impossible.
Computerbase at that time started to do real-world MP64 testing on the different maps. There is, btw, a huge difference between those maps and the way they were played. That testing showed a 7700K dipping below 40 fps, something a few YouTubers have documented.
Unfortunately, as far as I can tell, Computerbase has stopped that testing. I guess both because it's expensive and because the results are not in any way precise.
BF1 issued 10 threads and could really use all the power of a 6C/8C - to the max. At that time I got a 1700; now I have an 8700K. I game at 60Hz, and some games were bad on Ryzen 1 & 2, just like the 7700K tanked in BF1/5.
I think with a Ryzen 3600 and up, you can have your cake and eat it too. The many cores make sure you will not be throughput-limited in the foreseeable future, and heck, a 3600 on a B450 does perfectly fine. Same as an 8700K to me for 99% of gaming needs, just cheaper. And a 3700X seems pretty future-proof to me, lean and properly cheap. I would go for that in nearly all situations.
The situation for gamers has never been simpler or cheaper.
 

HutchinsonJC

Senior member
Apr 15, 2007
465
202
126
My hypothesis stands if you're using a 1080p 60Hz monitor.

A game's playability is subjective. If I wanted to play a game and I was OK with 24 fps to get through it, or OK with 1024x768, does that change the objective measurements that say it's worse quality? I mean, what are you trying to prove, really? Look at the frame times and the content people have already pointed you to. FPS isn't the whole story.

And if you're going to place yourself in a bubble of outdated software, of course you'll find yourself with a decreased need for upgraded hardware. Games didn't stop with Grand Theft Auto V (2013), Grid Autosport (2014), and Shadow of Mordor (2014), all of which your whole argument seems to be contingent upon and which were specifically pointed out in your original post.

And if you think the PlayStation 4 (2013) and Xbox One (2013) both having 8 cores didn't set into motion a push for taking advantage of more cores in gaming in the years following, you're crazy.
 

Roger Wilco

Diamond Member
Mar 20, 2017
3,868
5,709
136
Maybe confirmation bias. I'm looking for evidence to disprove my assertion, and there's been scant evidence presented so far. I presented 3 popular AAA PC titles with the resolution and average FPS in the image as clear as day, for all to see. One of those games (GTA V) still has tens, if not hundreds, of thousands of players today.



No game title, no resolution shown. It could be a AAA title in 1080p, it could be a Chinese MMORPG in 4K for all I know.

You can call me old-fashioned, but I kind of liked the days when data and results were presented in a straightforward, clear format. Less clickbaity headlines, more substance and content. Not a fan of flash in the pan YouTube reviewers that are here one year and gone the next like so many media socialites on Twitter and Instagram. Give me good old-fashioned websites that have some actual professional journalism.

Your assertion is based on three six-year-old games, all of which were initially built for the PS3/Xbox 360. Why on earth would you "assert" anything with this? What exactly are we disproving? WoW is a popular AAA PC title with tens, if not hundreds, of thousands of players today, and it will run flawlessly on 2C/4T, and it is completely irrelevant to your assertion.

The YouTube videos provided to you are made by highly regarded channels that have existed for years; many of these channels are some of the best sources of objective tech journalism. If you really have some strange aversion to YouTube, just compare the 6600K to the 2700X or something in the AnandTech Bench. Why do we need to do this for you when it's so plainly obvious?

This is not difficult.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,628
158
106
Do people even know how they test these games?
The games are tested on a fresh Windows install with minimal programs running, and if the internet can be turned off, even better.

Just listen, for example, to the Full Nerd podcast about the 3950X, where he says he turns off the RGB app because it reduces benchmark numbers.

Is that how people run their machines?

Now, they do it for consistency - you can't have Windows Update or an anti-virus firing up a scan/update in the background at the exact same spot on every run of a benchmark.

So if your CPU resources are pegged at maximum and something else on your machine shows activity, your game experience will suffer.

Or you can have a 6c/12t for less than $100 and not notice anything.

Reviews are the best-case scenario for lower thread counts.

Now you can just look at something like [embedded video] where the 4690K and 4790K are tested in newer games.

The problem generally is not the average but the micro-stutters.
 
Last edited:

krumme

Diamond Member
Oct 9, 2009
5,952
1,585
136
Yeah, I remember upgrading a Core 2 Duo laptop with a new Penryn 12MB-cache CPU to get better mins in TF2. I think it went from 1.5 to 2.5 GHz, lol. I risked the entire machine, as I wasn't sure the front-side bus would handle it. Crazy what you do to stay competitive in those tight fights. It worked brilliantly and saved the gameplay. Going from 25 to 45 fps for the 1% lows is imo a vital step. Those mins always come when you need them least :)

TF2 was damn fun. Now I play some OW. Pretty fun when people stay positive.