Question Significant gains with extra threads/cores (1080p): myth or reality?

Page 3 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

ZGR

Golden Member
Oct 26, 2012
1,850
266
126
My buddy's R5 3600 is capable of 240 Hz in R6 Siege and Halo MCC. His previous 4c4t i5 got around 60-80 fps less in Siege than my 4c8t.

My 4c8t i7 is simply slower in games at 144 Hz vs an R5 3600. I can't even hope to hit 200 fps in Halo, and his 6c12t is locked at 240.

I definitely need more cores. Anyone who thinks gaming on a 4c4t is fine either plays at 60 Hz or doesn't play newer games.

Even games that run the AI on one core benefit from the larger caches that more cores bring. Not to mention hosting dedicated servers to play with friends... I see a large performance deficit on a 4c4t/4c8t when I host an ARMA mission with one other person. The 6c12t and up CPUs don't see that deficit when hosting. ARMA missions run at around 30 fps anyway, so even an R5 3600 is a big step up.
 
  • Like
Reactions: guachi

soresu

Golden Member
Dec 19, 2014
1,325
521
136
There's a popular myth that extra threads/cores will result in significant performance gains for gaming at 1080p.

I present to you exhibit A......
Depends on the engine, the backend API (i.e. DX12/Vulkan), and whether you are running other things in the background while playing, I guess.

The older APIs (DX11/OpenGL) tend to be more affected by lacking single-thread performance, whereas DX12 and Vulkan (or at least Vulkan?) were made with efficient multithreading in mind from their inception, and tend to benefit more from a higher core count, from what I have gleaned.
 

soresu

Golden Member
Dec 19, 2014
1,325
521
136
I might also add that the AMD and nVidia GPU driver stacks will likely respond differently to multithreading; it would certainly be worth checking a range of cards from both vendors against each other on the same CPUs.
 

dlerious

Senior member
Mar 4, 2004
815
167
116
I note the video has no introduction or explanation of the methodology behind the results. It is pure, raw gaming footage with minimal production. The software used for the FPS/utilization counters isn't named, nor is it explained how 5 segments of the same game sequence, shot independently, could be so in sync.
The 5 segments are likely from a built-in benchmark: it's the same sequence every time you run it. Assassin's Creed Odyssey has that feature according to this list, which goes back to 2004. Apparently it's not something new.

 

kaizer777

Junior Member
Nov 30, 2018
9
9
41
Given the evidence provided for higher 1% and .1% lows and often decent scaling with higher core counts on many game engines, what exactly are we confused about? If you're building a new system for gaming, you should buy what you can afford. If you can afford more than 4 cores, you'll clearly have a better gaming experience on average. If you already have a system, and it does what you want, then keep it.
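For anyone unfamiliar with how those 1% and 0.1% lows are derived, here is a minimal Python sketch. The frame times are made up for illustration, and the "average FPS over the slowest N% of frames" definition used here is just one of several in use by reviewers:

```python
def percentile_low_fps(frame_times_ms, fraction):
    """Average FPS over the slowest `fraction` of frames."""
    worst = sorted(frame_times_ms, reverse=True)    # slowest frames first
    n = max(1, int(len(frame_times_ms) * fraction))
    return 1000.0 / (sum(worst[:n]) / n)

# Hypothetical trace: 99 smooth frames at ~6.9 ms (~145 fps) plus one 50 ms hitch.
times = [6.9] * 99 + [50.0]
avg_fps = 1000.0 * len(times) / sum(times)

print(round(avg_fps))                           # 136 - the average still looks healthy
print(round(percentile_low_fps(times, 0.01)))   # 20  - the 1% low exposes the hitch
```

This is why a single hitchy frame barely moves the average but craters the lows, and why reviewers report both numbers.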
 

Makaveli

Diamond Member
Feb 8, 2002
4,065
130
106
Lol, popular myth according to whom?

And damn you guys and those Tech Report graphs. That was my 2nd favorite site after this one, and now it's a joke. :(
 
Last edited:
  • Like
Reactions: lobz

Arkaign

Lifer
Oct 27, 2006
20,516
971
126
One important thing to note is that globally, most PC gaming is not in the AAA big budget sphere, but rather stuff like LoL, WoW, SC2, CS, and indie/basic stuff. Indeed I believe I read that more PC gamers play with integrated graphics than with dGPUs as well (though that info was as of like 2016, not sure if it still holds, but it probably does).

There are damned lies and statistics, or something to that effect. As always, 'it depends'.

It's like the use case for the 9900KS (and yes, it's more than 1080p, lol). I'm a big proponent of it because as of yet, it's the closest I can bring my monitor to the higher refresh rates in the games that I play.

As always, I really REALLY wish people would remember that sites nearly always use 1080p Ultra/Max for testing. Which is ok, but you have to look at it for what it is. For example, I usually get better than 1080 ultra performance even at 3440x1440 Ultrawide, because there are many settings I can live without. I value high refresh, which translates as motion clarity, more than dialing everything to the max on my 2080ti. Hah, besides Q2 RTX, I don't even leave RT enabled. But that logic goes both ways. I live in my own edge case, and what is right for me is not for everyone, and vice versa.

Can 4C/4T work fine for lots of stuff? Yup. Can 4C/4T also be terrible for some? Yup.
 
  • Like
Reactions: scannall

lobz

Golden Member
Feb 10, 2017
1,250
1,245
106
One important thing to note is that globally, most PC gaming is not in the AAA big budget sphere, but rather stuff like LoL, WoW, SC2, CS, and indie/basic stuff. Indeed I believe I read that more PC gamers play with integrated graphics than with dGPUs as well (though that info was as of like 2016, not sure if it still holds, but it probably does).

There are damned lies and statistics, or something to that effect. As always, 'it depends'.

It's like the use case for the 9900KS (and yes, it's more than 1080p, lol). I'm a big proponent of it because as of yet, it's the closest I can bring my monitor to the higher refresh rates in the games that I play.

As always, I really REALLY wish people would remember that sites nearly always use 1080p Ultra/Max for testing. Which is ok, but you have to look at it for what it is. For example, I usually get better than 1080 ultra performance even at 3440x1440 Ultrawide, because there are many settings I can live without. I value high refresh, which translates as motion clarity, more than dialing everything to the max on my 2080ti. Hah, besides Q2 RTX, I don't even leave RT enabled. But that logic goes both ways. I live in my own edge case, and what is right for me is not for everyone, and vice versa.

Can 4C/4T work fine for lots of stuff? Yup. Can 4C/4T also be terrible for some? Yup.
That monitor refresh argument always felt pretty bogus to me. I'd completely understand it if, for example, you could drive your monitor at 240 Hz with a 9900KS but only at 144 Hz with a 3700X or 3900X. That's actually a pretty huge difference (I've played CS since 1999, more than 10 years of that competitively, so I'd notice it immediately).
But 230 fps vs 210? You will never notice that. Same with 130 vs 110. You will NEVER notice that. FPS for FPS's sake is just bragging rights on a hall-of-fame toplist, not for your actual gaming experience... You want bragging rights for having 10-20 higher FPS on a screenshot? By all means, the 9900KS is your best CPU ever. Is the only thing you do the most horribly optimized professional software ever (Photoshop)? By all means, the 9900KS is your best CPU ever.
Other than that, it's just a desperate search for theoretical niches.
 

rbk123

Senior member
Aug 22, 2006
705
274
136
FPS for FPS's sake is just bragging rights on a hall-of-fame toplist, not for your actual gaming experience... You want bragging rights for having 10-20 higher FPS on a screenshot?
That's all this thread is. Jana's tired of taking a beating from other gamers with superior CPUs, so he had to lash out with this old and tired argument. It's all he's got, since he doesn't want to cough up the dough to upgrade and save his pride.

Gamers are very insecure; it's why the 9900KS sells so well, regardless of how many times Intel has stuck it to them.
 
  • Like
Reactions: Makaveli and lobz

jpiniero

Diamond Member
Oct 1, 2010
7,929
1,209
126
Depends on the engine, the backend API (i.e. DX12/Vulkan), and whether you are running other things in the background while playing, I guess.

The older APIs (DX11/OpenGL) tend to be more affected by lacking single-thread performance, whereas DX12 and Vulkan (or at least Vulkan?) were made with efficient multithreading in mind from their inception, and tend to benefit more from a higher core count, from what I have gleaned.
The benefit from Vulkan is relieving pressure on the main thread through better efficiency. There's no inherent threading advantage beyond that.
 
  • Like
Reactions: lobz

Makaveli

Diamond Member
Feb 8, 2002
4,065
130
106
That's all this thread is. Jana's tired of taking a beating from other gamers with superior CPUs, so he had to lash out with this old and tired argument. It's all he's got, since he doesn't want to cough up the dough to upgrade and save his pride.

Gamers are very insecure; it's why the 9900KS sells so well, regardless of how many times Intel has stuck it to them.
100% accurate, dude.

Gaming FPS is the last resort of most Intel fanboys; their arguments are no longer about playable vs. non-playable frame rates.

I've seen many create pages of arguments over a 90 vs 82 fps difference. So blinded by their allegiance to team blue that all logic and common sense goes out the window.
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
6,859
158
106
www.teamjuchems.com
I mean, if displayed FPS were all we cared about, bring back CrossFire and SLI!

FPS numbers+++

It's nice that there is a much larger focus on a wider set of benchmarks these days, and that we don't just have those FPS graphs that are in the OP anymore.
 
  • Like
Reactions: Makaveli and lobz

eek2121

Senior member
Aug 2, 2005
445
333
136
I just want to throw in a word of caution to anyone who takes AnandTech's gaming numbers seriously. Don't. They are using a GTX 1080 (non-Ti). They are GPU bottlenecked, and in addition, NVIDIA's drivers make assumptions about certain CPUs they should not. In order for a benchmark to be valid, things like 'multi-threaded optimizations' need to be disabled in the NVIDIA control panel.

I have the utmost respect for AnandTech's reviewers, but until things improve I cannot trust their benchmarks.
 
  • Like
Reactions: CHADBOGA and lobz

Arkaign

Lifer
Oct 27, 2006
20,516
971
126
That monitor refresh argument always felt pretty bogus to me. I'd completely understand it if, for example, you could drive your monitor at 240 Hz with a 9900KS but only at 144 Hz with a 3700X or 3900X. That's actually a pretty huge difference (I've played CS since 1999, more than 10 years of that competitively, so I'd notice it immediately).
But 230 fps vs 210? You will never notice that. Same with 130 vs 110. You will NEVER notice that. FPS for FPS's sake is just bragging rights on a hall-of-fame toplist, not for your actual gaming experience... You want bragging rights for having 10-20 higher FPS on a screenshot? By all means, the 9900KS is your best CPU ever. Is the only thing you do the most horribly optimized professional software ever (Photoshop)? By all means, the 9900KS is your best CPU ever.
Other than that, it's just a desperate search for theoretical niches.
I'm not sure why the hostility. I noticed the upgrade, both from the 8086k, as well as the 3700X. Now the 3700X is overall superior, and still my recommendation outside of purely gaming setups for numerous reasons, but I'm also not the guy running DDR4 2666 and stock clocks. The difference I see is often bigger than 15-20%, which is definitely something you can feel A/B if you are used to it. Due to regular site benchmarks being maxed out ultra settings, you do see more GPU bottlenecking masking some of the gaps.

Is it really worth it? That's in the eye of the beholder, hell it's honestly all just fancy toys if you're talking about PC gaming anyway.

I never said I wanted 'bragging rights', nor called the 9900KS the best CPU ever. That's just kind of bizarre.
 
  • Like
Reactions: ondma

f2bnp

Member
May 25, 2015
156
93
101
Is this a joke thread? Is the OP trolling? Genuinely asking here; even when presented with evidence, the OP refuses to concede, for seemingly no reason at all. I'll bite and provide my own thoughts even if the OP is trolling, since others reading will still get proper info.

As has been said and proven many times in this thread, there are many games that will not perform all that well on a 4 core/4 thread processor.
My personal experience going from a Ryzen 3 1200 to a Ryzen 5 3600, gaming on a 16:10 monitor (1920x1200) with an RX 480 8GB, was that all the games I play stopped exhibiting stutters and average performance improved significantly. Some, if not a lot, of that could be attributed to the low clocks of the Ryzen 3 1200, and to the fact that Zen 2 (on which my newly bought 3600 is based) offers many architectural improvements over Zen 1 and is thus faster clock for clock. Still, certain games did not benefit much even when I clocked my 1200 up to 3.9 GHz.
One such game was Battlefield 1, which I play semi-regularly; I would usually avoid a couple of maps because they destroyed performance on my end (30, maybe 40 fps on average).
Even Rainbow Six Siege, which I thought should have run fantastically on the 1200, performs a lot better now. I used to get weird stutters every now and then that didn't cause major issues, but now that I'm gaming on the 3600, the difference can certainly be felt.

Many single player games are also becoming very CPU demanding, just check out Red Dead Redemption 2 or Assassin's Creed Odyssey.

Also, the insistence on 1080p is baffling. High-end GPUs today are designed with 1440p and 4K in mind; 1080p should be CPU-limited if anything. Wouldn't your argument work slightly better at higher resolutions?
 

rbk123

Senior member
Aug 22, 2006
705
274
136
I never said I wanted 'bragging rights', nor called the 9900KS the best CPU ever. That's just kind of bizarre.
He's just speaking about the majority; there will always be exceptions which you appear to be. The majority are far more like the OP, than you.
But that behavior isn't just limited to Intel gamers. AMD gamers are no different when they end up in the same situation.
 

RetroZombie

Senior member
Nov 5, 2019
464
382
96
This is maybe a little off topic, but I have already given my opinion on this in my very first post here; go there if you want to read it. The summary: the more cores the better. Nowhere in the world will a system with just 4 cores be faster and gain performance over time!

But since the discussion has shifted a bit and this is somehow related, I will add this:
Game benchmark testers normally give you minimum frames and averages. Nobody cares about maximum frames, which are very important for erratic and -Redacted- gameplay, or about the difference between the minimum and maximum frames obtained; of course the difference between those and the average is also very important!

I remember when I had the Radeon 9700 and my friends and family had the Radeon 9600, 9500, GeForce 5600, 5700, ... When playing Half-Life 2, for example, my system was noticeably producing more frames than theirs, but we all noticed that somehow their gameplay was better than mine, because their game performance was noticeably less erratic. I was going from 100 fps to 30 fps or less and could tell, while their systems kept a more stable gameplay experience even though they were getting fewer frames, because the gap between their maximums and minimums was not as high as on my system. I certainly noticed that for the guys with the 9600XT (or something), the game looked and felt better.
Let's say, for example, my Radeon 9700 was at 25 fps min and 100 fps max, and theirs at 25 fps min and 60 fps max.

So for the examples below: not even FreeSync or G-Sync or whatever tech appears solves the problem, though they certainly improve the second example.


Example of how a bad frame rate can still be very enjoyable, since there are no slowdowns or excessive speed-ups:
[attached frame-time graph]

Example of a nice frame rate with many slowdowns or excessive speed-ups, probably less enjoyable than the previous example:
[attached frame-time graph]

Example of a good frame rate, but only for half a second, because for the other half the system stalled:
[attached frame-time graph]
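The point in the post above, that two systems with similar average FPS can feel very different, can be shown with a quick sketch. The frame-time traces here are invented purely for illustration:

```python
# Two hypothetical frame-time traces (milliseconds per frame) with the
# same average FPS: one steady, one alternating fast and stalled frames.
steady = [20.0] * 100          # constant 50 fps
erratic = [5.0, 35.0] * 50     # alternates ~200 fps and ~29 fps

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def spread_ms(frame_times_ms):
    # Max-minus-min frame time: a crude measure of how erratic it feels.
    return max(frame_times_ms) - min(frame_times_ms)

print(round(avg_fps(steady)), round(avg_fps(erratic)))   # 50 50 - identical averages
print(spread_ms(steady), spread_ms(erratic))             # 0.0 30.0 - very different feel
```

Both traces average 50 fps, but only the frame-time spread reveals which one stutters, which is exactly why minimum-vs-maximum gaps matter more than the average alone.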

Profanity is not allowed in the tech forums.

Daveybrat
AT Moderator
 
Last edited by a moderator:

HutchinsonJC

Senior member
Apr 15, 2007
424
155
126
I kind of want to agree that it's silly this made it to 3 pages, and I don't think the OP has participated in the last several posts. However, the info and posts that keep coming are all fairly useful for anyone lurking in the shadows and looking to understand the situation better. It might all be useful for anyone coming upon this thread from a Google search. /shrug
 

dlerious

Senior member
Mar 4, 2004
815
167
116
I just want to throw in a word of caution to anyone who takes AnandTech's gaming numbers seriously. Don't. They are using a GTX 1080 (non-Ti). They are GPU bottlenecked, and in addition, NVIDIA's drivers make assumptions about certain CPUs they should not. In order for a benchmark to be valid, things like 'multi-threaded optimizations' need to be disabled in the NVIDIA control panel.

I have the utmost respect for AnandTech's reviewers, but until things improve I cannot trust their benchmarks.
Are their numbers for a GTX 1080 wrong? Should they only test using the top of the line card?
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
20,325
8,020
136
I kind of want to agree that it's silly this made it to 3 pages, and I don't think the OP has participated in the last several posts. However, the info and posts that keep coming are all fairly useful for anyone lurking in the shadows and looking to understand the situation better. It might all be useful for anyone coming upon this thread from a Google search. /shrug
Yes, it is possible that some may be happy with 4 cores if they play old games at 60 Hz, but the public needs to know that that is a thing of the past.
 

UsandThem

Elite Member
Super Moderator
May 4, 2000
13,070
3,719
146
How the hell this thread with its false premise managed to span 3 pages is beyond me! :oops:
Well, in its defense, it's kind of nice seeing something like this that you don't see very often. It sure beats the millions of the "I'm too [insert random excuse], so build my PC for me" type posts. ;)

Plus, not every post here will make it to The Congressional Archives. :p
 
  • Haha
  • Like
Reactions: lobz and Makaveli

lobz

Golden Member
Feb 10, 2017
1,250
1,245
106
I'm not sure why the hostility. I noticed the upgrade, both from the 8086k, as well as the 3700X. Now the 3700X is overall superior, and still my recommendation outside of purely gaming setups for numerous reasons, but I'm also not the guy running DDR4 2666 and stock clocks. The difference I see is often bigger than 15-20%, which is definitely something you can feel A/B if you are used to it. Due to regular site benchmarks being maxed out ultra settings, you do see more GPU bottlenecking masking some of the gaps.

Is it really worth it? That's in the eye of the beholder, hell it's honestly all just fancy toys if you're talking about PC gaming anyway.

I never said I wanted 'bragging rights', nor called the 9900KS the best CPU ever. That's just kind of bizarre.
There is zero hostility towards you from me. Your posts are always reasonable, even when I don't agree with some of them. Even now I understand your reasoning; I just think it's invalid. That being said, it's not up to me to decide what the 'truth' is :) I stand 100% by what I've said (please read my post again: I also never said that you wanted bragging rights, nor that you thought the 9900KS was the best CPU ever) - but I actually dig your forum presence, I swear I feel no hostility at all :)
 
Last edited:

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
20,325
8,020
136
There is zero hostility towards you from me. Your posts are always reasonable, even when I don't agree with some of them. Even now I understand your reasoning; I just think it's invalid. That being said, it's not up to me to decide what the 'truth' is :) I stand 100% by what I've said (please read my post again: I also never said that you wanted bragging rights, nor that you thought the 9900KS was the best CPU ever) - but I actually dig your forum presence, I swear I feel no hostility at all :)
This is a little off-topic, but I just wanted to point out to the forum that THIS is the way to disagree with someone.
While this post is not moderation, I would like to say as a moderator that this is what we need more of here. This is the way to disagree: with respect!
 

Carfax83

Diamond Member
Nov 1, 2010
5,883
573
126
Well, in its defense, it's kind of nice seeing something like this that you don't see very often. It sure beats the millions of the "I'm too [insert random excuse], so build my PC for me" type posts. ;)

Plus, not every post here will make to The Congressional Archives. :p
I won't deny that it can be an interesting subject to discuss. But after years of current-gen consoles and multicore CPUs being mainstream and forcing developers to change the way they program their software, no one can possibly deny the trend towards increasing parallelism that has occurred at the engine, API, hardware and driver level.

This trend will reach new heights with the next gen of consoles having SMT.
 
  • Like
Reactions: DAPUNISHER and IEC
