Question Significant gains with extra threads/cores (1080p): myth or reality?


jana519

Senior member
Jul 12, 2014
There's a popular myth that extra threads/cores will result in significant performance gains for gaming at 1080p.

I present to you exhibit A:

[Attached: three benchmark charts, one per game, showing 1080p results]


Here we see, from none other than our own Ian Cutress, that in 3 games benchmarked at 1080p, there is less than a 2% performance gain realized from 4 additional threads.

Based on this data, it's ridiculous to recommend extra threads/cores for the average PC gamer. We, as a community, need to stop propagating this myth.

I look forward to your vigorous and data based rebuttals.
 

ondma

Platinum Member
Mar 18, 2018
I long ago gave up on Anand's gaming reviews. The worst in the business. Nobody is saying the numbers for a 1080 are "wrong"; they're just incomplete and not relevant to current top-end hardware.
 

jpiniero

Lifer
Oct 1, 2010
I long ago gave up on Anand's gaming reviews. The worst in the business. Nobody is saying the numbers for a 1080 are "wrong"; they're just incomplete and not relevant to current top-end hardware.

It would be nice if they gave the Intel chips 3200 memory; it's not like anyone with a Z board won't be able to use it, and it's not like overclocking, where there's some variance in what's possible.
 

dlerious

Golden Member
Mar 4, 2004
I long ago gave up on Anand's gaming reviews. The worst in the business. Nobody is saying the numbers for a 1080 are "wrong"; they're just incomplete and not relevant to current top-end hardware.
I used to hate it when reviewers used the top-of-the-line card while I was buying x70 (or lower) level cards. I wouldn't be surprised if 75% or more of the cards sold are in that range. I usually visit multiple sites when looking at reviews - for everything.
 

dlerious

Golden Member
Mar 4, 2004
It would be nice if they gave the Intel chips 3200 memory; it's not like anyone with a Z board won't be able to use it, and it's not like overclocking, where there's some variance in what's possible.
I think that has to do with JEDEC standards. Anything above 2666 on Intel has to use XMP, while on AMD it's anything above 3200.
 

ondma

Platinum Member
Mar 18, 2018
I used to hate it when reviewers used the top-of-the-line card while I was buying x70 (or lower) level cards. I wouldn't be surprised if 75% or more of the cards sold are in that range. I usually visit multiple sites when looking at reviews - for everything.
I understand that, but clearly test sites can't match hardware to every user. I'm not sure I trust them anymore either, but game.gpu has tables where you can match different CPUs and video cards.

In regards to Anand's tests, I just think that if a test site is testing a top-of-the-line, recently released CPU, they should also pair it with a top-end current GPU and the fastest RAM the system can use.
 

Elfear

Diamond Member
May 30, 2004
I understand that, but clearly test sites can't match hardware to every user. I'm not sure I trust them anymore either, but game.gpu has tables where you can match different CPUs and video cards.

In regards to Anand's tests, I just think that if a test site is testing a top-of-the-line, recently released CPU, they should also pair it with a top-end current GPU and the fastest RAM the system can use.

I agree that it's good to see CPU benchmarks with the GPU bottleneck removed as much as possible. It would be great if a couple of different GPUs could be tested, though, since it's a very small niche of the gaming world that uses a 2080 Ti.
 

DrMrLordX

Lifer
Apr 27, 2000
Do we have any statistics on how many gamers run JEDEC vs. XMP?

If available aftermarket DIMMs are any indicator, it's pretty rare for DIY builders to use JEDEC timings on their RAM. Most kits you can buy from Newegg (et al.) only have JEDEC timings for DDR4-2133 programmed into them. That's been the case with both my DDR4-3733 and DDR4-4400. Everything else is XMP. Maybe prebuilts are different, but I wouldn't count on it. For example, JEDEC DDR4-3200 is something like CAS/CL 20, but the most common DDR4-3200 kits are CAS/CL 16 via XMP. I don't think even OEMs would provide DDR4-3200 with timings that high.
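To put numbers on that, first-word latency in nanoseconds is the CAS cycle count times the cycle time, where the memory clock is half the DDR transfer rate. A quick sketch using the CL values from the post above (JEDEC CL20 vs. a typical XMP CL16 kit):

```python
# First-word latency for a DDR kit: CAS cycles * cycle time.
# The memory clock (MHz) is half the DDR transfer rate (MT/s),
# since DDR transfers twice per clock.
def cas_latency_ns(mt_per_s: float, cas: int) -> float:
    clock_mhz = mt_per_s / 2
    return cas * 1000 / clock_mhz  # 1000/clock_mhz = ns per cycle

jedec = cas_latency_ns(3200, 20)  # JEDEC DDR4-3200, CL20
xmp = cas_latency_ns(3200, 16)    # common XMP DDR4-3200, CL16

print(f"JEDEC DDR4-3200 CL20: {jedec:.2f} ns")  # 12.50 ns
print(f"XMP   DDR4-3200 CL16: {xmp:.2f} ns")    # 10.00 ns
```

At the same transfer rate, the XMP kit's CL16 shaves 20% off the first-word latency relative to the JEDEC CL20 profile.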
 


mopardude87

Diamond Member
Oct 22, 2018
They should have included BF4 results; that one would be valid, I think, without argument. I had a weird experience "upgrading" to an i5 8400T I got cheap for an HTPC build. Not so much weird as surprising: the 8400T, with its 3 GHz all-core boost, still delivered close to a 40 fps increase in minimums. The i5 4670 (non-K) before it was 800 MHz faster.

Benching my old 4670 against a friend's 3770 back in the day with his 1070, I got about a 10 fps bump with the 3770.
 

DrMrLordX

Lifer
Apr 27, 2000
I'm using DDR3-2400 11-13-13 on this computer (4790K). I hope the memory makers can tweak the 3200 chips the same way they've done with the 2133 chips.

Remember that timings are relative to clockspeed (and DDR3 isn't identical in function to DDR4, so they can't always be compared directly). There are plenty of folks out there with DDR4-3200 14-14-14-28, which is infinitely preferable to DDR3-2400 10-whatever (I used to run DDR3-2400 10-12-13-32). I would say DDR4 has improved considerably on clockspeed, bandwidth, and latency versus DDR3.
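The "timings are relative to clockspeed" point can be made concrete: a raw CL number only means something once converted to nanoseconds. A small sketch using the two kits mentioned in the post (first-word latency = CAS cycles at half the MT/s rate):

```python
# Convert a CAS timing into absolute first-word latency (ns).
# The memory clock (MHz) is half the DDR transfer rate (MT/s).
def cas_latency_ns(mt_per_s: float, cas: int) -> float:
    return cas * 1000 / (mt_per_s / 2)

ddr3 = cas_latency_ns(2400, 10)  # DDR3-2400 CL10, as in the post
ddr4 = cas_latency_ns(3200, 14)  # DDR4-3200 CL14

# CL14 sounds far "slower" than CL10, but in absolute time the two
# are within about 5% of each other -- while the DDR4 kit moves
# a third more data per second.
print(f"DDR3-2400 CL10: {ddr3:.2f} ns")  # 8.33 ns
print(f"DDR4-3200 CL14: {ddr4:.2f} ns")  # 8.75 ns
```

This is why comparing CL figures across generations is misleading: the cycle the timing is counted in keeps shrinking as transfer rates climb.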
 

dlerious

Golden Member
Mar 4, 2004
Remember that timings are relative to clockspeed (and DDR3 isn't identical in function to DDR4, so they can't always be compared directly). There are plenty of folks out there with DDR4-3200 14-14-14-28, which is infinitely preferable to DDR3-2400 10-whatever (I used to run DDR3-2400 10-12-13-32). I would say DDR4 has improved considerably on clockspeed, bandwidth, and latency versus DDR3.
My 3200 FlareX is 14-14-14-34. I guess my original post wasn't clear; I didn't intend to imply that DDR4-3200 chips could only be C22 or higher. I was thinking more along the lines of what the JEDEC standard allows: you'd need to use XMP settings for anything outside the spec, plug-and-play wise, as opposed to manually adjusting the settings in the BIOS, if that makes sense.
 

DrMrLordX

Lifer
Apr 27, 2000
My 3200 FlareX is 14-14-14-34. I guess my original post wasn't clear; I didn't intend to imply that DDR4-3200 chips could only be C22 or higher. I was thinking more along the lines of what the JEDEC standard allows: you'd need to use XMP settings for anything outside the spec, plug-and-play wise, as opposed to manually adjusting the settings in the BIOS, if that makes sense.

Oh okay. Yeah, JEDEC specs for DDR4-3200 are pretty terrible. Not really sure why JEDEC went so far off the rails when DDR4-3200 CAS/CL16 is trivial to achieve, even with "bad" older Hynix RAM (read: pre-CJR Hynix). DDR4 has come a long way since we first saw it in Skylake systems years ago. JEDEC just doesn't seem to acknowledge that fact.
 

Gideon

Golden Member
Nov 27, 2007
Oh okay. Yeah, JEDEC specs for DDR4-3200 are pretty terrible. Not really sure why JEDEC went so far off the rails when DDR4-3200 CAS/CL16 is trivial to achieve, even with "bad" older Hynix RAM (read: pre-CJR Hynix). DDR4 has come a long way since we first saw it in Skylake systems years ago. JEDEC just doesn't seem to acknowledge that fact.
It's trivial at 1.35 V; JEDEC only specifies 1.2 V.

I guess the JEDEC 3200 standard (with CL 20-22) is mostly meant for servers, where it has more relevance (power draw is important). You'd still want the higher clock, especially on Rome with its Infinity Fabric, even when the actual memory latency remains the same.