Article "How much does the CPU count for gaming?" - The answer is suprising.


AnitaPeterson

Diamond Member
Apr 24, 2001
6,022
561
126
Engadget actually went off the beaten path and pitted a Ryzen 3 3300X against an i9-10900.
Lo and behold, a quad-core budget CPU holds up quite well against a 10-core beast, running on similar specs (MB, RAM, GPU).
The conclusion? "If you're building a gaming PC, unless you're aiming for ultra-high framerates over everything else, you may be better off putting that money towards a better GPU."

 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,024
32,487
146
I subscribe to both YouTubers. I agree with pretty much everything you wrote in this post. It bugs me when realistic hardware configs aren't tested, or at least discussed, in reviews.

Too many YouTube reviewers over-focus on benchmarking. Don't get me wrong, I love Gamers Nexus and Hardware Unboxed as much as the next guy, but it's nice to have someone who shares the actual user experience that doesn't always show up on charts.

The only thing that bugs me about Tech Deals is that he doesn't have a clue about motherboard power delivery and is obsessed with putting 32GB of RAM in everything. His awkwardness in live streams makes me cringe sometimes too. He seems like a nice guy with a nice family, though. I hope his wife's PC knowledge gets better. His streams would be better if someone were able to challenge some of his takes.
Tech Deals is mostly for n00bs. I do not watch most of his content. When he does the gameplay vids, like the Breakpoint one, is when I watch. He will put 60 hours into the game, then tell you that the 3300X plays it well enough, but not as well as he likes. That is the pearl I am looking for. Sure, frame-time analysis is great, you can see when there are issues, but the commentary is what cements it for me. When he says it is buttery smooth on the next chip up, aka the 3600, and to spend the extra cash for it if you can find it, that is good advice. If you can't, or that money is earmarked for a better GPU, cool. But if I stopped at some of the reviews I read on the 3300X, it looks like it is a better deal than the 3600, based on how they test. Not cool.
 
  • Like
Reactions: Rigg

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
Eurogamer has a great review of the 10900K; they were using 3600CL16 memory for both AMD and Intel, and a 2080 Ti.

Gaming results are here.

I think the results are eye-opening. Even at 1440p Intel has a ~20% advantage over AMD's best, and a 4C CPU like the 3300X looks horribad in some games (clearly 4C, even in the forum-darling config, does not work anymore in high-end gaming). A 20% gap between AMD and Intel is like paying for a 2080 Ti but getting 2080 performance instead.

Today's high-end gaming will be GTX 4060 territory after two gens, but obviously current 3950X owners will be sporting a Zen 4 5950X and a 4080 Ti by then?
 

beginner99

Diamond Member
Jun 2, 2009
5,318
1,763
136
Eurogamer has a great review of the 10900K; they were using 3600CL16 memory for both AMD and Intel, and a 2080 Ti.

Gaming results are here.

What crap site is this? It forces you to turn on ad-tracking cookies to see the actual content, e.g. the charts, which I won't do out of principle. Not sure if this is actually allowed under GDPR.
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
What crap site is this? It forces you to turn on ad-tracking cookies to see the actual content, e.g. the charts, which I won't do out of principle. Not sure if this is actually allowed under GDPR.

No idea; as a European I am already used to all that cookie and GDPR BS, I just click "agree to sell your soul".
They are a respected gaming-oriented site, and the Digital Foundry videos about game engines and visuals are probably the best.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,024
32,487
146
What crap site is this? It forces you to turn on ad-tracking cookies to see the actual content, e.g. the charts, which I won't do out of principle. Not sure if this is actually allowed under GDPR.
Their results had me confused for a moment; I was asking myself why they did not sync with others I have seen. The answer? Stock Prism cooler on all Ryzens, Gamer Storm Castle 240mm AiO for all Intel = I throw out your entire review. I am a fan of the DF YT channel, but this was unacceptable.
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
Their results had me confused for a moment; I was asking myself why they did not sync with others I have seen. The answer? Stock Prism cooler on all Ryzens, Gamer Storm Castle 240mm AiO for all Intel = I throw out your entire review. I am a fan of the DF YT channel, but this was unacceptable.
At least we are starting to realize this now. It was a big issue with the initial Ryzen 1000 reviews. Most reputable reviewers are good about it now. But I was shocked at how many places were cool with using the OEM cooling on Ryzens, because it was the cooler that came with it, but any old cooler (always a 240mm AIO or larger) with Intel, because they didn't provide one.
 
  • Like
Reactions: lightmanek

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
The answer? Stock Prism cooler on all Ryzens, Gamer Storm Castle 240mm AiO for all Intel = I throw out your entire review.

So basically everyone using the stock AMD cooler is losing ~10% of gaming performance, ouch :) And gaming is not exactly saturating the cores; MT loads are probably massacring it. Something to keep in mind when comparing Intel vs AMD in the future.

I have looked around for other reviews with DRAM at 3600; they come from less reputable sources like YT channels, but the performance difference is still there, 10+%. So there is some truth to the DF video, but unfortunately it is tainted by the poor cooling and GDPR issues.

DF responded to the cooler complaints in the comments, but I think it does not hold up, as other reviews show less of an advantage for Intel at 3600 RAM speed:

With regards to the cooler, the Wraith Prism is a damn good cooler (if a little loud at max) and should offer roughly equivalent performance to our 240mm AiO given that both chips are operating at stock frequencies - we didn't see any thermal throttling from our AMD CPUs in our cool ambient temps, so the default power limits should be the limiting factor here. We also note several times that this is a big downside for Intel, as their stock coolers either don't exist or kinda suck so you do need to get a third-party option, whether that's air or water. This is why we mention several times that we prefer AMD's ecosystem/approach. However, we accept that the test could be more apples-to-apples and we will improve this for the future to remove different coolers as a possible factor.
 
Last edited:
  • Like
Reactions: Gideon

Gideon

Platinum Member
Nov 27, 2007
2,030
5,035
136
Their results had me confused for a moment; I was asking myself why they did not sync with others I have seen. The answer? Stock Prism cooler on all Ryzens, Gamer Storm Castle 240mm AiO for all Intel = I throw out your entire review. I am a fan of the DF YT channel, but this was unacceptable.
They also don't seem to use custom memory timings (which would benefit Ryzen slightly more).

IMO not using the same cooler is totally unacceptable; using a mid-range air cooler on Intel would definitely have reduced the scores, while using an AIO on Ryzen would barely have moved the needle (so why not use an AIO on both?).

One can argue that an AIO would be a no-brainer for a 10900K, but for a 10600K it's certainly not the case (you'd probably be better off with a 3600XT and a better graphics card).

Regardless, what the results do show is that if you have money for a 10600K+ and a decent AIO (+$150), you'll get 15-20% more frames than with any AMD setup, even at 1440p. That's a relevant lead (particularly once newer GPUs are released, making 2080 Ti-class performance more attainable); a rough cost-per-frame sketch follows at the end of this post.

Sure, you can nullify Intel's advantage by upping the resolution to 4K, but considering the performance drop, using 1440p with some high-end upsampling (DLSS 2.0 level) might be the smarter choice.

Can't wait to see what the XT models and Cezanne bring to the table (and of course Rocket Lake).
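A rough value-for-money sketch of that trade-off (only the $150 premium and the 15-20% range come from the comparison above; treat everything else as a placeholder):

```python
# Rough value-for-money sketch for the "+$150 on CPU + AIO for 15-20%
# more frames" argument above. Only the $150 premium and the 15-20%
# range come from the post; plug in your own GPU numbers to compare.
def dollars_per_extra_percent(extra_cost_usd, perf_gain_pct):
    """Cost of each additional percent of average frame rate."""
    return extra_cost_usd / perf_gain_pct

for gain_pct in (15, 20):
    cost = dollars_per_extra_percent(150, gain_pct)
    print(f"{gain_pct}% uplift -> ${cost:.2f} per extra percent")
# Compare against the $-per-% of stepping up a GPU tier before deciding
# where the money goes.
```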
 

Gideon

Platinum Member
Nov 27, 2007
2,030
5,035
136
So basically everyone using the stock AMD cooler is losing ~10% of gaming performance, ouch :) And gaming is not exactly saturating the cores; MT loads are probably massacring it. Something to keep in mind when comparing Intel vs AMD in the future.

The difference is that large? :O

IMO not using an AIO can certainly regress Intel's scores by that amount, but will it truly help AMD that much? (I had the understanding that AMD is mostly limited by memory latency.)
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
I have looked around for other reviews with DRAM at 3600; they come from less reputable sources like YT channels, but the performance difference is still there, 10+%. So there is some truth to the DF video, but unfortunately it is tainted by the poor cooling and GDPR issues.
Kitguru uses 3600CL16 RAM and the same cooler, and they're reputable.
10900 OC'd to 5.2 GHz vs 3900X OC'd to 4.25 GHz, with real coolers.
So giving Intel the overclocking advantage too.
Even at 1080p the lead was nowhere near 20% on average, once you put a real cooler on both chips.
 
  • Like
Reactions: lobz and lightmanek

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,024
32,487
146
Kitguru uses 3600CL16 RAM and the same cooler, and they're reputable.
10900 OC'd to 5.2 GHz vs 3900X OC'd to 4.25 GHz, with real coolers.
So giving Intel the overclocking advantage too.
Even at 1080p the lead was nowhere near 20% on average, once you put a real cooler on both chips.
I have seen some impressive 1% low advantages with the new Intel, over 30% in some games. But when you average out all the games tested, you get pedestrian single-digit gains. The rest is hyperbole.
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Eurogamer has a great review of the 10900K; they were using 3600CL16 memory for both AMD and Intel, and a 2080 Ti.

Gaming results are here.

I think the results are eye-opening. Even at 1440p Intel has a ~20% advantage over AMD's best, and a 4C CPU like the 3300X looks horribad in some games (clearly 4C, even in the forum-darling config, does not work anymore in high-end gaming). A 20% gap between AMD and Intel is like paying for a 2080 Ti but getting 2080 performance instead.

Today's high-end gaming will be GTX 4060 territory after two gens, but obviously current 3950X owners will be sporting a Zen 4 5950X and a 4080 Ti by then?
This is odd. I looked at this on my phone and the difference in mean average FPS at 1440p was <20% in all tests, and 11% on average, even with the stock cooler.
Can you post an image of what stats you're seeing? It doesn't make sense that it's showing much lower differences for me.
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
This is odd. I looked at this on my phone and the difference in mean average FPS at 1440p was <20% in all tests, and 11% on average, even with the stock cooler.
Can you post an image of what stats you're seeing? It doesn't make sense that it's showing much lower differences for me.


Well, for example in Odyssey:
[Attached chart screenshot: 1590586461118.png]


18FPS advantage is 21.17% ?
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136

Anyone know if H and B boards will allow the uncapped power? I really think he stopped short of doing the review properly. If all the B and H boards will not allow uncapped PL2 (or run it that way by default), I wonder how it would have performed. Weird that he is the first to bring this up, at least from what I have seen.
 
  • Like
Reactions: lightmanek

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Well, for example in Odyssey:
View attachment 21699


18FPS advantage is 21.17% ?
It appears you don't know what you're looking at.

The numbers you circled are the instantaneous FPS at time = 0 in the test video. If you hit play on the video, you will note that the FPS changes as the video progresses. The actual average FPS are in the chart below the video's real-time numbers, where it says, "Overall Frame-Rate (FPS)".

FWIW, the mean average FPS at 1440p on ACO for 10900K is 76.3, and for 3900X is 69.0.

And the overall difference between a 10900K with an AIO, and a 3900X with stock cooler, is 11% at 1440p.

And yes, that is an otherwise informative and reputable site, outside of the ridiculous ads and the fact that they aren't keeping as many variables stable as possible.

;)
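To make the arithmetic explicit, here is a quick sketch (the 103/85 pair is only what the earlier 18 FPS / 21.17% reading implies, not a value taken from the chart):

```python
# Quick check of the percentage figures being discussed in this thread.
def pct_lead(fps_a, fps_b):
    """How much faster fps_a is than fps_b, in percent."""
    return (fps_a / fps_b - 1) * 100

# Eurogamer 1440p ACO averages quoted above: 10900K 76.3 FPS vs 3900X 69.0 FPS
print(round(pct_lead(76.3, 69.0), 1))   # ~10.6 -> the "11% at 1440p" figure

# The earlier "18 FPS = 21.17%" reading implies instantaneous values of roughly 103 vs 85
print(round(pct_lead(103, 85), 1))      # ~21.2
```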
 
Last edited:

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
They can be. You are still restricted to the per core multiplier limits.
Okay, so in that sense they basically have an upper limit; it's not based on power limits, but it will never go above the max turbo values Intel sets, because they don't allow the multiplier to go any higher, correct?

Wonder if anybody has power numbers. While it doesn't make a difference on desktop chips IMHO, I do wonder what the effect is on cooling requirements. Would the performance of the 10400 be maintained with the OEM cooler (whereas the 3600 can manage it, if a bit noisily, as it tops out at 85W-ish)? Comparing CPUs has gotten so much harder as of late. What defaults the mobos are offering, cooling, memory choices... any variance seems to blow up a review.
 

jpiniero

Lifer
Oct 1, 2010
16,816
7,258
136
Okay, so in that sense they basically have an upper limit; it's not based on power limits, but it will never go above the max turbo values Intel sets, because they don't allow the multiplier to go any higher, correct?

Yep. So on the 10400 you won't get more than 4 GHz on all cores, but you should be able to have it boost to that indefinitely.
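As a rough illustration of why the power limit matters here, a very simplified model of the PL1/PL2/tau behaviour; the wattages and the fallback behaviour below are illustrative assumptions, not measured 10400 numbers:

```python
# Very simplified, illustrative model of Intel's PL1 / PL2 / tau limits
# versus the all-core turbo cap being discussed. The load power below is
# a hypothetical placeholder, and real chips use an exponentially
# weighted moving average of package power rather than a hard cutoff.

def sustained_clock_ghz(pl1_w, pl2_w, tau_s,
                        all_core_turbo_ghz, base_ghz,
                        load_power_w, elapsed_s):
    """Rough sketch: hold all-core turbo while the power budget allows,
    then scale clocks back toward whatever fits inside PL1."""
    if load_power_w <= pl1_w:
        # Power limits never bite: the only ceiling is the turbo multiplier.
        return all_core_turbo_ghz
    if elapsed_s <= tau_s and load_power_w <= pl2_w:
        return all_core_turbo_ghz      # still inside the PL2 / tau window
    # After tau, assume clocks scale roughly with the PL1 budget.
    return max(base_ghz, all_core_turbo_ghz * pl1_w / load_power_w)

# 10400-style numbers: 65 W PL1, 134 W PL2, 28 s tau, 4.0 GHz all-core
# turbo, 2.9 GHz base, and an assumed ~75 W draw under an all-core load.
print(round(sustained_clock_ghz(65, 134, 28, 4.0, 2.9, 75, elapsed_s=120), 2))   # stock limits: drops below 4.0
print(round(sustained_clock_ghz(999, 999, 28, 4.0, 2.9, 75, elapsed_s=120), 2))  # limits lifted: holds 4.0 indefinitely
```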
 
  • Like
Reactions: lightmanek

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Comparing CPU's has gotten so much harder as of late. What defualts the mobo's are offering, cooling, memory choices. Any variance seems to blow up a review.
Agree.

I respect AT's stance of using only the actual manufacturer default settings. The out-of-the-box experience is all you can really control. Though pursuing the highest settings attainable within the warranty envelope, as well as the highest settings attainable period, would be useful too.

That's why I do like Kitguru, TPU, Techspot w/r/t decent overclocking results, and they also seem to keep most of the stuff standardized otherwise.

The variability between the RAM, GPU (1080 for AT, 2080 for other sites, etc.), and even mobo might create more differences between two reviews than anticipated (cf. TPU's reviews of X570 boards: up to 10% difference between X570 and X470 in wPrime 1024 and AIDA64 latency, up to 20% difference in CB20, 15-20% in Blender, up to 5-10% in gaming, differences in memory clocks achieved, etc.). So a review of a 3700X at stock in an X470 board might clip gaming performance by 5-10% compared to a review placing it in an X570 board. Really interesting to think about the next time you read a review - make sure you find ones with test setups similar to the one you're planning.
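As a back-of-the-envelope illustration of how those setup differences stack up (the individual percentages are just picked from the ranges above, not measurements):

```python
# Back-of-the-envelope: small platform-level deltas compound
# multiplicatively, which alone can explain a double-digit gap between
# two "stock" reviews of the same CPU. The percentages are illustrative,
# picked from the ranges mentioned above.
from math import prod

def combined_delta_pct(deltas_pct):
    """Combine independent percentage deltas multiplicatively."""
    return (prod(1 + d / 100 for d in deltas_pct) - 1) * 100

# e.g. X570 vs X470 board (+5%), higher achieved memory clock (+3%),
# faster GPU in a partially GPU-bound test (+4%)
print(round(combined_delta_pct([5, 3, 4]), 1))   # ~12.5% apparent gap from setup alone
```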
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
It appears you don't know what you're looking at.

Good point, I have been humbled. The way that chart works, it did not let me select the 1440p CPUs for comparison on the graph (since some were selected on 1080p), and I thought it was some sort of summary for that resolution :)

FWIW, the mean average FPS at 1440p on ACO for 10900K is 76.3, and for 3900X is 69.0.

And the overall difference between a 10900K with an AIO, and a 3900X with stock cooler, is 11% at 1440p.

11% is about expected in that setup; I doubt cooling impacts it that much. Higher-end tuning with 1900 FCLK and 3800CL14-ish with tuned timings would shrink that gap even further, probably to 5% or so.
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Good point, I have been humbled. The way that chart works, it did not let me select the 1440p CPUs for comparison on the graph (since some were selected on 1080p), and I thought it was some sort of summary for that resolution :)

11% is about expected in that setup; I doubt cooling impacts it that much. Higher-end tuning with 1900 FCLK and 3800CL14-ish with tuned timings would shrink that gap even further, probably to 5% or so.
360mm AIO vs Wraith Prism doesn't impact the 3900X much if at all.
We don't know how much it would impact the 10900K if we put the Prism (or equivalent) on it instead of an AIO.
 

ondma

Diamond Member
Mar 18, 2018
3,310
1,697
136
Well, for example in Odyssey:
View attachment 21699


18FPS advantage is 21.17% ?
Agree.

I respect AT's stance of using only the actual manufacturer default settings. The out-of-the-box experience is all you can really control. Though pursuing the highest settings attainable within the warranty envelope, as well as the highest settings attainable period, would be useful too.

That's why I do like Kitguru, TPU, Techspot w/r/t decent overclocking results, and they also seem to keep most of the stuff standardized otherwise.

The variability between the RAM, GPU (1080 for AT, 2080 for other sites, etc.), and even mobo might create more differences between two reviews than anticipated (cf. TPU's reviews of X570 boards: up to 10% difference between X570 and X470 in wPrime 1024 and AIDA64 latency, up to 20% difference in CB20, 15-20% in Blender, up to 5-10% in gaming, differences in memory clocks achieved, etc.). So a review of a 3700X at stock in an X470 board might clip gaming performance by 5-10% compared to a review placing it in an X570 board. Really interesting to think about the next time you read a review - make sure you find ones with test setups similar to the one you're planning.
I pretty much ignore AT's gaming reviews. Sticking with supported memory speeds makes sense for locked chips, but certainly anyone who buys a K CPU and motherboard is going to use fast RAM. And using a 1080 (not even a Ti) as the GPU? Absurd.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Kitguru uses 3600CL16 RAM and the same cooler, and they're reputable.
10900 OC'd to 5.2 GHz vs 3900X OC'd to 4.25 GHz, with real coolers.
So giving Intel the overclocking advantage too.
Even at 1080p the lead was nowhere near 20% on average, once you put a real cooler on both chips.
They are also only testing 5 games and are using the 2080 Super instead of the Ti, and the OC they did is pretty useless and even often hurts performance, since they "overclock" below the top single-core boost.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
Good point, I have been humbled. The way that chart works, it did not let me select the 1440p CPUs for comparison on the graph (since some were selected on 1080p), and I thought it was some sort of summary for that resolution :)

11% is about expected in that setup; I doubt cooling impacts it that much. Higher-end tuning with 1900 FCLK and 3800CL14-ish with tuned timings would shrink that gap even further, probably to 5% or so.
What it impacts is the price difference, which can easily be the difference that buys better RAM or a better GPU.
Also, if you don't buy the cheapest RAM, tuning the timings is free.