Intel Shows That Their 9th Gen Core CPU Lineup Is Faster Than AMD Ryzen 3000 In Everything Except Cinebench


Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,541
14,495
136
Everything depends on the perspective and optimization.

In the case of Overwatch, and Blizzard games in general, an equally clocked part from Intel will behave much, much better than an equally clocked CPU from AMD that has twice the number of threads. In Overwatch, the 8400/9400F are not competing with AMD's 3600/3600X. They are competing with the 3800X, and even higher.

Lightly threaded games. It's just that.
So in this one game, representative of all Blizzard games, the slowest AMD CPU tested did 214 fps compared to 253 fps for the fastest Intel. You think without an FPS counter any normal human could even tell the difference? We used to want 60 fps minimums, and the slowest AMD does 125 fps.

I have no idea how you can justify saying "much, much better" when it's not even possible to tell the difference.
 

Glo.

Diamond Member
Apr 25, 2015
5,704
4,548
136
So in this one game, representative of all Blizzard games, the slowest AMD CPU tested did 214 fps compared to 253 fps for the fastest Intel. You think without an FPS counter any normal human could even tell the difference? We used to want 60 fps minimums, and the slowest AMD does 125 fps.

I have no idea how you can justify saying "much, much better" when it's not even possible to tell the difference.
If you want to have 240 Hz experience, yes, it is and will make a difference.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,541
14,495
136
If you want to have 240 Hz experience, yes, it is and will make a difference.
Do you have any benchmarks to prove that in Overwatch? Or any other Blizzard game? How is it that across the average of MANY games the difference is 5%, yet you say that it is much, much better?
 

Glo.

Diamond Member
Apr 25, 2015
5,704
4,548
136
Do you have any benchmarks to prove that in Overwatch? Or any other Blizzard game? How is it that across the average of MANY games the difference is 5%, yet you say that it is much, much better?
Every single time, I EXPLICITLY state that this is the case for Overwatch/Blizzard games, or in that context. And nothing else.

Why is it so offensive to your world view that in this particular case, which is Blizzard games, someone says Intel provides a much, much better experience? Confirmation bias?

Also, going from a 140 FPS average in this particular case to 155 FPS, with just a change of CPU, is over 10%. It is a much, much better experience, don't you think?

You can afford higher fidelity in the game this way, for less.
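
For what it's worth, the arithmetic behind that claim is easy to check. A minimal sketch in Python, using only the 140 and 155 FPS averages quoted above (the figures are this post's claims, not independent measurements):

# Percentage uplift from swapping CPUs, using the averages quoted above.
old_fps = 140.0  # 6C/12T Ryzen average, as quoted
new_fps = 155.0  # Intel average, as quoted
uplift = (new_fps - old_fps) / old_fps * 100
print(f"Uplift: {uplift:.1f}%")  # -> Uplift: 10.7%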
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,446
20,439
146
Every single time, I EXPLICITLY state that this is the case for Overwatch/Blizzard games, or in that context. And nothing else.

Why is it so offensive to your world view that in this particular case, which is Blizzard games, someone says Intel provides a much, much better experience? Confirmation bias?

Also, going from a 140 FPS average in this particular case to 155 FPS, with just a change of CPU, is over 10%. It is a much, much better experience, don't you think?

You can afford higher fidelity in the game this way, for less.
I am with the other member; calling 10 percent "much, much better" is hyperbole.

And were any of those results you linked in a Linux distro? Since that is what you will be playing in, only those results are relevant.

And great job defending Intel's honor, in a thread that is pointing out how shamelessly sleazy they are being....again. Each time they get murked by AMD, they start these shenanigans. But here you are to tell us how your edge case scenario somehow lessens the sleaziness? Honestly, I have no idea where you are going with this, so that is just a guess. Are you defending their latest nonsense, just some of it, or just being the contrarian? Because whatever it is, it has not altered my negative opinion of Intel at the moment, over how low they will go. They could beat Hermes Conrad in a Limbo contest at this point.
 

rbk123

Senior member
Aug 22, 2006
743
345
136
Why is it so offensive to your world view that in this particular case, which is Blizzard games, someone says Intel provides a much, much better experience? Confirmation bias?

Also, going from a 140 FPS average in this particular case to 155 FPS, with just a change of CPU, is over 10%. It is a much, much better experience, don't you think?

Not offensive, just disagree. It's the same as my point - a 10% improvement is nice, but small. Certainly not "much, much better" or "run circles around". It's all relative; 50% better would be something more in line with either of those statements.

The AMD fans are guilty at times as well when describing certain areas where AMD outperforms Intel. Just trying to make things less emotional and more factual.
 
  • Like
Reactions: Glo.

Hitman928

Diamond Member
Apr 15, 2012
5,243
7,790
136
Everything depends on the perspective and optimization.

In the case of Overwatch, and Blizzard games in general, an equally clocked part from Intel will behave much, much better than an equally clocked CPU from AMD that has twice the number of threads. In Overwatch, the 8400/9400F are not competing with AMD's 3600/3600X. They are competing with the 3800X, and even higher.

Lightly threaded games. It's just that.

The head to head benchmarks don't support your statement though. Both head to head benchmarks posted in this thread show Ryzen CPUs performing better than Intel CPUs in fps/MHz. We don't have the exact CPUs in question in head to head benchmarks, but what we do have shows that a lower clocked Intel CPU (like an 8400 or 9400F) would have a hard time outperforming even the lowest Ryzen 3000 CPU to any significant degree. The only way I could see this not being true is if there's a bug in the game right now that affects only Ryzen 3000 CPUs with 6 cores.

You seem to be coming to this conclusion by taking 2 different gameplay videos from 2 completely different sessions, not even on the same map, and then trying to compare performance. There's so little control and so much room for error there that this is not how to do it. Again, the head to head benchmarks provided show Ryzen CPUs doing better MHz for MHz against Intel CPUs. The Intel CPUs need a significant clock speed advantage in order to have a higher frame rate.

BTW, from your comparison of the different videos you posted, the 2700X would seem to have a 28% performance lead over the 3600X. Does anyone believe this is accurate when the 3600X is the better gaming CPU? (I looked at the wrong number for the 2700X here, but I stand by the rest of the post.)
 
Last edited:

Glo.

Diamond Member
Apr 25, 2015
5,704
4,548
136
I am with the other member; calling 10 percent "much, much better" is hyperbole.

And were any of those results you linked in a Linux distro? Since that is what you will be playing in, only those results are relevant.

And great job defending Intel's honor, in a thread that is pointing out how shamelessly sleazy they are being....again. Each time they get murked by AMD, they start these shenanigans. But here you are to tell us how your edge case scenario somehow lessens the sleaziness? Honestly, I have no idea where you are going with this, so that is just a guess. Are you defending their latest nonsense, just some of it, or just being the contrarian? Because whatever it is, it has not altered my negative opinion of Intel at the moment, over how low they will go. They could beat Hermes Conrad in a Limbo contest at this point.
Finally, one of the most interesting questions.

Yes, there is basically no difference between Linux and Windows, at least in most cases and most games. There are games that perform worse on Linux regardless of GPU brand, and there are some cases where games perform BETTER under Linux than on Windows. That is the one benefit of Nvidia's proprietary GPU drivers: they optimize them, and it shows.

It's funny that one can take something out of context during a discussion and draw a picture out of it. Did you miss the post in which I explicitly said that Intel has no competition in lower-end CPUs, and mobile ones, because in those brackets AMD cannot compete with Intel? And what about those? Doesn't Intel have a point in saying that in some use cases they are still relevant?

I was all for AMD. I planned on buying AM4 with a Ryzen 5 3600, until I saw the benchmarks from Overwatch and compared the platform's TCO with Intel's for my specific use case.

The head to head benchmarks don't support your statement though. Both head to head benchmarks posted in this thread show Ryzen CPUs performing better than Intel CPUs in fps/MHz. We don't have the exact CPUs in question in head to head benchmarks, but what we do have shows that a lower clocked Intel CPU (like an 8400 or 9400F) would have a hard time outperforming even the lowest Ryzen 3000 CPU to any significant degree. The only way I could see this not being true is if there's a bug in the game right now that affects only Ryzen 3000 CPUs with 6 cores.

You seem to be coming to this conclusion by taking 2 different gameplay videos from 2 completely different sessions, not even on the same map, and then trying to compare performance. There's so little control and so much room for error there that this is not how to do it. Again, the head to head benchmarks provided show Ryzen CPUs doing better MHz for MHz against Intel CPUs. The Intel CPUs need a significant clock speed advantage in order to have a higher frame rate.

BTW, from your comparison of the different videos you posted, the 2700X would seem to have a 28% performance lead over the 3600X. Does anyone believe this is accurate when the 3600X is the better gaming CPU?
You assume that I do not currently play Overwatch, and that I do not know how it behaves. For your information, I play this game every day, on an Nvidia GPU. I know how this game behaves. The variation in average framerate between maps is 3 FPS.

I think you haven't watched the videos correctly. Take the 2.9 GHz base/4.1 GHz boost (3.9 GHz all-core boost) Core i5 9400F, a 6C/6T CPU that achieves 152 FPS in Overwatch at 1080p Epic settings, and compare it to a 3.8 GHz/4.4 GHz 6C/12T CPU that achieves 140 FPS average in the same game. What was that comparison of performance/MHz again, in this particular use case?
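
To make that performance-per-clock question concrete, here is a minimal sketch using the figures quoted above. Which clock each game actually held is an assumption (the 9400F at its 3.9 GHz all-core boost, the 6C/12T part at its 4.4 GHz boost), not a logged measurement:

# FPS per GHz from the clocks and averages quoted in this post.
cpus = {
    "i5-9400F (6C/6T)": (152.0, 3.9),  # FPS, assumed all-core boost GHz
    "6C/12T Ryzen":     (140.0, 4.4),  # FPS, assumed boost GHz
}
for name, (fps, ghz) in cpus.items():
    print(f"{name}: {fps / ghz:.1f} FPS/GHz")
# -> roughly 39.0 vs 31.8 FPS/GHz under these assumptions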

Blizzard World is one of the most demanding maps in Overwatch, one where you see the lowest framerates, apart from Horizon Lunar Colony (no idea why) and probably the newest map in the pool, Busan. Blizzard World is tested in a few of those videos, and appears explicitly in the Epic settings testing of the 9400F/GTX 1660 Ti.

And lastly, you may find this shocking for your view of the world: in this particular game, a 6C/6T Intel CPU is faster than an 8C/16T CPU from AMD.
 

Hitman928

Diamond Member
Apr 15, 2012
5,243
7,790
136
I think you haven't watched the videos correctly. Take the 2.9 GHz base/4.1 GHz boost (3.9 GHz all-core boost) Core i5 9400F, a 6C/6T CPU that achieves 152 FPS in Overwatch at 1080p Epic settings, and compare it to a 3.8 GHz/4.4 GHz 6C/12T CPU that achieves 140 FPS average in the same game. What was that comparison of performance/MHz again, in this particular use case?

Blizzard World is one of the most demanding maps in Overwatch, one where you see the lowest framerates, apart from Horizon Lunar Colony (no idea why) and probably the newest map in the pool, Busan. Blizzard World is tested in a few of those videos, and appears explicitly in the Epic settings testing of the 9400F/GTX 1660 Ti.

And lastly, you may find this shocking for your view of the world: in this particular game, a 6C/6T Intel CPU is faster than an 8C/16T CPU from AMD.

Again, I'm using the head to head benchmarks posted in this thread that you seem to completely ignore over and over again. I also don't see what the number of cores/threads has to do with it (at least since we're not discussing 2C/4T or smaller CPUs) when you said it's a lightly threaded game to begin with. Is it possible that the head to head benchmarks aren't representative of gameplay? Sure, but I trust them a lot more than I trust taking snippets of gameplay from random sessions/maps and trying to compare that way.

From the screenshot I posted, it doesn't look like Intel's 6C/12T CPU (8700K) is faster than an 8C/16T AMD (3700X), let alone a 6C/6T. Yes, the 8700K has an average fps that's 3.9% higher (with a more than 5% clock speed advantage, which is the key difference), but it also has minimums that are 5.5% lower. I would say they are pretty even.
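
Normalizing those quoted numbers by clock makes the point explicit. A quick sketch, assuming exactly a 3.9% average-FPS lead and a 5% clock advantage for the 8700K (the post says "more than 5%", so this is the conservative case):

# Clock-normalized comparison of the 8700K vs the 3700X, using the quoted figures.
avg_fps_ratio = 1.039  # 8700K average FPS relative to the 3700X, as quoted
clock_ratio   = 1.05   # 8700K clock relative to the 3700X, ">5%" per the post
per_clock = avg_fps_ratio / clock_ratio
print(f"Per-clock ratio: {per_clock:.3f}")  # -> ~0.990, i.e. ~1% behind per MHz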
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
21,617
10,825
136
Well, guess what - that is the best AMD offers RIGHT now in the desktop space.

Um. No? The 3900X is the best, unless you're talking about a very narrow price range (which was never the case in Intel's obviously-false benchmark narrative).

Also, what makes you believe Renoir is an 8 core APU, and not just a 4 core one?

When did I ever say that it was? We don't know yet. It would be easy for AMD to make it 8c thanks to the improvements in density, but 8c + fat iGPU might make for poor yields. So we'll see.

I won't wait for next year's products

Don't worry. With Intel, you don't have to. Ha ha. Ahem.

Every single time, I EXPLICITLY state that this is the case for Overwatch/Blizzard games, or in that context.

But Intel didn't make that statement.
 

Panino Manino

Senior member
Jan 28, 2017
820
1,022
136
Summary please, sort of don't have the time to watch a 15-minute video.

TL&DW:
  • Their "0.22%" is from Notebooks and 2 in 1s , not Desktop nor HEDT.
  • Bottlenecks of the programs on the top of the list are not CPU related
  • Those that are CPU bottlenecked lower in the list are already benchmarked by reviewers and show for the most part that AMD is faster and/or on par
 

mopardude87

Diamond Member
Oct 22, 2018
3,348
1,575
96
A buddy of mine was shocked when I said that if he wanted an uber chip to replace his 7700K, it's the 3900X. Hard to argue with the price, 12c/24t, and PCIe 4.0.

I have always loved Intel, but yeah, if I was putting my computer to work or building something new, it wouldn't be Intel, that is for sure. Not that I don't love my 8700, cause I do.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
Thought this review of the 3900X over at bestbuy.com was pretty funny. Guess the guy's excited about his purchasing decisions.

Duck

Rated 5 out of 5 stars

Thing's an absolute monstrosity
Verified Purchaser
Posted 2 weeks ago.

Paired it with an ASUS X570 TUF Gaming (WiFi) motherboard with 32GB of DDR4-3200 RAM and a Radeon RX 5700 XT 50th Anniversary Edition, you know, just to stick it to the Man. Thing's an absolute monstrosity, chews through any workload like Garfield through a lasagna. Versus Intel's i9 9900K, you won't see a difference in gaming unless you're counting a few frames' difference that you'd never notice without the FPS counter. The kicker is the multithreaded performance. If you're, say, running high resolution video editing or rendering programs, then the Ryzen 9 3900X humiliates the i9 9900K. Indeed, it stuffs the 9900K into its locker with a bag of dirty gym socks. Then it takes the 9900K's girlfriend to the senior prom. Then it cheats on her with the 9900K's sister. If you want a workstation that doesn't skimp on gaming prowess, look no further. The Ryzen 9 3900X is the new kid in town, new starting quarterback, class president, valedictorian, dating the captain of the cheerleaders.

Nope... They're sold out.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
TL&DW:
Their "0.22%" is from Notebooks and 2 in 1s , not Desktop nor HEDT.
https://www.intel.com/content/www/u...ore/9th-gen-core-mobile-processors-brief.html
Intel's notebooks include 8c/16t CPUs; basically, Intel's notebook numbers will mirror desktop numbers, or at least be close enough not to worry about it. Most people will have 2 or 4 cores max and a handful will have the top chips, hence the 0.22; that's pretty much the percentage of people that buy monster systems because they have to do work on them, like rendering, not for e-peen.
This number will be much higher in the server or workstation market, but the 9900K is a mainstream CPU; it's in the same category as the Celeron...
TL&DW:
Bottlenecks of the programs at the top of the list are not CPU-related.
Just because they don't load the CPU to 100% doesn't mean they don't issue hundreds or thousands of context switches between all available cores (spanning CCXs), or maybe they are very branchy or plain low-IPC sequential code; it will max out the CPU's capability to do a certain thing without showing 100% usage.
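
A toy illustration of that last point, a fully CPU-bound single thread pins one core while overall utilization stays low on a many-core chip (a minimal sketch, standard library only; the iteration count is meaningless busy-work):

import os, time

# One saturated thread: overall CPU usage reads roughly 100 / core_count percent,
# yet this loop is completely CPU-limited the entire time.
def spin(seconds):
    end = time.time() + seconds
    n = 0
    while time.time() < end:
        n += 1  # branchy, low-IPC busy work
    return n

iters = spin(2.0)
print(f"{iters} iterations on 1 of {os.cpu_count()} cores")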
 

DrMrLordX

Lifer
Apr 27, 2000
21,617
10,825
136
https://www.intel.com/content/www/u...ore/9th-gen-core-mobile-processors-brief.html
Intel's notebooks include 8c/16t CPUs; basically, Intel's notebook numbers will mirror desktop numbers, or at least be close enough not to worry about it

Please don't tell me you're actually trying to defend Intel's methodology here. It's a hack job, can't you tell?

Just because they don't load the CPU to 100% doesn't mean they don't issue hundreds or thousands of context switches between all available cores (spanning CCXs), or maybe they are very branchy or plain low-IPC sequential code; it will max out the CPU's capability to do a certain thing without showing 100% usage.

. . . yeah, you're trying to defend their methodology. Ugh.

Look, nobody said anything about CPU utilization. He said the programs are bottlenecked by something else. When that happens, it means it is the bottleneck (GPU, RAM, storage, whatever) that determines application performance. Not the CPU.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
I just sold my 2-year-old can't-OC-for-(Redacted) 8700K for a 3600, made a small profit, got a cooler-running chip, and also got slightly higher CS:GO framerates in CPU-limited scenarios to boot, on the same DDR4-3200. Thanks AMD!

Even in gaming, the 3600 trounces the 9400F and 9700K in perf/dollar, and those are the only 2 Intel CPUs left that are still worth considering. The rest of Intel's lineup is just hot garbage, especially the parts with the security swiss cheese that is HT.

You already know that profanity is not allowed in the tech forums.

Daveybrat
AT Moderator
 
Last edited by a moderator:

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
I just sold my 2-year-old can't-OC-for-<redacted> 8700K for a 3600, made a small profit, got a cooler-running chip, and also got slightly higher CS:GO framerates in CPU-limited scenarios to boot, on the same DDR4-3200. Thanks AMD!

Even in gaming, the 3600 trounces the 9400F and 9700K in perf/dollar, and those are the only 2 Intel CPUs left that are still worth considering. The rest of Intel's lineup is just hot garbage, especially the parts with the security swiss cheese that is HT.
Okay, enjoy your gaming downgrade. How's overclocking on that 3600 going for you? By the way, slower and cooler usually go hand in hand. Don't believe me? Try clocking that 3600 anywhere near those 8700K clocks. Good luck!!!
 
Last edited by a moderator:

Glo.

Diamond Member
Apr 25, 2015
5,704
4,548
136
I just sold my 2-year-old can't-OC-for-<redacted> 8700K for a 3600, made a small profit, got a cooler-running chip, and also got slightly higher CS:GO framerates in CPU-limited scenarios to boot, on the same DDR4-3200. Thanks AMD!

Even in gaming, the 3600 trounces the 9400F and 9700K in perf/dollar, and those are the only 2 Intel CPUs left that are still worth considering. The rest of Intel's lineup is just hot garbage, especially the parts with the security swiss cheese that is HT.
With the 3600 you got basically the same CPU as you sold, for much less. The 3600 is equivalent to the 8700K.

In your case I agree with most of the words, however...

There are games in which the 9400F, and Intel CPUs in general, will be a better option than an AMD CPU. Like, for example, Blizzard games. If that is all anyone plays, it's better to use an Intel CPU. And the difference is huge in those scenarios.
 
Last edited by a moderator:

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
So much bitterness.
Nope! It's the irony of an overclocker swapping an 8700K for an R5 3600 because... wait for it... it "can't-OC-for-<redacted>" o_O That's quite the contradiction, if you ask me. But hey, whatever makes more sense for the overclocking-inclined users these days, right? Hehe
 

Kirito

Junior Member
Jun 20, 2017
12
7
81
Nope! It's the irony of an overclocker swapping an 8700K for an R5 3600 because... wait for it... it "can't-OC-for-<redacted>" o_O That's quite the contradiction, if you ask me. But hey, whatever makes more sense for the overclocking-inclined users these days, right? Hehe
It's not a contradiction if you look slightly further than cherry-picking the one thing he mentioned in his post, overlooking the performance increase he wanted to achieve with his OC in the first place! Once again, achieved performance matters, not clocks.