Discussion (Comet Lake): Intel's new Core i9-10900K runs at over 90°C, even with liquid cooling (TweakTown)


DrMrLordX

Lifer
Apr 27, 2000
Which doesn't make much sense unless we see a significant drop in 9900K price.
See the remark from @Topweasel.

I can't imagine how good an Intel CPU would have to be to change that...
How about a CPU that doesn't use Skylake cores? Is that so hard to imagine? Bring on 10nm already!

But really, doesn't the 10700K already accomplish all of this at an actual price decrease?
Yes.
 

biostud

Lifer
Feb 27, 2003
No, it won't. Honestly, Covid couldn't have happened at a better time for Intel's CPU stock, giving them a chance to play catch-up on supply. I have to assume at this point that they are mixing adjusted Coffee Lake and Coffee Lake-R dies, because if a significant share of Comet Lake parts are 10-core dies, it's only going to make things worse. Needing to increase core counts on consumer CPUs has to be a bit of a killer. We are already seeing situations where they either have to double up on dies (the Cascade Lake 9000 series) and make them not really available, or lower prices on high-core-count products (10980XE) and not make them available. They know that if either were actually available in a normal way on the market, it would kill their margins and strain supply even more. Having to increase die size on these high-volume chips will tank the wafer count left for their server chips. More than performance, this is where Intel will be playing catch-up the most. Intel didn't even want to move to 6 cores until they got to 10nm. To keep up with AMD they will have to keep raising core counts at every level. Where AMD can exploit die shrinks in multiple configurations, Intel will be so far behind on each node change that, compared to their original roadmaps, they won't get much relief in die size per core the way they did with Kaby Lake and earlier. Intel has peaked on margins, and there may be no way back, even if AMD has another Bulldozer moment.
The question is whether Intel will have to switch to an MCM approach at some point. AMD has shown it is a viable path regarding performance and cost, and they will be able to undercut Intel continuously if Intel does not do something drastic.

Personally, I think Optane DIMMs in the mainstream would give them something AMD cannot bring to the table.

But what do I know.
 

piokos

Senior member
Nov 2, 2018
So if you do what I do, and that is 100% load 24/7, you would be throttling, as it would be at 96c all the time and throttle down.
Yeah, well. I won't comment on how you use your CPUs. It's a free world.

And no, it wouldn't be at 96°C all the time, just like it doesn't run at PL2 all the time (let alone beyond it).
Phoronix gave the average values: 56°C and 124W - under load, over multiple tests.

BTW: is it really such a huge problem to write 96°C, not "96c"? :/
And it went up to 380 watts in the same tests. WOW!
It's peak consumption. It happens for a fraction of a second and is more a sign of CPU elasticity than anything else. And don't worry. If your PSU can't handle that, CPU won't make it explode.
If you make a long enough measurement and collect many data points, you'll see figures over PL2 on every Intel CPU.
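To make the PL1/PL2 point concrete, here is a toy model of Intel's turbo power budgeting. This is a simplified sketch, not Intel's exact algorithm: the 125W/250W/28s figures are illustrative defaults for a 10900K-class part, and real motherboards frequently override all three.

```python
# Toy model of Intel's PL1/PL2/tau turbo budgeting (simplified sketch,
# NOT the real silicon algorithm). The chip may draw up to PL2 while an
# exponentially weighted moving average of power stays under PL1; once
# the average catches up (after roughly `TAU` seconds), sustained draw
# falls back to PL1. Millisecond spikes above PL2 are still possible
# and are what peak-draw meters catch.

PL1, PL2, TAU = 125.0, 250.0, 28.0   # watts, watts, seconds (illustrative)
DT = 1.0                              # sample interval in seconds

def simulate(demand_watts):
    """Return the power actually drawn each second under the budget."""
    avg, drawn = 0.0, []
    alpha = DT / TAU                  # EWMA smoothing factor
    for demand in demand_watts:
        # Burst power is allowed only while the running average is under PL1.
        cap = PL2 if avg < PL1 else PL1
        p = min(demand, cap)
        avg += alpha * (p - avg)      # update the running average
        drawn.append(p)
    return drawn

# 60 s of an all-core load that would pull 250 W if unconstrained:
trace = simulate([250.0] * 60)
print(trace[0], trace[-1])            # prints: 250.0 125.0
```

In this model the chip bursts at PL2 until the running average of draw catches up with PL1 (after roughly tau seconds), then settles at PL1, which is why short peaks and long-run averages can differ so much within the same review.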
For starters a 3900x comes with a cooler.
On the 3900X under load, the Wraith Prism goes into berserk mode: almost 3000 rpm, hardly something you're able to sit next to without earplugs. And even then it probably won't be enough during hot days. Check this:

So 3900X comes with a cooler - in the same way Intel's 9900 and 10900 come with a cooler.

Anyway, if you're OK with this kind of temperatures and noise, you can pair 10900K with a decent $20-30 solution. No need to spend $100.

As I mentioned earlier: let's not dig into every $1. Because we may soon notice that in order to achieve review-level Zen2 performance, you need more expensive RAM and so on.
I think we agree that Intel has a higher entry cost (especially as long as there are only Z490 boards). Let's leave it there.
 

biostud

Lifer
Feb 27, 2003
If you have a budget where you can buy a 2080 or 2080 Ti for gaming and run 1080p, then the 9900K and 10900K might make sense. For most people, who have a smaller budget, it would be better to go for the 3600X and put as much money as possible towards the GPU. And if you run at 1440p or higher resolution, the single-digit performance difference between an Intel setup and AMD is something you will never notice outside benchmarks.
 

DrMrLordX

Lifer
Apr 27, 2000
Meh, bring on 7nm already :p
Sure, why not? But 10nm is here today, in one form or another. 7nm is still a big unknown.

If you have a budget where you can buy a 2080 or 2080 Ti for gaming and run 1080p, then the 9900K and 10900K might make sense.
That's been true since 2018, though. If you already have a 9900k and a card like that, the 10900k makes almost zero sense, unless you are just hell-bent and determined to strap two more cores to your gaming rig for whatever reason.
 

Topweasel

Diamond Member
Oct 19, 2000
Yeah, well. I won't comment on how you use your CPUs. It's a free world.

And no, it wouldn't be at 96°C all the time, just like it doesn't run at PL2 all the time (let alone beyond it).
Phoronix gave the average values: 56°C and 124W - under load, over multiple tests.
That's at base settings, where it really doesn't beat the 3900X except in ST and ST-heavy MT loads, like a lot of the specific Adobe tests Phoronix runs. It keeps up with, but still doesn't regularly beat, the 3900X. This is where you have to make a decision: are you trying to get the 10900K to run at stock; to fight, but ultimately not beat, the 3900X; or to beat the 3900X regularly but not all the time? This is what I was saying: on any single axis the 10900K looks great; it's as a package that it suffers. So choose one goalpost here, please.

It's peak consumption. It happens for a fraction of a second and is more a sign of CPU elasticity than anything else. And don't worry. If your PSU can't handle that, CPU won't make it explode.
If you make a long enough measurement and collect many data points, you'll see figures over PL2 on every Intel CPU.
But it can. I say this as a semi-joke, and generally I have never been one of those "right-size the PSU" people. But I have run on a PSU that was, for the most part, capable of the specs in question, and I saw how it handled an increased power load: it was a not-great PSU that handled the load less well as time went on. So sure, there is an opportunity for a CPU that is using 150-200W over its rating to cause a PSU to fail. Maybe not blow up, maybe not expire, but just trip its power protections and cause the system to reboot. This is easily dodged with knowledge, but it requires everyone to know how much extra the 10900K uses over its 125W rating. This doesn't excuse AMD from the same thing, but being off by 35-40W is a lot different from being off by 150-200W.
On the 3900X under load, the Wraith Prism goes into berserk mode: almost 3000 rpm, hardly something you're able to sit next to without earplugs. And even then it probably won't be enough during hot days. Check this:
So it works and allows the CPU to operate within spec, though it might struggle slightly at higher ambient temperatures. I didn't say I think it likely that people are using it, but the option is there and will allow the CPU to perform as rated at no extra cost.
So 3900X comes with a cooler - in the same way Intel's 9900 and 10900 come with a cooler.

Anyway, if you're OK with this kind of temperatures and noise, you can pair 10900K with a decent $20-30 solution. No need to spend $100.

As I mentioned earlier: let's not dig into every $1. Because we may soon notice that in order to achieve review-level Zen2 performance, you need more expensive RAM and so on.
I think we agree that Intel has a higher entry cost (especially as long as there are only Z490 boards). Let's leave it there.
No, the 3900X comes with a subpar but capable cooler.

But it comes back to my earlier statement: which of the three versions of 3900X competitiveness do you want from your 10900K? Find me a $20 cooler that lets the 10900K run at stock, with its 125W power usage and 200W 30-second spikes, while still losing pretty much across the board to the 3900X in MT loads (which raises the question of why you would pay an extra $100-125 for it). Or find a $20 cooler that can cope with the unlimited ~200W usage of a CPU that comes close to, but doesn't quite beat, the 3900X. Or find a $20 cooler that can take the 250-300W of 3900X-beating heat from a full MCE-enabled setup.

Under all three you have TVB that probably won't trigger more than once a session, and two of the settings will have the CPU constantly against its thermal limits. But in the end, let's just apply the Gamers Nexus noise-normalized approach. With the 3900X you have the option of a no-cost, super-loud solution. Once you go aftermarket, at every stage of normalizing the experience you have to invest that much more in cooling to keep up with the 3900X's performance and noise levels. Get an H100i, and the 3900X has fully realized performance and runs quieter; sure, at the first setting the noise levels may be similar, but then why get the 10900K over the 3900X? At the second setting, the H100i is probably enough, but the fans have to run faster, meaning noisier. At the third, the H100i might not really be able to handle it. So what does it take to match the 3900X at those two stages while keeping the same noise levels?

That's where most of this is coming from. This isn't simply "this CPU needs to use more power"; its performance varies widely based on board, cooling, and everything else. On top of that, there is a real added cost, not just to cool it in the most basic sense, but at every level: matching any given 3900X setup with the 10900K costs that much more.

Also, on the RAM: Zen doesn't "require" better RAM. It runs better with better memory, and sees a larger increase in performance from it, but not at a level that makes it not worth giving the Intel system similar memory. If anything, Zen slightly prioritizes clock speed over latency, and it's generally lower latency that pushes memory prices up (at least until you get to speeds where the higher memory clock is a hindrance to Ryzen).
 

Ajay

Diamond Member
Jan 8, 2001
Sure, why not? But 10nm is here today, in one form or another. 7nm is still a big unknown.
Yeah, but we won't have decent perf/cores till 2022? Unless the specs on Alder Lake are wrong, it looks like some specialty chip, maybe mobile still.
If it's desktop, the 8 small cores make no sense.
 

RetroZombie

Senior member
Nov 5, 2019
Yes, poor Intel: ever since they have had to compete with Zen, they are only making twice the net income (money in the pocket) they did before Zen came out.
That's not down to Intel; it's the market itself: the lack of parts, new CPUs needed to substitute vulnerable Intel ones, and of course stupid people.
You can't blame AMD when a good product doesn't get the sales increase it deserves.

Okay so we are either talking OEM systems.
OEM systems with those puny fans and 250W CPUs? I expect OEMs to avoid all the high-TDP Intel CPUs, unless they like returns and RMAs.

Which leaves us with a difference in CPU price - most of which would need to be spent on a GPU for the 3900X.
Someone buying the 10900K will just use the iGPU, are you sure?

So if you do what I do, and that is 100% load 24/7, you would be throttling, as it would be at 96c all the time and throttle down. And it went up to 380 watts in the same tests. WOW!
That is one CPU line that will cause a huge amount of returns and RMAs; I already see many improperly assembled and mounted systems that will fail in a short time.
 

lobz

Golden Member
Feb 10, 2017
That's at base settings, where it really doesn't beat the 3900X except in ST and ST-heavy MT loads, like a lot of the specific Adobe tests Phoronix runs. It keeps up with, but still doesn't regularly beat, the 3900X. This is where you have to make a decision: are you trying to get the 10900K to run at stock; to fight, but ultimately not beat, the 3900X; or to beat the 3900X regularly but not all the time? This is what I was saying: on any single axis the 10900K looks great; it's as a package that it suffers. So choose one goalpost here, please.

But it can. I say this as a semi-joke, and generally I have never been one of those "right-size the PSU" people. But I have run on a PSU that was, for the most part, capable of the specs in question, and I saw how it handled an increased power load: it was a not-great PSU that handled the load less well as time went on. So sure, there is an opportunity for a CPU that is using 150-200W over its rating to cause a PSU to fail. Maybe not blow up, maybe not expire, but just trip its power protections and cause the system to reboot. This is easily dodged with knowledge, but it requires everyone to know how much extra the 10900K uses over its 125W rating. This doesn't excuse AMD from the same thing, but being off by 35-40W is a lot different from being off by 150-200W.
So it works and allows the CPU to operate within spec, though it might struggle slightly at higher ambient temperatures. I didn't say I think it likely that people are using it, but the option is there and will allow the CPU to perform as rated at no extra cost.


No, the 3900X comes with a subpar but capable cooler.

But it comes back to my earlier statement: which of the three versions of 3900X competitiveness do you want from your 10900K? Find me a $20 cooler that lets the 10900K run at stock, with its 125W power usage and 200W 30-second spikes, while still losing pretty much across the board to the 3900X in MT loads (which raises the question of why you would pay an extra $100-125 for it). Or find a $20 cooler that can cope with the unlimited ~200W usage of a CPU that comes close to, but doesn't quite beat, the 3900X. Or find a $20 cooler that can take the 250-300W of 3900X-beating heat from a full MCE-enabled setup.

Under all three you have TVB that probably won't trigger more than once a session, and two of the settings will have the CPU constantly against its thermal limits. But in the end, let's just apply the Gamers Nexus noise-normalized approach. With the 3900X you have the option of a no-cost, super-loud solution. Once you go aftermarket, at every stage of normalizing the experience you have to invest that much more in cooling to keep up with the 3900X's performance and noise levels. Get an H100i, and the 3900X has fully realized performance and runs quieter; sure, at the first setting the noise levels may be similar, but then why get the 10900K over the 3900X? At the second setting, the H100i is probably enough, but the fans have to run faster, meaning noisier. At the third, the H100i might not really be able to handle it. So what does it take to match the 3900X at those two stages while keeping the same noise levels?

That's where most of this is coming from. This isn't simply "this CPU needs to use more power"; its performance varies widely based on board, cooling, and everything else. On top of that, there is a real added cost, not just to cool it in the most basic sense, but at every level: matching any given 3900X setup with the 10900K costs that much more.

Also, on the RAM: Zen doesn't "require" better RAM. It runs better with better memory, and sees a larger increase in performance from it, but not at a level that makes it not worth giving the Intel system similar memory. If anything, Zen slightly prioritizes clock speed over latency, and it's generally lower latency that pushes memory prices up (at least until you get to speeds where the higher memory clock is a hindrance to Ryzen).
This.

Pretty much everything you wrote.
 

piokos

Senior member
Nov 2, 2018
That is a blatant lie. Please stop it.
Well, it's not that difficult to find temperature figures under load for Zen2 SoCs. You can start from the link I posted.
Really nothing I can add here.
That's at base settings, where it really doesn't beat the 3900X except in ST and ST-heavy MT loads, like a lot of the specific Adobe tests Phoronix runs.
No. It's at the same settings that produced the 380W peak power draw.
That's how modern CPUs work - constantly moving between power states. So we spend a lot of time discussing peak values, but averages are just as important. And kudos to Phoronix for showing more detailed results.

And I did say 3900X remains faster on average in that test set. And that Intel wins in particular tests. Not sure why you're repeating that. :)

I also don't understand the idea behind stressing that Adobe benchmarks are "specific". Just as specific as any other test. You can look at the average or you can look at particular loads you're interested in.
No, the 3900X comes with a subpar but capable cooler.
So just like most Intel CPUs.
Also, on the RAM: Zen doesn't "require" better RAM.
I never said it "requires" better RAM. It's merely a reminder that you're basing your opinions on benchmark results that may have used different components.
If we used DDR4-2666 to test both 10900K and 3900X, it would probably have a significant impact on the outcome.
And by all means I'm not criticizing reviewers for doing that. They're showing what these platforms are capable of in a near best-case scenario, as they should.
But you're very eager to point out that AMD offers better value, so I'm just pointing on some things that you may be forgetting about. ;)
But not at a level that makes it not worth giving the Intel system similar memory. If anything, Zen slightly prioritizes clock speed over latency, and it's generally lower latency that pushes memory prices up (at least until you get to speeds where the higher memory clock is a hindrance to Ryzen).
Here's the thing. On Intel side, overspending on RAM is probably the most common way of wasting money. Gains are relatively small and you can usually spend money in a more sensible way (on the CPU itself or functionality).
In general: on Intel side it's usually easier to build a PC using basic components, like slow RAM and cheap motherboards, while still getting fairly near the "review" performance levels. Of course the "cheap motherboard" part is not true for chips pulling 250W. ;)

On AMD side RAM plays a larger role and getting something better makes a lot more sense. Especially if one wants to get the same results he sees in reviews (or uses in forum discussions ;)).

In the end, it's pretty much what we should expect. Intel making CPUs for OEMs and AMD making CPUs for DIY enthusiasts. This has been true for all my PC-building life. And as a 15-year-old overclocker, I was a total AMD fanboy. No shame.
 

piokos

Senior member
Nov 2, 2018
Someone buying the 10900K will just use the iGPU, are you sure?
Many will use the 10900 (non-K) with IGP and I'm sure they would be very happy with a 10900K as well.
The issue with -K CPUs is in their key feature: being unlocked. I'd gladly pay more for better stock performance, but I also have to pay for the ability to OC (which I won't use). So it usually doesn't make sense. And in the case of the top model, I'm also paying the premium for... having the top model. Again, something I don't care about, but others do.

8700 and 8700K: 5% performance difference, almost 20% in price ($303 vs $359).
9900 and 9900K: again around 5% more oomph for 15% more money

But maybe it'll be different with Comet Lake. For starters, 10900K has higher max boost (100MHz but still) and is just 11% more expensive at launch.
 

Topweasel

Diamond Member
Oct 19, 2000
No. It's at the same settings that produced the 380W peak power draw.
That's how modern CPUs work - constantly moving between power states. So we spend a lot of time discussing peak values, but averages are just as important. And kudos to Phoronix for showing more detailed results.
Dude, I get power states and moving around. I watched or read about six reviews of this CPU yesterday. I didn't see a single review showing the CPU going toward 300W unless it was overclocked or MCE was on, which makes me question that test and the settings it was run at. What I did see were limited, single-run spikes of about 200W for a maximum of 30 seconds in high-core-count tests.
And I did say 3900X remains faster on average in that test set. And that Intel wins in particular tests. Not sure why you're repeating that. :)

I also don't understand the idea behind stressing that Adobe benchmarks are "specific". Just as specific as any other test. You can look at the average or you can look at particular loads you're interested in.
Adobe is one huge suite of tools, and in Photoshop especially, but also in Premiere, several toolsets within the programs can be MT-enabled, ST-only, or lightly threaded and driven by clock speed. I mentioned that the Phoronix tests in particular tend to weight a bit more towards the ST tools within Adobe's suite. I don't know how things work out in the actual loads people run in the software. It's worth noting, because you could find yourself using more MT-heavy functions and having made the weaker choice. That's why I tend to prefer Adobe tests of the specific functions that YouTubers use for their video editing.

So just like most Intel CPUs.
Well, you know, except basically their whole K lineup, and specifically this monster that, as you put it, can flash up to NEARLY 400W!
It's worth noting that of two CPUs at similar prices, one doesn't require you to spend a dime to get advertised stock performance, while the other requires you to make the right cooler purchase and, in a shock to me, will take momentary jumps to nearly 400W at stock settings.

I never said it "requires" better RAM. It's merely a reminder that you're basing your opinions on benchmark results that may have used different components.
If we used DDR4-2666 to test both 10900K and 3900X, it would probably have a significant impact on the outcome.
And by all means I'm not criticizing reviewers for doing that. They're showing what these platforms are capable of in a near best-case scenario, as they should.
But you're very eager to point out that AMD offers better value, so I'm just pointing on some things that you may be forgetting about. ;)
That dodges the point I made. You see similar, though not quite as large, increases with Intel as you do with AMD between memory steps. The IF speed gains are blown waaaaaay out of proportion: it's a slight but measurable bonus per upgrade, and from that a generalization gets made. In the end, if you think a kit would work fine partnered with your $500 Intel CPU, it will work fine with your $425 AMD one.
Here's the thing. On Intel side, overspending on RAM is probably the most common way of wasting money. Gains are relatively small and you can usually spend money in a more sensible way (on the CPU itself or functionality).
In general: on Intel side it's usually easier to build a PC using basic components, like slow RAM and cheap motherboards, while still getting fairly near the "review" performance levels. Of course the "cheap motherboard" part is not true for chips pulling 250W. ;)
Yeah, not buying it. With AMD you can still save if you want, spending a third as much on a B450 board, or in a month a B550 board, while still getting all the features and functionality of your CPU. And that's with VRM configuration making almost no difference.
On AMD side RAM plays a larger role and getting something better makes a lot more sense. Especially if one wants to get the same results he sees in reviews (or uses in forum discussions ;)).

In the end, it's pretty much what we should expect. Intel making CPUs for OEMs and AMD making CPUs for DIY enthusiasts. This has been true for all my PC-building life. And as a 15-year-old overclocker, I was a total AMD fanboy. No shame.
No, it doesn't. It's a generalization; across the board there is probably about a 1% difference from running one memory bin higher or lower. You don't need it. It's just that once people know why there is a difference, they look at maximizing it. But crap, let's look at the difference for real.

G.Skill Ripjaws V 16GB kit on Newegg:
3200 = $69.99
3600 = $82.99
$13. Thirteen dollars away from getting within a hair's width of maximizing the IF.
What about going DDR4-2666? That Ripjaws V kit goes for... drum roll... $64.99. $5 less. For a total spread, worst to best, of $18.

So you have a Z490 requirement that is, let's be nice, an extra $50 over a B460 board (assuming that's the next step down when it comes to Intel). Plus you need a cooler; I am going to be nice here and say $50. So to actually use the system as specced, you would be spending an additional $100 ($82 net of the $18 memory difference) on top of a CPU that is already, being nice, $50 more expensive. This can go on and on; it won't end. But at this point you are talking about spending roughly $130 extra to get the same out-of-the-box experience reviewers get from the 3900X, while arguing that the other side requires you to spend, at best, an extra $18 on memory. Even though the 10900K is going to run just as crappily with 2666 memory as the 3900X will.
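For what it's worth, the arithmetic above can be tallied in a few lines. This is only a sketch of the post's own numbers: the RAM prices are the quoted mid-2020 Newegg figures for a G.Skill Ripjaws V 16GB kit, and the $50 board and cooler premiums are the post's deliberately generous estimates, not market data.

```python
# Tallying the platform-cost argument, using the post's own (illustrative,
# mid-2020) prices for a G.Skill Ripjaws V 16 GB kit plus its nominal
# board/cooler premiums.

ram = {"DDR4-2666": 64.99, "DDR4-3200": 69.99, "DDR4-3600": 82.99}

# Extra RAM spend to take a Ryzen build from the cheapest to the best kit:
amd_ram_premium = round(ram["DDR4-3600"] - ram["DDR4-2666"], 2)

# Intel-side extras to reach reviewed 10900K performance:
board_premium = 50.0   # Z490 over a cheaper board (the post's "being nice" figure)
cooler = 50.0          # aftermarket cooler (again the post's low-ball figure)
intel_extras = board_premium + cooler

print(amd_ram_premium)   # prints: 18.0
print(intel_extras)      # prints: 100.0
```

On these assumptions the Intel-side extras ($100) dwarf the worst-to-best RAM spread on the AMD side ($18), which is the whole thrust of the argument above.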
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
Yeah, well. I won't comment on how you use your CPUs. It's a free world.

And no, it wouldn't be at 96°C all the time, just like it doesn't run at PL2 all the time (let alone beyond it).
Phoronix gave the average values: 56°C and 124W - under load, over multiple tests.

BTW: is it really such a huge problem to write 96°C, not "96c"? :/

It's peak consumption. It happens for a fraction of a second and is more a sign of CPU elasticity than anything else. And don't worry. If your PSU can't handle that, CPU won't make it explode.
If you make a long enough measurement and collect many data points, you'll see figures over PL2 on every Intel CPU.

On the 3900X under load, the Wraith Prism goes into berserk mode: almost 3000 rpm, hardly something you're able to sit next to without earplugs. And even then it probably won't be enough during hot days. Check this:

So 3900X comes with a cooler - in the same way Intel's 9900 and 10900 come with a cooler.

Anyway, if you're OK with this kind of temperatures and noise, you can pair 10900K with a decent $20-30 solution. No need to spend $100.

As I mentioned earlier: let's not dig into every $1. Because we may soon notice that in order to achieve review-level Zen2 performance, you need more expensive RAM and so on.
I think we agree that Intel has a higher entry cost (especially as long as there are only Z490 boards). Let's leave it there.
So you would want me to stop my contribution to COVID-19 research and cancer research?

I hope you get what you deserve and I won't comment on what that would be. (not anything medical)
 

alexruiz

Platinum Member
Sep 21, 2001
On the 3900X under load, the Wraith Prism goes into berserk mode: almost 3000 rpm, hardly something you're able to sit next to without earplugs. And even then it probably won't be enough during hot days. Check this:
Did you even read what you quoted?
Let me help you here:

"For running stock the Ryzen 9 3900X doesn’t need a big cooler strapped on for maximum performance and it certainly doesn’t require liquid cooling. Even when enabling PBO you won’t gain much more performance by upgrading the cooler. We're not saying you shouldn’t upgrade the cooler for lower temperatures and quieter operation, simply that by doing so you won’t squeeze much extra performance.
For gamers, the bundled Wraith Prism will be even less of an issue as it’s very unlikely you’ll see all 12-cores loaded up. We observed that when gaming the fan speed was generally around 2000 RPM and much quieter than what we saw when stress testing with Blender. "


That is coming from the same article you quoted, so, yes, the Wraith Prism is perfectly enough for the 3900X.
The Prism is a noisy cooler even when paired with a 3600.
However, in a case with good ventilation you can even set the CPU fan to silent in the UEFI and tame the noise while giving up only 3-4 °C.
 

Elfear

Diamond Member
May 30, 2004
So 3900X comes with a cooler - in the same way Intel's 9900 and 10900 come with a cooler.

Anyway, if you're OK with this kind of temperatures and noise, you can pair 10900K with a decent $20-30 solution. No need to spend $100.
This is what Anandtech had to say on that topic:

"The one issue that Intel won’t escape from is that all this extra power requires extra money to be put into cooling the chip. While the Core i9 processor is around the same price as the Ryzen 9 3900X, the AMD processor comes with a 125 W cooler which will do the job – Intel customers will have to go forth and source expensive cooling in order to keep this cool. Speaking with a colleague, he had issues cooling his 10900K test chip with a Corsair H115i, indicating that users should look to spending $150+ on a cooling setup. That’s going to be a critical balancing element here when it comes to recommendations. "
 

Rigg

Member
May 6, 2020
Let me preface this by saying my current preference is for AMD CPUs over Intel. I've built around 30 AM4 gaming rigs vs 4 Intel in the last 2 years. I switched from a 9900K to a 3900X in my personal system. I know both platforms well. I think the fanboi stuff is stupid; the only thing I'm a fanboi of is value. I'm a mercenary: I'd change allegiances in a second if I thought Intel was bringing more value to the table. While this generation is an improvement in this area, the CPUs are still too expensive. The B and H chipsets are still too limited. Even the flagship chipset is a bit limited compared to AMD's 500 series.

That being said, I try to be objective when looking at CPU reviews. First of all, I think it's a bit unfair to price-compare a 3900X to anything but the 10900F. I don't care if it's a locked 65W chip; if you can't go into the BIOS and adjust power limits, you need to find a new hobby. Overclocking the 3900X is extremely niche and suicidal if you don't know what you are doing. The fact that it's unlocked is only mildly useful to a small number of users. While the 10900F (with raised power limits similar to the Intel-recommended 10900K specs) still won't beat a 3900X in productivity, it will still fall right between the 3800X and 3900X in these benchmarks, and have a huge advantage in CPU-limited gaming scenarios. It looks like we're about to get a pretty big leap in GPU power; I think it's a real possibility that we could see 2080 Ti performance for half the price in a few months. For someone eyeing a GPU upgrade not too long down the road, this could be a reasonable consideration.

As to the stock cooler argument... meh. It's pretty good for an 8-core, but it's kind of like using a Wraith Stealth on the 3600: it'll work in a pinch, but it's far from ideal. At best it's a $25 CPU discount. On a 3900X, most either keep them as a backup or sell them. I mean, c'mon, who uses the stock cooler on a $400 CPU? I think the "good stock cooler" argument is valid on the 3700X and 3800X but falls flat on the rest of the lineup.

I also think people are being a bit disingenuous about the power usage and heat concerns that sparked this thread to begin with. The 10900K behaves somewhat similarly to Zen 2 when operating under the recommended power limits: the lighter the load, the more clock you get. I don't think it's fair to criticize the CPU for being capable of being cooled while OPTIONALLY drawing a butt-ton of power for higher clocks. As I mentioned earlier in the thread, you can set the power limits and turbo speeds however you want on this CPU. If you don't want the power draw and the expense of exotic cooling, you can turn it down; you're not losing much productivity performance and basically zero gaming performance. I'm fairly confident you could get the 10900K to game at 5.2 GHz+ and throttle back in heavy workloads down to the mid-4s with a pretty average cooling solution. The Intel-recommended settings are not too far off of that.

The 3900X is completely uncoolable at half the power draw the 10900K is capable of. Trust me... I know. I have a 3900X with a good custom loop. If I could clock the 3900X to the moon (without destroying it) and actually cool it, I probably would. It wouldn't be any more power efficient than the Intel either. I see this as more of a limitation of the AMD chip than a demerit for the Intel. The 10900K gives you way more usable tuning options IMO. With the 3900X you are basically stuck with stock operation: PBO is mostly worthless, and CCX overclocking is difficult to dial in and can be dangerous to the CPU if you throw too much current through it in a heavy workload. Most Zen 2 performance is gained with memory OC anyway; its performance doesn't scale very well with clock frequency in gaming loads.
 

piokos

Senior member
Nov 2, 2018
265
82
61
Adobe is one huge suite of tools, and in Photoshop especially, but also in Premiere, several toolsets within the program can be MT-enabled, ST-only, or lightly threaded and driven by clock speed. I mentioned the Phoronix tests specifically tend to have a little more weight toward ST tools within Adobe's suite. I don't know how things work out in the actual loads run by people who use the software. It's worth noting because you could find yourself using more MT-related functions and find yourself making a weaker choice. It's why I tend to prefer the Adobe tests of specific functions that the tubers use for their video editing.
First of all: Phoronix isn't using Adobe in tests. So I assume you may have confused them with Puget Systems, right?
Anyway: yes, an Adobe-based benchmark will mix ST and MT tasks. Just like any good benchmark should. Just like an actual user would.

And of course that mix is representative just for some workflows and just for some users. No other way around it. But it's still testing performance in very popular software.
Furthermore, good reviews will also include results for particular components, so that - if you know what is more representative for your workflow - you can focus just on them.
And similarly, AnandTech provides results for particular components of e.g. the SPEC suite, so you can look at what's most relevant for you.

I don't know why you're trying to undermine this kind of testing. Do you think it makes less sense than Cinebench? :)
Well, you know, except basically their whole K lineup, and specifically this monster that, as pointed out, can flash up to, as you put it, NEARLY 400W!!!!!
It's worth noting that of two CPUs at similar prices, one doesn't require you to spend a dime to get advertised stock performance, and the other requires you to make the right purchase to, in a shock to me, at stock settings, be able to take momentary jumps to NEARLY 400W!!!!
Exactly: momentary. And extremely rare. We're talking about outliers in the Phoronix test.
If you throw them away, either by setting a lower PL2 or by not providing enough power, the impact on total performance would be negligible.
But if your system can handle it, why not?
 

DrMrLordX

Lifer
Apr 27, 2000
15,494
4,279
136
Yeah, but won't it have decent perf/cores till 2022? Unless the specs are wrong on Alder Lake, it looks like some specialty chip - maybe mobile still.
If it's desktop, the 8 small cores make no sense.
Alder Lake does look strange, but it keeps showing up as Intel's Q4-ish desktop chip for 2021. If Alder Lake-S isn't Intel's desktop mainstay at that point, Intel will be stuck with Rocket Lake-S as their flagship desktop product until whenever they manage to launch Meteor Lake. And that might not happen until 2022 or 2023!

Well, it's not that difficult to find temperature figures under load for Zen2 SoCs. You can start from the link I posted.
It does seem difficult for you to understand the implication of those temperatures. Those are hotspot temps. Matisse isn't going to put out a lot of power at stock; that's fundamental to the design, and it has been measured multiple times. Throwing bulk cooling capacity at Matisse doesn't have the same effect as it does on 12nm or 14nm CPUs from any vendor. Matisse (and all Zen 2 chips) uses dense libraries - 6.5T and all that. There's a network of ~1000 temperature sensors on an individual Matisse die, and it reports the highest reading; that's how it ensures safe operation. It doesn't mean the chip is throwing off a huge heat flux or that it needs a 280mm AiO. Those are nice to have, and they can help reduce hotspot temps, but unless you're obsessed with keeping your temps below 70C or something, you can "get by" with an air cooler.
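The max-of-many-sensors point is easy to picture with a quick sketch. The sensor values below are entirely made up for illustration; the only claim carried over from the post is that the reported figure is the hottest sensor, not an average:

```python
# Matisse-style temperature reporting: the die exposes the HOTTEST of its
# many on-die sensors, so the reported temp is a hotspot reading, not a
# proxy for total heat output. (Sensor values below are hypothetical.)
def reported_temp(sensor_temps_c):
    return max(sensor_temps_c)

# ~1000 sensors: most of the die in the low 50s, one dense region spiking.
sensors = [52.0] * 990 + [58.0] * 9 + [78.0]
print(reported_temp(sensors))            # 78.0 - what monitoring software shows
print(sum(sensors) / len(sensors))       # ~52.1 - the bulk of the die is cool
```

Which is why throwing a big AiO at Matisse mostly shaves the hotspot number rather than unlocking much extra performance.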

Really nothing I can add here.
You don't say.

And I did say 3900X remains faster on average in that test set. And that Intel wins in particular tests.
Not sure why anyone's comparing the 10900K to the 3900X, when the total cost of acquiring a 10900K is much closer to that of the 3950X: $530 for the chip, $150 (or more) to cool it... meanwhile a 3950X costs $710 at Newegg. With its own cooler.

Intel making CPUs for OEMs and AMD making CPUs for DIY enthusiasts.
OEMs don't like CPU shortages due to wafer supply problems. No they do not.
 


piokos

Senior member
Nov 2, 2018
265
82
61
It does seem difficult for you to understand the implication of those temperatures. Those are hotspot temps. Matisse isn't going to put out a lot of power at stock.
They affect the max boost if they hit the 95°C limit. That's the whole point of measuring temperature.

You're basically saying that high temperatures aren't a problem on Zen 2, because it uses less power, but they somehow are a problem on Comet Lake, because there's more heat.
I mean: what's going on? :)
Both processors will throttle when they hit the limit. And when that happens, the 3900X will pull 140W and the 10900K will pull 250W.
Not sure why anyone's comparing the 10900K to the 3900X, when the total cost of acquiring a 10900K is much closer to that of the 3950X: $530 for the chip, $150 (or more) to cool it... meanwhile a 3950X costs $710 at Newegg. With its own cooler.
3950X comes without a bundled cooler.
AMD recommends 280mm AiO. The same kind of cooling will also be sufficient on 10900K.
So it's still $710 for 3950X and $530 for 10900K ($510 for 10900KF).
 

DrMrLordX

Lifer
Apr 27, 2000
15,494
4,279
136
They affect the max boost if they hit the 95°C limit. That's the whole point of measuring temperature.
142W chips don't hit the 95°C limit too easily.

3950X comes without a bundled cooler.
Yeah I keep forgetting that.

AMD recommends 280mm AiO. The same kind of cooling will also be sufficient on 10900K.
Oh yeah?

That recommendation did make me wonder just how necessary liquid cooling is, and I've got testing results for the 3950X running with both a Wraith Prism cooler as well as an extremely capable NZXT Kraken X62. You might be surprised how little difference there is in performance between the two at stock settings, though. The X62 dropped the maximum temperature by 10-15C, depending on the workload, but only ended up being about 1-2 percent faster.
You can run one with Wraith Prism. It's still a 142W chip. That lame little 2-heatpipe cooler isn't quite enough, but it's funny watching it try.
 
