Discussion (Comet Lake): Intel's new Core i9-10900K runs at over 90C, even with liquid cooling (TweakTown)

Page 9 - AnandTech Forums

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
Those power consumption numbers would have severely limited Zen 2, and AMD would probably have resorted to what Intel is doing now with Rocket Lake by back-porting the architecture onto GlobalFoundries' 12LP process node.
What in blazes are you going on about? This is a 10900K thread, and comparing power usage numbers on OLD CPUs (1 to 2 generations back) has no place here.
 

amrnuke

Senior member
Apr 24, 2019
So games were unplayable until March 2017?

And what about 6 cores? Are games playable or not? Or half of the time?
4/4 versus 6/12 COULD make a difference between playable and unplayable in some games. But clearly his blanket statement was wrong.

I assume he's referencing 1% lows versus average as a barometer for perceived stuttering, but it doesn't bear out in many games. KitGuru's reviews show the 3300X has better average and 1% low FPS than the 3600 and 1600AF in some games. They also show that the 3600X's 1% lows are essentially the same as the 3900X's 1% lows (even better sometimes, depending on the game).

Obviously, below 6 threads is a tough place to be. But I feel like most of the reviews of 6+ thread processors show that 1% lows don't scale particularly well with core/thread count, and that there are other factors at play.

Right now the gaming sweet spot is probably 6-8 threads. The rest is clock and memory and ramp times and boost duration. And, importantly, GPU.

But I think as we see real PC-class CPUs in the consoles, we're going to see more use for more cores and threads, but it won't make a 6C/6T 9400 useless or unplayable, just less smooth.
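For context, the "1% lows" metric the post leans on is just a percentile statistic over captured frame times. A minimal sketch of how reviewers typically derive it, using hypothetical frame-time data:

```python
# Sketch: deriving "1% low FPS" from per-frame render times (ms).
# The capture data below is hypothetical, purely for illustration.
def one_percent_low_fps(frame_times_ms):
    # Average the slowest 1% of frames, then convert that
    # average frame time back into an FPS figure.
    slowest = sorted(frame_times_ms, reverse=True)
    worst_1pct = slowest[:max(1, len(slowest) // 100)]
    avg_worst_ms = sum(worst_1pct) / len(worst_1pct)
    return 1000.0 / avg_worst_ms

def average_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# 200 smooth frames at 10 ms plus two 40 ms stutter frames:
capture = [10.0] * 200 + [40.0, 40.0]
print(round(average_fps(capture), 1))          # high average FPS...
print(round(one_percent_low_fps(capture), 1))  # ...but much worse 1% lows
```

This is why a run can show a healthy average while still feeling stuttery: a handful of slow frames barely move the average but dominate the 1% low.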
 

Topweasel

Diamond Member
Oct 19, 2000
What in blazes are you going on about? This is a 10900K thread, and comparing power usage numbers on OLD CPUs (1 to 2 generations back) has no place here.
Also, it's full-system power. We know that Intel chipsets and boards generally use less power, especially at idle, so it's not an apples-to-apples comparison. But yeah, this is the same routine as back around the 1800X or 2700X, where someone would find the one outlier, in both reviews and software, to prove whatever statement they were arguing against was wrong.

Oddly enough, we have package power usage numbers on AnandTech right now.



Significant extra power usage for the 9900K over the 2700X and 3900X.
 

Ajay

Diamond Member
Jan 8, 2001
2006 Intel stumbled because a reasonable engineering gamble to leverage an overwhelming process advantage on a radical architecture did not pan out.

2020 Intel:
- Has a process disadvantage, and will likely never hold an advantage again
- Continues to stick to stale design practices and methods that were strengths when paired with a process lead, but are now liabilities
- Engaged in a four-year series of politicized layoffs which in turn drove out yet more engineering talent
- Still thinks it can rely on its name to attract talent at 30% less pay than all the other companies, and therefore fails to attract new talent for boots on the ground
- Is under attack on every market and is outclassed in most of them

My two cents after 16+ years in the industry... works out to 1/8 cent a year.
Thank you very much. That whole bit with middle-manager 'mentors' forming their own little fiefdoms within Intel persists even with upper management trying to break it up. Politics over merit must suck the life out of good engineers.
Non-competitive pay, wtfbbq; Intel has a shed-ton of money???
 

AtenRa

Lifer
Feb 2, 2009
Because AMD made a big fuss of using "7nm" for their chips, AMD fans followed that and suddenly everyone on PC forums became a semiconductor expert.

In fact until 2017 there weren't that many discussions about CPUs at all. Some PC review websites weren't even reviewing them that often (e.g. TPU).
Intel's 4 cores somehow worked for pretty much everyone. 1-2 years later - despite still gaming up to 4K, using the same OS and software - suddenly 4 cores became good only for basic office work and browsing the web. :)
You have to go back to 2012 and see how Intel marketing was making a BIG fuss over their 22nm 3D transistor process ;)

Intel's 22nm FinFET was second-coming-of-Jesus technology back then. They used every marketing technique available to promote their 22nm CPUs.


 

Zucker2k

Golden Member
Feb 15, 2006
Also, it's full-system power. We know that Intel chipsets and boards generally use less power, especially at idle, so it's not an apples-to-apples comparison. But yeah, this is the same routine as back around the 1800X or 2700X, where someone would find the one outlier, in both reviews and software, to prove whatever statement they were arguing against was wrong.

Oddly enough, we have package power usage numbers on AnandTech right now.



Significant extra power usage for the 9900K over the 2700X and 3900X.
This is getting a bit silly, when people think they have a "gotcha" and go bonkers over it. Even AMD HEDT CPUs stop at 180 watts package power. The 12-core Intel is at 140 watts. If you check my posting history back to when Zen+ was released, you'd realize we've gone over this debate before. I refer you to that period.

What in blazes are you going on about? This is a 10900K thread, and comparing power usage numbers on OLD CPUs (1 to 2 generations back) has no place here.
Nevermind.
 

DrMrLordX

Lifer
Apr 27, 2000
It's quite obvious from the link you posted that the 2700x is TDP restricted.
It's supposed to be TDP restricted. That is the whole point of the TDP rating: to make sure you know either (a) how much power it consumes and/or (b) how much heat your cooling solution should be able to handle at a minimum. Having a CPU that burns more power as you increase the capacity of the cooling solution is not always a good feature! Nearly every desktop CPU ever sold has functional TDP restrictions of one sort or another. Only a madman wants a CPU that overclocks itself according to rules mostly set by some motherboard OEM.
 

aigomorla

Cases and Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
They need to give us more PCIe lanes.

With NVMe drives now each taking 4 of those precious lanes, the mainstream platform is insufficient for extras, even for high-end gaming, as Unreal Engine 5 hints at NVMe being mandatory.
And games aren't getting any smaller in size either.

Let's honestly look at it: a 4TB NVMe drive costs roughly 1,000 dollars,
while getting a HEDT platform with extra PCIe lanes plus a bifurcation card to run 4x 1TB NVMe drives will probably cost you a bit less than that, and you still end up with extra PCIe lanes. (See what I mean?) What is the point of a non-HEDT platform for high-end gaming now?

And I don't think these CPUs are mostly targeted at just gaming, but it would be nice if they could pull double duty, which would require more add-in cards, and that translates to more PCIe lanes.

Otherwise these CPUs honestly seem pointless and useless outside gaming, and in a gaming PC you do not need MOAR CORES.
So you're better off going HEDT, which has lots of PCIe lanes, can double for both gaming and productivity, and supports a whole stack of add-in cards with extra NVMe drives.
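The rough comparison above can be laid out explicitly. Apart from the ~$1,000 4TB drive mentioned in the post, every price below is a hypothetical round number for illustration, not a real quote:

```python
# Ballpark cost comparison from the post above. All figures are
# hypothetical round numbers, not real market prices.
single_4tb_nvme = 1000        # one 4TB NVMe drive (post's figure)

hedt_premium   = 400          # assumed extra cost of HEDT CPU + board
bifurcation    = 50           # assumed 4x M.2 bifurcation riser
four_1tb_nvme  = 4 * 120      # assumed price of four 1TB NVMe drives

hedt_route = hedt_premium + bifurcation + four_1tb_nvme
print(hedt_route)             # 930: comparable money, far more lanes left over
```

Under those assumed prices the HEDT route lands slightly under the single large drive while leaving spare lanes, which is the post's point; with different prices the comparison obviously shifts.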
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
They need to give us more PCIe lanes.

With NVMe drives now each taking 4 of those precious lanes, the mainstream platform is insufficient for extras, even for high-end gaming, as Unreal Engine 5 hints at NVMe being mandatory.
And games aren't getting any smaller in size either.

Let's honestly look at it: a 4TB NVMe drive costs roughly 1,000 dollars,
while getting a HEDT platform with extra PCIe lanes plus a bifurcation card to run 4x 1TB NVMe drives will probably cost you a bit less than that, and you still end up with extra PCIe lanes. (See what I mean?) What is the point of a non-HEDT platform for high-end gaming now?

And I don't think these CPUs are mostly targeted at just gaming, but it would be nice if they could pull double duty, which would require more add-in cards, and that translates to more PCIe lanes.

Otherwise these CPUs honestly seem pointless and useless outside gaming, and in a gaming PC you do not need MOAR CORES.
So you're better off going HEDT, which has lots of PCIe lanes, can double for both gaming and productivity, and supports a whole stack of add-in cards with extra NVMe drives.
Intel needs more PCIe lanes. The point is, AMD has them, in desktop and a LOT more in HEDT. And 8 cores with AMD gets you within less than 5%, and that's Zen 2. Zen 3 should be the king.

THAT'S why I say this whole release is a joke.
 

RetroZombie

Senior member
Nov 5, 2019
Yes, they sat on up to 4 cores in the consumer lineup. But they worked on other things.
For example... new motherboards?

Blender Gooseberry (system power consumption):
Ryzen 7 2700X = 205 watts
Core i5 8400 = 117 watts
Blender Gooseberry (render time):
Ryzen 7 2700X = 2074 seconds
Core i5 8400 = 4086 seconds
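Worth noting: multiplying those figures together shows the faster chip also consumes less total energy per render, since energy is power times time:

```python
# Energy per render = average system power (W) x render time (s),
# using the figures quoted in the post above.
ryzen_joules = 205 * 2074   # Ryzen 7 2700X
intel_joules = 117 * 4086   # Core i5 8400

print(ryzen_joules)                  # 425170 J
print(intel_joules)                  # 478062 J
print(ryzen_joules < intel_joules)   # True: half the wall-clock time
                                     # AND less total energy
```

So the higher instantaneous draw of the 2700X is more than offset by finishing the job in roughly half the time.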

Your claim is pretty much AMD was lazy because they didn't want to go bankrupt (a problem Intel did not face).
And yes, Intel Tick-Tock was also only possible because multiple designs were worked on at the same time, in parallel.
And what designs have they delivered? According to piokos, Intel's CPU division did pretty much nothing.
 

lobz

Golden Member
Feb 10, 2017
4/4 versus 6/12 COULD make a difference between playable and unplayable in some games. But clearly his blanket statement was wrong.

I assume he's referencing 1% lows versus average as a barometer for perceived stuttering, but it doesn't bear out in many games. KitGuru's reviews show the 3300X has better average and 1% low FPS than the 3600 and 1600AF in some games. They also show that the 3600X's 1% lows are essentially the same as the 3900X's 1% lows (even better sometimes, depending on the game).

Obviously, below 6 threads is a tough place to be. But I feel like most of the reviews of 6+ thread processors show that 1% lows don't scale particularly well with core/thread count, and that there are other factors at play.

Right now the gaming sweet spot is probably 6-8 threads. The rest is clock and memory and ramp times and boost duration. And, importantly, GPU.

But I think as we see real PC-class CPUs in the consoles, we're going to see more use for more cores and threads, but it won't make a 6C/6T 9400 useless or unplayable, just less smooth.
Please don't put words in my mouth, thus fueling the raging fire this dude unleashes here. I didn't make a blanket statement at all. PLEASE read my original post.
 

amrnuke

Senior member
Apr 24, 2019
Please don't put words in my mouth, thus fueling the raging fire this dude unleashes here. I didn't make a blanket statement at all. PLEASE read my original post.
You said: "Going from 4 cores to more cores gives you... guess what... the difference between being playable and unplayable. You won't see a big difference in average fps, but a tremendous difference in minimum fps. If you really insist, we can do this all day - all night."

You didn't expound upon that, which is why I said "I assume" instead of actually putting words in your mouth, but whatever.

What is clear is that the 4 cores of the 3100 and 3300X seem mighty capable of keeping up with the extra cores of the 3600 in a large portion of games tested at KitGuru and other reputable sites.

So I'm wondering if you could explain what you meant by your statement. I mean, of course if the game won't run on 4 cores, it might run on 6 cores. But your statement seemed very much like a one-size-fits-all statement, not a special situation statement.
 

Topweasel

Diamond Member
Oct 19, 2000
I don't think he meant it as one-size-fits-all. But as early as late 2016 we were seeing games that had very visible performance issues at 4C/4T. During the review run on the 3300X, several sites tested it against the 3600X. And we are now seeing games where 4C/8T might not be enough. Right now averages might be close, but lows drop significantly.

A 3300X is great for showing what a couple of years of competition can do for pricing. But new games are also showing us what the years without competition did to game development. The new consoles will push that even further. Games are certainly starting to use more resources as those resources become normal on most hardware.

The end result is that even today you can see that difference mean playable versus unplayable (though that can be a bit subjective).
 

NobleX13

Member
Apr 7, 2020
I can't say that I am surprised by these power consumption numbers. Given the thermal characteristics of the 9th-gen parts this seems right in line for Comet Lake. I will also say that this does not bode well in my mind for Rocket Lake at all. Backporting Willow Cove to 14nm is not going to do Intel any favors for energy efficiency, either.
 

Topweasel

Diamond Member
Oct 19, 2000
Power consumption is not as bad as I thought, still bad though. And this is a classic paper launch, or at best very limited supply. I'm betting this is a super-binned chip to try and save face.
The trick is everyone is going waaaaaay out of their way in the reviews to get rid of any "specializations" the boards will do, even noting which boards they got defaulted to supposedly out-of-spec settings. I say supposedly because it's been suggested these settings are what Intel wants to see reviews with and wants the enthusiast boards to set.

So the result of the reviews is that power isn't so terrible except for the first 30 seconds of the test, where the CPU is hitting 200W+ (which is a little better than I thought).

This makes the power usage more manageable on regular coolers and keeps the CPU from slamming against its thermal limits. Their working theory being that 30 seconds at 200W won't thermally saturate a 120W-140W cooler.

But that again begs the question of what most people will see when building their systems, especially if they don't know what their motherboard of choice will set as defaults with this CPU.

Also, one of the reviewers, I think Hardware Canucks, brought up a good point about PSU choice: since for periods of time the power usage can nearly double the TDP rating, hiding such a significant max power draw can catch people off guard. For the most part this doesn't really matter; people aren't going to skimp on the PSU with this CPU, and chances are they were always going to get the $150 or $200 cooler. A very large portion of these buyers are going to know the BIOS settings and will be comfortable with all of the "non-spec" stuff turned on. But it's not everyone, and I'll go back to my issues with the 95W rating and 160W usage of the 9900K. Even a little bit with AMD and their 105W rating and ~142W usage (which can also run unlimited at this setting), even though they are much more clear about the power usage (65W means ~88W usage, 105W means ~142W usage). I know most of this is for OEMs, so they can still provide a CPU in spec and tweak settings to fit their cooling solutions. It's a bad practice that increases confusion, will lead to bad encounters, and is really disingenuous.
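The AMD figures in that aside follow a fixed ratio: on AM4, the default socket power limit (PPT) is 1.35x the advertised TDP, which is where the "65W means ~88W, 105W means ~142W" numbers come from:

```python
# AM4 rule of thumb: default socket power limit (PPT) = 1.35x TDP.
# This reproduces the figures quoted in the post above.
def am4_ppt(tdp_watts):
    return round(tdp_watts * 1.35)

for tdp in (65, 105):
    print(f"{tdp}W TDP -> {am4_ppt(tdp)}W PPT")
# 65W TDP -> 88W PPT
# 105W TDP -> 142W PPT
```

Intel's scheme has no such fixed ratio: PL2 and tau are recommendations that board vendors routinely override, which is exactly the confusion the post complains about.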
 

piokos

Senior member
Nov 2, 2018
For example... new motherboards?
For example... laptops became slim and light, with workday-long battery life. Much faster, too.
For example... in 2011, high-performance Xeons topped out at 10 cores, but by 2016/early 2017 (before EPYC launched) it was 24 cores.

Please, don't tell me you're only noticing what happens in the gaming desktop segment...
 

piokos

Senior member
Nov 2, 2018
Power consumption is not as bad as I thought, still bad though.
Higher than what dense TSMC N7 needs, but actually better than many expected.
A lot of people on this forum, including those who criticize Comet Lake the most, probably have cooling solutions easily capable of handling at least a 10700K.
And this is a classic paper launch or at best very limited supply I'm betting this is a super binned chip to try and save face.
These -K chips are by definition limited supply and super binned. Nothing changes.
And for now we only have those in retail, since OEMs have priority for the OEM SKUs.

That said, it's far from a "paper launch".
I'm writing from Poland, and all -K chips are available from the largest Intel partners.
It seems the 10700K is the easiest one to get. I could have one tomorrow morning. 10900K and 10600K: delivery in 1-3 days.

Prices are also in line with MSRP. At the moment the 10900K costs ~$50 more than the 9900K.
 

Rigg

Member
May 6, 2020
The trick is everyone is going waaaaaay out of their way in the reviews to get rid of any "specializations" the boards will do, even noting which boards they got defaulted to supposedly out-of-spec settings. I say supposedly because it's been suggested these settings are what Intel wants to see reviews with and wants the enthusiast boards to set.

So the result of the reviews is that power isn't so terrible except for the first 30 seconds of the test, where the CPU is hitting 200W+ (which is a little better than I thought).

This makes the power usage more manageable on regular coolers and keeps the CPU from slamming against its thermal limits. Their working theory being that 30 seconds at 200W won't thermally saturate a 120W-140W cooler.

But that again begs the question of what most people will see when building their systems, especially if they don't know what their motherboard of choice will set as defaults with this CPU.

Also, one of the reviewers, I think Hardware Canucks, brought up a good point about PSU choice: since for periods of time the power usage can nearly double the TDP rating, hiding such a significant max power draw can catch people off guard. For the most part this doesn't really matter; people aren't going to skimp on the PSU with this CPU, and chances are they were always going to get the $150 or $200 cooler. A very large portion of these buyers are going to know the BIOS settings and will be comfortable with all of the "non-spec" stuff turned on. But it's not everyone, and I'll go back to my issues with the 95W rating and 160W usage of the 9900K. Even a little bit with AMD and their 105W rating and ~142W usage (which can also run unlimited at this setting), even though they are much more clear about the power usage (65W means ~88W usage, 105W means ~142W usage). I know most of this is for OEMs, so they can still provide a CPU in spec and tweak settings to fit their cooling solutions. It's a bad practice that increases confusion, will lead to bad encounters, and is really disingenuous.
I noticed this too. I suspect Intel may have purposely asked for this to be a highlight in the reviewer guides they sent out with the review kits; I can't think of any other reason so many reviewers would have made it a point. The power limit variation by motherboard was a small controversy when 9900K reviews launched: LTT used a board that enforced the Intel-suggested power limits while everyone else didn't. Intel appears to have chosen an Asus board that follows the Intel power limit guidance for the review kits they sent out. Whether the reviewers highlighted this of their own accord or Intel asked them to, I think it's a good thing. The power limit games on motherboards can make benchmarks more of a motherboard-default-settings benchmark than a stock CPU benchmark. The fact that reviewers were pointing this out was refreshing, IMO.

My primary takeaway from the reviews was that overclocking (or running these CPUs without limits) looks idiotic. OC and MCE benchmarks gained almost no gaming benchmark advantage, and only a moderate compute benchmark advantage, despite a massive increase in heat and power usage. A 10900K at a 5.2 GHz 1.3 V OC pulled 28% more power than a stock 64-core 3990X. :oops: The 10900K at stock was in line with an overclocked 9600K or 1600AF in the same Gamers Nexus review.

I'm curious to see if the non-K SKUs can keep up thermally with these without the thinned ("sanded") dies. Running a non-K i7 or i9 (with power limits similar to the K's set in BIOS) might make an interesting gaming-rig combo with a good air cooler and a good H470 or entry-level Z490 board.
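The power-limit behavior the last few posts describe (full boost for a short window, then a fall back to the sustained limit) follows Intel's PL1/PL2/tau scheme. A simplified sketch using the 10900K's official recommended limits; note that real hardware tracks an exponentially weighted moving average of power rather than a fixed window, so this is only illustrative:

```python
# Simplified PL1/PL2/tau model of Intel turbo power limits.
# Real silicon uses an exponentially weighted moving average;
# this fixed-window version is just an illustration.
PL1, PL2, TAU = 125, 250, 56   # 10900K recommended limits (W, W, s)

def allowed_power(t_seconds):
    """Max package power permitted t seconds into a sustained load."""
    return PL2 if t_seconds < TAU else PL1

# First ~70 seconds of an all-core load, sampled every 14 s:
for t in range(0, 70, 14):
    print(t, allowed_power(t))
```

This is also why motherboard defaults matter so much in reviews: boards that set TAU effectively to infinity run at PL2 indefinitely, turning a 125W-rated part into a sustained 250W one.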
 

Rigg

Member
May 6, 2020
Adding extra cores will not help Intel with their supply shortage.
To be fair, the 3900X and 3950X took a month or so to become widely available after release. While not what I would consider a paper launch, they were a bit hard to find initially. The only significant Comet Lake stock available in the US right now seems to be the 10700K and 10400. In contrast, every Zen 2 SKU was available on launch day and (with the exception of the R9s being a bit harder to find at first) widely available ever since.
 
