Discussion: Comet Lake - Intel's new Core i9-10900K runs at over 90C, even with liquid cooling (TweakTown)


TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
Yes, yes it is, 1.1W of power draw is amazeballs, best CPU ever.


The problem is users are going to get burned: the same users that know little about overclocking and cooling are the ones tempted by motherboards that disable limits, use ridiculous stock voltage settings, and "safely" enable MCE by default.
But those are also the people who will never run an AVX-heavy stress test, for the very simple reason that they don't even know it's a thing.
They are going to run Battlefield or Fortnite and that's it, and the CPU is going to run cool and quiet even if it draws more power than necessary.
Your points on wattage are ridiculous. The average wattage for a 10900K is 254 if PL2 is used. The peak was 380 watts.
No, max draw is not average draw; the CPU is optimized to such a degree that it alternates between high and low extremely fast.
You would have to lock the CPU to PL2 indefinitely, which would be a pretty heavy OC, to get 250W, and even then it would not be the average because SpeedStep would still be doing its job.
It would be the power draw during testing or running heavy loads only.
380W is only possible with every restraint removed.
[Chart: 10900K power draw running y-cruncher]
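Roughly, the stock behaviour in question can be sketched like this (a toy model only; real chips track a moving-average power budget rather than a hard timer, and the 125W/250W/56s values are just the stock 10900K limits quoted in this thread):

```python
# Toy model of the stock PL1/PL2/Tau turbo behaviour (simplified: real chips
# track a moving average of package power, not a hard timer). The limits are
# the stock 10900K numbers quoted in this thread.
PL1_WATTS = 125    # sustained limit ("TDP")
PL2_WATTS = 250    # short-term turbo limit
TAU_SECONDS = 56   # how long PL2 may be sustained before dropping to PL1

def package_power_cap(seconds_into_heavy_load: float) -> int:
    """Power cap at a given point in one continuous all-core load."""
    return PL2_WATTS if seconds_into_heavy_load <= TAU_SECONDS else PL1_WATTS

# Bursty loads (games, desktop use) keep refilling the turbo budget, so they
# effectively live under PL2; only sustained all-core loads settle at PL1.
for t in (10, 55, 60, 300):
    print(f"{t:>3}s into an all-core load -> capped at {package_power_cap(t)} W")
```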


Also, for everybody having an issue with Phoronix: ComputerBase has an average too, and they only use software that heavily favors Ryzen; all their tests are 3D rendering, video transcoding, archiving, PhotoScan and y-cruncher.
Completely ignoring the gaming win.
On single-thread the 10900K is 8% ahead of the 3900X, while on multi-thread the 3900X is 10% ahead of the 10900K, a whopping 2% win for the 3900X while it has 20% more cores... so basically still an 18% loss compared to the ~23% of Phoronix.
 

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,654
136
Cinema 4D is not "widely used".
And yes, it's just as valid - for gamers - as Blender. Which is: not at all.
Okay, citation required. Are you saying not widely used like, let's say, Chrome, or not widely used like Premiere, Matlab, AutoCAD? Because if you are just being obtuse with the former you'd be right, but that isn't what was suggested. If it's the latter, what's your basis for that? Because I am pretty sure you're wrong.

That said, you are right. It's not valid for gamers. Except its single-core test can be a good IPC tester, which does have an impact in some games, and the MT test is a good way to gauge performance as power usage grows. It won't tell you right away who will win in Fortnite, but it isn't divorced from it either.
 

piokos

Senior member
Nov 2, 2018
554
206
86
Okay, citation required. Are you saying not widely used like, let's say, Chrome, or not widely used like Premiere, Matlab, AutoCAD? Because if you are just being obtuse with the former you'd be right, but that isn't what was suggested. If it's the latter, what's your basis for that? Because I am pretty sure you're wrong.
Definitely not as widely used as Premiere, Matlab or AutoCAD.

Doing a similar calculation as above, using AutoCAD revenue ($950M) and license price ($1700), we get around 560k users.
And I'm not going to dig into this any further, i.e. the fact that AutoCAD has a massive portfolio of (cheaper) volume licenses - especially in education.
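The back-of-envelope method here is just revenue divided by the list price; a quick sketch using the figures quoted in this thread (ballpark at best):

```python
# Rough user-count estimate: annual revenue / cheapest annual license price.
# Both inputs are the figures quoted in this thread, so treat results as ballpark.
def estimated_users(annual_revenue: float, license_price: float) -> int:
    return round(annual_revenue / license_price)

print(estimated_users(25_300_000, 720))     # Maxon / Cinema 4D: ~35k
print(estimated_users(950_000_000, 1_700))  # Autodesk / AutoCAD: ~559k
```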

Some people here like to undermine the importance of Photoshop benchmarks - especially when they show a particular brand of processors in a worse light.
I bet they wouldn't want me to make a similar estimate of Adobe clients...
That said, you are right. It's not valid for gamers. Except its single-core test can be a good IPC tester, which does have an impact in some games, and the MT test is a good way to gauge performance as power usage grows. It won't tell you right away who will win in Fortnite, but it isn't divorced from it either.
"Power usage"? :)

Seriously, suggesting vague relations like "it's a good IPC tester that has impact on games" leads nowhere.
Cinebench is a specialized benchmark, nothing more. Just like any specialized benchmark, it can be used to:
1) compare different CPUs in that exact type of load - which is the origin of Cinebench - a benchmark for (potential) Cinema 4D buyers,
2) compare similar CPUs to check how they relate to each other or how a particular architecture evolved over time.

Millions of gamers buy their CPUs based on how quickly they would render a scene in Cinema 4D.
Of course there is some correlation: a CPU that leads in Cinebench has a high probability of leading in Blender as well. But that's about it.

People who buy a CPU for gaming, browsing web, editing multimedia, engineering/scientific simulations, spreadsheets or databases should use benchmarks as similar to the target use case as possible - ideally built around the same software.
The idea to buy a gaming CPU based on a rendering benchmark is just wrong. I hope this doesn't need explanation.

But more importantly: some people criticize testing suites because they show an average result from different loads.
And I don't know if this is an intellectual issue or brand loyalty. It's really sad either way.
 

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,654
136
A financial report of the parent company shows that Maxon had a revenue of 25.3M EUR in 2017:

If we assume a similar revenue today and look at their cheapest annual license plan (720 EUR), we get a rough estimate of ~35k customers.
35k customers is a lot of customers. You are talking about a program that has a niche market to begin with. 35k customers for this type of product is a lot.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,542
14,496
136
Yes, yes it is, 1.1W of power draw is amazeballs, best CPU ever.



But those are also the people who will never run an AVX-heavy stress test, for the very simple reason that they don't even know it's a thing.
They are going to run Battlefield or Fortnite and that's it, and the CPU is going to run cool and quiet even if it draws more power than necessary.

No, max draw is not average draw; the CPU is optimized to such a degree that it alternates between high and low extremely fast.
You would have to lock the CPU to PL2 indefinitely, which would be a pretty heavy OC, to get 250W, and even then it would not be the average because SpeedStep would still be doing its job.
It would be the power draw during testing or running heavy loads only.
380W is only possible with every restraint removed.
[Chart: 10900K power draw running y-cruncher]


Also, for everybody having an issue with Phoronix: ComputerBase has an average too, and they only use software that heavily favors Ryzen; all their tests are 3D rendering, video transcoding, archiving, PhotoScan and y-cruncher.
Completely ignoring the gaming win.
On single-thread the 10900K is 8% ahead of the 3900X, while on multi-thread the 3900X is 10% ahead of the 10900K, a whopping 2% win for the 3900X while it has 20% more cores... so basically still an 18% loss compared to the ~23% of Phoronix.
So, taking your side for a moment... if your application finishes its job in 56 seconds, it will only draw 200 watts (or so) compared to the AMD chip at 142 watts. But if it goes longer than 56 seconds, your CPU downclocks to save power and all benchmarks and productivity go out the door.

And your last sentence: you are so focused on anything that makes Intel look good. Intel is a little better at gaming, we know that. The world does not revolve around gaming. And as for the MT side, who cares about 10 cores vs 12? The 10900K beats the less expensive, lower-package-power CPU in the 56-second category and gets whomped on long-range performance.

You just can't do anything to prove your point, you keep proving mine.
 

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,654
136
Definitely not as widely used as Premiere, Matlab or AutoCAD.

Doing a similar calculation as above, using AutoCAD revenue ($950M) and license price ($1700), we get around 560k users.
And I'm not going to dig into this any further, i.e. the fact that AutoCAD has a massive portfolio of (cheaper) volume licenses - especially in education.
Yeah, but look at the size of the market and where they sit. What competes with it? What share of the pie does it hold? 35k customers per year is still a lot of customers.
Some people here like to undermine the importance of Photoshop benchmarks - especially when they show a particular brand of processors in a worse light.
They wouldn't want me to make a similar estimate of Adobe users...
That's pointed at me. I don't undermine Photoshop benchmarks. I undermine a single company's package of self-chosen tasks in an all-in-one benchmark with its own weighting of value. I prefer benchmarks that test functions on their own and let people decide, based on those tests, how it impacts their workflows. Certain things like exports get devalued in those because they used to be an overnight task. It took 2 hours, so you did it at the end of the day. Now that it's a 27 vs. 30 minute task, is that still the case? Or would it be nice to minimize downtime and be able to review the export the same day? You don't get to choose that with the pre-packaged Puget test.
That said, you are right. It's not valid for gamers.
It's not a 1-to-1. But that's only because, even at heavy CPU loads, FPS is still heavily tied to how fast the CPU can feed frames, which will always favor clocks over raw processing power. And that's because that's about all a Jaguar processor can do. By 2022, 2023 at the latest, we will be looking at CPU performance in gaming in a whole different light. But that is as a whole; we will see some of it much sooner.
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
A financial report of the parent company shows that Maxon had a revenue of 25.3M EUR in 2017:

If we assume a similar revenue today and look at their cheapest annual license plan (720 EUR), we get a rough estimate of ~35k customers.
It's estimated that 12,000+ companies use Cinema 4D.

They also offer free 18 month licenses to educators and students.

The cost of R18 back in 2017 was $1000 to $3700, with annual pricing running from $250/yr to $650/yr, and perpetual (full purchase, no subscription) R17 -> R18 upgrade prices of $400 to $1000 depending on specific release.

Go ahead and try to sort out customer base #s based on revenue in 2017.
 

ehume

Golden Member
Nov 6, 2009
1,511
73
91
I wouldn't assume it's going to turbo much higher than a 9900K without some really expensive cooling.

This isn't about temperature, and we need to stop thinking about it as hot vs. not hot. It's about the cooling requirements and what temp it will run at, at that spot. Now, in regard to the 10900K, look at the power usage of, let's say, the 3900X or 3950X. They don't actually use max power at max load. They both top out at about 8-core (16-thread) full usage, and that's because of limits other than just temp. That's where core usage, core speed, and power usage hit their max, and things start leveling off after that.

Anyway, as the cores try to clock higher when fewer of them are used, I suspect it will be much the same unless MCE is turned on. So at default settings, at 6-core usage, we could be looking at max power usage. That max usage is looking north of 200W, especially if you are able to use TVB. But the problem with TVB is that using it is basically going to hotspot an area, making TVB limited on that point. And to even hit that, the CPU has to be running rather cool. That's where the problem comes in. If it's thermal throttling with a 280 AIO, will a 360 work? Will we need custom loops? I think we can mostly agree that a D15 or H100i is on the extreme end of coolers that we can "recommend" for desktop chips, as for 99% of the CPUs out there it's way overkill. But I think there is a point where, if this chip starts requiring $150-200 cooling or more to run in spec without slamming up against the thermal limits of the chip, there is a problem, or at least it can be mocked for being "hot".

In the past we have gotten reviews where the reviewers went straight to AIOs, because Intel doesn't ship coolers with their CPUs, without accounting for that when comparing to other CPUs. With the 8700K it was, as was brought up, probably a wash. Less so on the 9900K. But I think, especially with this one, it will be important for reviewers to figure out what "adequate" cooling looks like comparatively.
Back when I compared heatsinks, I ran an 8700K at 5 GHz (LinX 0.6.5 = Linpack + AVX2; no reduction for AVX), all 6 cores. Top air cooling held Tcore averages to the 80s, with only spikes into the 90s (a spike to 100C disqualified that test). Was that a wash in gaming?
 

Topweasel

Diamond Member
Oct 19, 2000
5,436
1,654
136
Back when I compared heatsinks, I ran an 8700K at 5 GHz (LinX 0.6.5 = Linpack + AVX2; no reduction for AVX), all 6 cores. Top air cooling held Tcore averages to the 80s, with only spikes into the 90s (a spike to 100C disqualified that test). Was that a wash in gaming?

That's not what I was getting at. With the 7700K, Intel stopped reporting what power usage was during different loads. But they didn't do this for the 7700K to hide anything. It was to set the table (like denouncing benchmarks as a solution for evaluating CPUs) for future CPUs. No one really noticed much more power usage overclocked than a 6700. An 8700, even with MCE enabled, probably used upwards of 110W to 120W (Tom's has it at 116W). I didn't look that up; I just didn't figure it would be too much more than a 7700K. 15-25W over TDP at max power usage isn't great, and probably a bit more than a wash (thinking 10-15W). Someone getting a cooler rated for 95W isn't going to get dramatically different performance than what it's rated for. The 9900K was a big change: MCE started getting turned on, unlocked PL2, no Tau. People were seeing 160W-180W with default board settings, with a strong suggestion that Intel wanted these on; their communication with reviewers suggested it's intended to run with these settings. So, big difference. And now we are looking at 190W-200W and spikes up to 250W, on a CPU rated for a 125W TDP.

Now Intel is pushing for stricter rules to follow (so their CPU is closer to spec) and did a ton of work to make sure the CPU could shed heat quicker, because it shoots up so high. But only one board manufacturer is listening. The rest are using defaults that look closer to the general 9900K defaults. We are leaving the realm of merely possible cooling-problem scenarios unless people get a right-sized cooler for 200W, and with such a large difference between rated and actual power you are also going to run into power issues if you are not leaving yourself enough headroom.
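To make the contrast concrete, the "Intel spec" versus typical enthusiast-board defaults being described might look something like this (illustrative values drawn from this thread, not a dump of any particular BIOS):

```python
# Illustrative power-limit configurations, using figures discussed in this thread.
# Actual BIOS defaults vary by vendor and board; treat this purely as a sketch.
intel_spec_10900k = {
    "PL1_watts": 125,      # rated TDP, sustained limit
    "PL2_watts": 250,      # short-term turbo limit
    "tau_seconds": 56,     # turbo window before falling back to PL1
    "mce_enabled": False,  # multi-core enhancement off
}
typical_board_default = {
    "PL1_watts": None,     # effectively unlimited
    "PL2_watts": None,     # effectively unlimited
    "tau_seconds": None,   # no time limit (Tau ignored)
    "mce_enabled": True,   # all-core turbo pushed to the top bin
}
```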
 

Oneto

Member
Apr 2, 2017
65
72
91
High heat usually means more fan noise, and that's a nuisance to a lot of people. I don't mind hearing the fans running while I'm running Folding@home, but any other time I want my PC quiet.
 

reqq

Member
Feb 26, 2020
31
35
61
Sigh, then stop spreading BS. At "stock", which almost no motherboard adheres to, it downclocks on any all-core load to 125W (4.2-4.3 GHz) after the PL2 time (56s) expires:


And at stock settings no 105W Ryzen CPU goes over 144W, ever. That includes the 3950X.


Err... so you agree with me. My numbers are whole-system load from the wall during a full all-core load. Your numbers are the same, but just the CPU ;)
 

coercitiv

Diamond Member
Jan 24, 2014
6,187
11,858
136
10900K is hot! 10900K is cold! 30 chips tested by der8auer under the same conditions: voltage, cooling, motherboard. Results indicate up to a 20% difference in power usage at the same Vcore, and up to a 13C temperature delta, depending on package characteristics.

Maybe it's time to stop quarreling and start learning. Very interesting insights.

[Chart: 10900K temperatures and power across the 30 tested chips]
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,542
14,496
136
10900K is hot! 10900K is cold! 30 chips tested by der8auer under the same conditions: voltage, cooling, motherboard. Results indicate up to a 20% difference in power usage at the same Vcore, and up to a 13C temperature delta, depending on package characteristics.

Maybe it's time to stop quarreling and start learning. Very interesting insights.

232 watts is low? WOW, still a power hog. And 79C is cool? I only see that on my 3900X when it's 85F in the house.

They are still hot and power hungry. Just some are hotter and more power hungry than others.
 

coercitiv

Diamond Member
Jan 24, 2014
6,187
11,858
136
232 watts is low? WOW, still a power hog. And 79C is cool? I only see that on my 3800X when it's 85F in the house.
At least make the effort to consider the context: 5.1GHz all-core, 4266MHz RAM, peak power not average, and Intel's temperature-reporting methodology versus AMD's.

I wrote a post to highlight just how big the variance in power and temps can be on the same SKU, and the first post after mine was the same old reaction about Intel vs. AMD. I guess it's not time to stop quarreling after all.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,542
14,496
136
What cooler does he have on that? No captioning that I could get working. And does it stay at 5.1 for more than 56 seconds? All cores? Overclocked? Water-cooling? Too many questions; I need answers.

P.S. I fixed the typo above; it's a 3900X that runs at 70C with ambient at 77F, 4 GHz, all cores at 100% load 24/7.
 

piokos

Senior member
Nov 2, 2018
554
206
86
P.S. I fixed the typo above; it's a 3900X that runs at 70C with ambient at 77F, 4 GHz, all cores at 100% load 24/7.
Some time ago I asked if it would be possible for you to write units properly (well, it clearly is).
But the idea of putting temperatures in different units in the same sentence is just bizarre.

And yes, your 3900X is very hot. Live with it.