Discussion (Comet Lake): Intel's new Core i9-10900K runs at over 90°C, even with liquid cooling (TweakTown)


IntelUser2000

Elite Member
Oct 14, 2003
400 Watts? Then how is it able to run 25% cooler than a 3900X, which only consumes up to 144 watts? Surely the laws of physics do apply to the 10900K?

Damn, people, 380W is not an average; it's a peak value achieved for a second or so.

the average CPU power consumption came in at 124 Watts... Right in line with Intel's marketing of a 125 Watt TDP rating. But at least according to the RAPL framework reporting based upon the Intel CPU metrics recorded, the actual peak power consumption of the CPU went as high as 380 Watts in extreme cases.

The Comet Lake chips with the soldered heatspreader are supposed to have an extra-thin die or something along those lines. That'll help with heat.
 

IntelUser2000

Elite Member
Oct 14, 2003
Whatever you say. Arguing with people like you is like talking to a brick wall.

Man, you have some nerve for being a mod. It's just a CPU. Cool it, buddy.

And you can clearly see from the AnandTech graph that it's a peak figure.

[Attachment: 10900K yC combo.png]

If AT is calling that 254W the power figure, I'd suggest they redo their graphs. The load average is about 210W, so all the CPUs would need to be redone.

Mod callout. You cannot bring up the fact that a member is a mod when he is posting as a member.


esquared
AnandTech Forum Director
 

IntelUser2000

Elite Member
Oct 14, 2003
And they are WRONG.

This is what they measured: [Attachment: 10900K yC combo.png]

This is what they put on their graph:
[Attachment: 116012.png]
That's a PEAK figure. The number on the graph should be the average over the load period, which is a little over 200W. The peak is 254W and the lows are about 175W.

Power figures are relevant for three purposes: 1) the power bill, 2) power supply sizing, and 3) thermals and matching the HSF.

All of which work on the average, not the peak.

No wonder the 10600K runs cool. The real average in Linpack is (130W - 60W)/2 + 60W = 95W. This makes me doubt all the other numbers.
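
To make the peak-vs-average point concrete, here's a minimal sketch (invented sample values, not AnandTech's or Phoronix's methodology) of how the same trace gives very different peak and average numbers:

```python
# Minimal sketch of the peak-vs-average distinction, assuming a package
# power trace that oscillates between a ~60 W floor and a ~130 W peak,
# as in the 10600K Linpack estimate above. Sample values are invented.
samples_w = [60, 95, 130, 95, 60, 95, 130, 95]  # hypothetical 1 Hz readings

peak_w = max(samples_w)
avg_w = sum(samples_w) / len(samples_w)

print(f"peak:    {peak_w} W")  # 130 W -- what a peak-based bar reports
print(f"average: {avg_w} W")   # 95 W -- what the bill, PSU sizing, and HSF see
```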
 

Hitman928

Diamond Member
Apr 15, 2012
It's not an average, but it's not the peak either. I guarantee you there were spikes above 254 W; they were most likely smoothed out for readability. As is true for most things, it's hard to boil this down to a single number; at least they have both graphs in the review.
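
As a toy illustration of that smoothing effect (invented samples, and assuming a simple boxcar filter, which may not be what AT's plotting actually applies):

```python
# Smoothing for readability clips short spikes, so the visible maximum
# understates the true instantaneous peak. Trace values are invented.
trace_w = [200, 210, 205, 310, 204, 208, 202, 295, 206, 203]

def moving_average(xs, n=3):
    """Boxcar (sliding-window mean) filter over the trace."""
    return [sum(xs[i:i + n]) / n for i in range(len(xs) - n + 1)]

print(max(trace_w))                         # 310 -- real instantaneous peak
print(round(max(moving_average(trace_w))))  # 242 -- peak after smoothing
```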
 

DrMrLordX

Lifer
Apr 27, 2000
@IntelUser2000

y-cruncher is a funny workload. Notice how the 254W peaks don't correspond to any change in frequency? That should set off some bells. I set my 3900X to 4.2 GHz static and ran the bench. It would bounce between 150-181W during the main run of the program. Prime95 Small FFTs blitzed the poor CPU using the same settings to the tune of 200W(!!!) pretty steadily (for the 2 minutes I allowed it to run, gah).
 

BigDaveX

Senior member
Jun 12, 2014
Power figures are relevant for three purposes: 1) the power bill, 2) power supply sizing, and 3) thermals and matching the HSF.

All of which work on the average, not the peak.
Any sensible person would spec their cooling and power supply to the peak, not the average. If your cooling and PSU aren't good enough to handle the regular, sustained peaks that we see on the graphs, then in the best case you'll cause the CPU to throttle; in the worst case you'll eventually blow the PSU and potentially take the entire system with it (and yes, I know the chances of that happening nowadays are pretty small with all the protections built into modern PSUs, but the point remains).
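
As a back-of-the-envelope sketch of what sizing to the sustained peak looks like (every number below is an assumption for illustration, not a measurement):

```python
# Rule-of-thumb PSU sizing against sustained peaks, with assumed loads.
cpu_peak_w = 254        # sustained package peak from the review graph
gpu_peak_w = 280        # assumed high-end GPU under load
rest_of_system_w = 60   # assumed mainboard, RAM, drives, fans
headroom = 1.25         # common rule-of-thumb margin for PSU sizing

required_w = (cpu_peak_w + gpu_peak_w + rest_of_system_w) * headroom
print(f"minimum sensible PSU: {required_w:.0f} W")  # ~742 W
```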
 

biostud

Lifer
Feb 27, 2003
So to sum it up:
1080p gaming with a 2080/2080 Ti: Intel win
Gaming at all other resolutions or with any other video card: tie (since the GPU is the limiting factor)
Performance/watt: AMD win
Performance/$: AMD win
Productivity: AMD win
Future-proof platform: AMD (PCIe 4.0 and support for Zen 3)
 

Hitman928

Diamond Member
Apr 15, 2012
So to sum it up:
1080p gaming with a 2080/2080 Ti: Intel win
Gaming at all other resolutions or with any other video card: tie (since the GPU is the limiting factor)
Performance/watt: AMD win
Performance/$: AMD win
Productivity: AMD win
Future-proof platform: AMD (PCIe 4.0 and support for Zen 3)

No, this is wrong. Intel wins 900p and 720p gaming as well :p.

On a more serious note, for productivity I wouldn't say it's a 100% clear-cut victory for AMD. Many productivity workflows rely on a mix of single/lightly-threaded loads and multi-threaded loads, so there are some cases where a 9900K or 10900K may make sense. But I agree that, generally speaking, for productivity AMD is the higher performer.
 

Zucker2k

Golden Member
Feb 15, 2006
Any sensible person would spec their cooling and power supply to the peak, not the average. If your cooling and PSU aren't good enough to handle the regular, sustained peaks that we see on the graphs, then in the best case you'll cause the CPU to throttle; in the worst case you'll eventually blow the PSU and potentially take the entire system with it (and yes, I know the chances of that happening nowadays are pretty small with all the protections built into modern PSUs, but the point remains).
If it's not sustained, you don't have an issue. It's not like the cooler is dissipating 380W continuously at the top of the IHS. It's a flash inside a core, and it spreads before it even hits the IHS. Of course, with the boosting algorithms of modern processors the way they are, it's advantageous to keep these chips well below max temps to ensure longer boosting.

Watch this, pay attention to the coolers and reported temps.
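
To put rough numbers on the flash-vs-sustained point, here's a toy lumped-RC thermal model; R and C are assumed values for illustration, not measured 10900K figures:

```python
# Toy lumped-RC thermal model: a brief 380 W flash barely moves the die
# temperature compared with holding 380 W continuously.
R = 0.25     # assumed die-to-ambient thermal resistance, K/W
C = 40.0     # assumed effective thermal capacitance, J/K
T_AMB = 25.0
DT = 0.01    # integration step, seconds

def die_temp_after(power_w, duration_s):
    """Forward-Euler integration of C * dT/dt = P - (T - T_AMB) / R."""
    temp = T_AMB
    for _ in range(int(duration_s / DT)):
        temp += DT * (power_w - (temp - T_AMB) / R) / C
    return temp

print(f"1 s flash at 380 W: {die_temp_after(380, 1):.0f} C")  # ~34 C
print(f"380 W held forever: {T_AMB + 380 * R:.0f} C")         # 120 C
```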
 

Ajay

Lifer
Jan 8, 2001
Watching this video right now. This whole power consumption, Tau, PL2, etc. business is confusing to me lol.

Thanks for posting this, I was just about to. Great video on cutting through the crap of various bogus testing conditions (some by mobo makers, some by testers, and probably encouraged by Intel in some cases).
PL2 for the 10900K is pretty damn high (250W) per Intel spec; that's on them. But Tau is only 56 seconds for PL2. Set your mainboard to an unlimited Tau and, presto, you have a toaster. That's perfectly fine for overclocking, so long as the review says they are overclocking.
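
For anyone still confused by PL1/PL2/Tau, here's a rough simulation of the mechanism as commonly described (turbo to PL2 is allowed while a running average of package power stays under PL1); this is simplified, not Intel's exact algorithm:

```python
# Simplified PL1/PL2/Tau model; the plain running average used here trips
# sooner than a literal 56-second window would.
PL1 = 125.0        # W, long-term limit (the 125 W TDP)
PL2 = 250.0        # W, short-term limit per Intel spec for the 10900K
TAU = 56.0         # s, averaging window
ALPHA = 1.0 / TAU  # per-second update weight

avg_w = 0.0        # running average of delivered power
for t in range(120):
    limit = PL2 if avg_w < PL1 else PL1  # boost only while budget remains
    power = limit  # an all-core load would draw PL2 if allowed
    avg_w += ALPHA * (power - avg_w)
    if limit == PL1:
        print(f"clamped to PL1 = {PL1:.0f} W at t = {t} s")  # t = 39 here
        break
# With an unlimited Tau the clamp never engages and the chip sits at PL2
# indefinitely: the toaster scenario.
```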

Lastly, all this baloney about 1080p gaming testing is useless to me; who the heck is gaming @ 1080p with a 2080 Ti (which all these reviewers use)? Pointless, IMHO. This sort of CPU testing made some sense 15 years ago; time for it to go.
If people want to pick AMD, fine (I did); if people want to pick Intel, fine. So long as people are happy with their purchase, great. If Intel's new CPUs create a situation where expectations are not met, they will pay for it (just as AMD did with boost clocks).
 

ozzy702

Golden Member
Nov 1, 2011
Thanks for posting this, I was just about to. Great video on cutting through the crap of various bogus testing conditions (some by mobo makers, some by testers, and probably encouraged by Intel in some cases).
PL2 for the 10900K is pretty damn high (250W) per Intel spec; that's on them. But Tau is only 56 seconds for PL2. Set your mainboard to an unlimited Tau and, presto, you have a toaster. That's perfectly fine for overclocking, so long as the review says they are overclocking.

Lastly, all this baloney about 1080p gaming testing is useless to me; who the heck is gaming @ 1080p with a 2080 Ti (which all these reviewers use)? Pointless, IMHO. This sort of CPU testing made some sense 15 years ago; time for it to go.
If people want to pick AMD, fine (I did); if people want to pick Intel, fine. So long as people are happy with their purchase, great. If Intel's new CPUs create a situation where expectations are not met, they will pay for it (just as AMD did with boost clocks).

1440p will become the new 1080p this fall. Seeing 1080p testing on a 2080 Ti right now is perfectly fine when looking at potential performance bottlenecks moving forward. Most people keep their main system in one piece for 2-3 GPU upgrades, so I see some value in it when building a new system.
 

Zucker2k

Golden Member
Feb 15, 2006
1440p will become the new 1080p this fall. Seeing 1080p testing on a 2080 Ti right now is perfectly fine when looking at potential performance bottlenecks moving forward. Most people keep their main system in one piece for 2-3 GPU upgrades, so I see some value in it when building a new system.
Isn't it funny how they say get more cores because of future games, but when it comes to today's games, they don't want to see how more cores will perform when paired with a powerful GPU, even in modern games? If your processor is leaving performance on the table today because the GPU is too powerful, then how's your GPU upgrade path looking against even more demanding games in the near future?
 

UsandThem

Elite Member
May 4, 2000
Thank you for making my case (the puny Intel cooler handles the 10400 no problem, while the Wraith Stealth struggles all the way to 95°C and throttledom).

Plus, the 3600 consumes 25% more power in games and is still slower than the 10400.
That picture, from a site I've never even heard of, has data that is pretty different from all the "established" websites people usually link to.

95°C from gaming? Sorry, not buying that.

https://www.gamersnexus.net/hwreviews/3489-amd-ryzen-5-3600-cpu-review-benchmarks-vs-intel

https://www.techpowerup.com/review/amd-ryzen-5-3600/19.html

[Attachment: cpu-temperature.png]
 

Zucker2k

Golden Member
Feb 15, 2006
Did you even bother to click on the image?

[Attachment 21518]
So 95°C with a 280mm cooler, and 95°C vs 77°C in CBR20 under similar conditions?

That picture, from a site I've never even heard of, has data that is pretty different from all the "established" websites people usually link to.

95°C from gaming? Sorry, not buying that.

https://www.gamersnexus.net/hwreviews/3489-amd-ryzen-5-3600-cpu-review-benchmarks-vs-intel

https://www.techpowerup.com/review/amd-ryzen-5-3600/19.html

[Attachment: cpu-temperature.png]
CBR20. It is surprisingly high, yes.
 

ozzy702

Golden Member
Nov 1, 2011
Isn't it funny how they say get more cores because of future games, but when it comes to today's games, they don't want to see how more cores will perform when paired with a powerful GPU, even in modern games? If your processor is leaving performance on the table today because the GPU is too powerful, then how's your GPU upgrade path looking against even more demanding games in the near future?

Yep. When I build a new system, I plan on at minimum one GPU upgrade down the road before building another system. I started my 9900K build when it was released with a 1080 Ti, then a 2080 Ti, and when the next-gen GPUs come out I'll pop one in there as well. If I had only cared about present gaming, a 1700X wouldn't have bottlenecked me much on the 1080 Ti, but a 3080 Ti would be seriously bottlenecked by it. I like to plan for the future, but I can understand if people are either cash-strapped or, on the other end of the spectrum, like to build completely new systems every 18 months.