Intel Comet Lake Thread


RetroZombie

Senior member
Nov 5, 2019
464
386
96
I meant the 3000 series, though the point remains valid. The 9900k, overclocked, is faster than any Ryzen CPU released to date in games. This includes the 16-core model.
Just immediate returns; long term, no, and it will only get worse.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
there are two keys: the cooler and case airflow

I had an 1800X @ 4.0 GHz in a Thor v2 case with massive airflow, cooled by an NH-D15S with two Noctua NF-A14 industrialPPC 3000 RPM fans pushing ~159 CFM each PLUS the stock fan in a tri-fan configuration. Software read power draw as 210W in peak power consumption situations, and my layman's observations of wall power draw seemed to confirm that the chip was indeed pulling that much power. Temps were barely under control. 200W+ is not what you want to cool with an HSF. I even used Conductonaut. The noise from those fans was insane. Nobody wants to run that setup, except for crazies like me.

the heat flow density of Ryzen 3K is higher than that of 14nm Skylake

So what? Heat is heat. You can have a simple resistor plate evenly dissipating heat like the old Frostytech test setup. Once your heatpipes are saturated, you're dead.

but I don't know why we are talking about Blender; if I need an MT powerhouse, I already have one, and neither the 3900X nor the 10900K is one of them

Because someone, somewhere is going to run Blender on a 10900k. If you can't run a program like that on a 10900k, then why are we even talking about the chip at all? It's going to be expected to do everything a 9900k can do, just better in MT situations since it has two more cores. It's not just a gaming chip. In fact, it's arguably not a gaming chip at all except for seriously ST-restricted stuff like some emulators, older flight sims, uh . . . StarCraft II? The niche stuff that really needs heavy ST power, sometimes only one thread. If it can sustain 5.3 GHz by default then it'll be great for the people who need that kind of power, and it'll be the go-to for those users.

I am asking how it's right

Because people like Der8auer used electrical testing equipment, probed AM4 systems, and found that AMD was telling the truth about the peak package power draw of 142W on 105W TDP Matisse chips.

But I don't like double standards.
That's about it.

Nobody is letting that other company off the hook. Take a look at the multiple nagging threads started just because some other company's CPUs don't do something right like hit precise advertised boost clocks or . . . whatever. The knives are out, and they're looking for weaknesses just to shift the narrative. Kind of sad really.

In any case, Intel never told us a 9900k would be a 160W+ chip. They won't tell you that the 10900k will chew up more than 125W . . . at least not in so many words. We'll have to find out for ourselves. Keep your eyes open, if this chip really interests you.
 

Gideon

Golden Member
Nov 27, 2007
1,608
3,572
136
Because people like Der8auer used electrical testing equipment, probed AM4 systems, and found that AMD was telling the truth about the peak package power draw of 142W on 105W TDP Matisse chips.
Yeah, AMD's peak power is exactly 1.35 × TDP. For instance, my 3700X never goes above 88W (1.35 × 65W) even in brutal MT workloads. At that wattage it clocks at around 4.0-4.1 GHz in heavy MT benchmarks while being almost neck and neck with the 9900K in the results.

AMD's Robert Hallock even mentions that you can use PBO to set your CPU to any cTDP by changing the PPT variable in the BIOS (which for the 3700X is 88W).
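In code, that rule of thumb looks like this (a minimal sketch; the 1.35 factor and the example chips are the ones mentioned above):

```python
# AM4 rule of thumb from the posts above: PPT (peak package power)
# is ~1.35x the advertised TDP at stock settings.
def ppt_from_tdp(tdp_watts: float, factor: float = 1.35) -> float:
    """Peak package power for a given TDP, assuming the stock 1.35x ratio."""
    return tdp_watts * factor

for name, tdp in [("Ryzen 7 3700X", 65), ("Ryzen 9 3900X", 105)]:
    print(f"{name}: TDP {tdp} W -> PPT {ppt_from_tdp(tdp):.0f} W")
# Ryzen 7 3700X: TDP 65 W -> PPT 88 W
# Ryzen 9 3900X: TDP 105 W -> PPT 142 W
```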
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
I had an 1800X @ 4.0 GHz in a Thor v2 case with massive airflow, cooled by an NH-D15S with two Noctua NF-A14 industrialPPC 3000 RPM fans pushing ~159 CFM each PLUS the stock fan in a tri-fan configuration. Software read power draw as 210W in peak power consumption situations, and my layman's observations of wall power draw seemed to confirm that the chip was indeed pulling that much power. Temps were barely under control. 200W+ is not what you want to cool with an HSF. I even used Conductonaut. The noise from those fans was insane. Nobody wants to run that setup, except for crazies like me.
well, I definitely agree that a 10900K under full AVX2 load on air won't be the most silent setup
But try to calculate the heat flow density:
the 1800X's surface area is about half that of the 10900K without the iGPU (if I am not mistaken), so at the 200W+ it will definitely pull, the 10900K's heat flow density is about half

So what? Heat is heat. You can have a simple resistor plate evenly dissipating heat like the old Frostytech test setup. Once your heatpipes are saturated, you're dead.
your heatpipes only get saturated when you don't remove the excess heat
if you can remove the heat from a 2080 Ti, you can from a 10900K

Because someone, somewhere is going to run Blender on a 10900k. If you can't run a program like that on a 10900k, then why are we even talking about the chip at all? It's going to be expected to do everything a 9900k can do, just better in MT situations since it has two more cores. It's not just a gaming chip. In fact, it's arguably not a gaming chip at all except for seriously ST-restricted stuff like some emulators, older flight sims, uh . . . StarCraft II? The niche stuff that really needs heavy ST power, sometimes only one thread. If it can sustain 5.3 GHz by default then it'll be great for the people who need that kind of power, and it'll be the go-to for those users.
I see it as a gaming chip
what else? everything else is done better by the current AMD lineup

Because people like Der8auer used electrical testing equipment, probed AM4 systems, and found that AMD was telling the truth about the peak package power draw of 142W on 105W TDP Matisse chips.
you still don't get it; I am not saying the CPU itself consumes more, I am saying that my 3900X system's wall power is about 15-20W higher than expected and calculated
the same shows up on the web
I will try to find out where it goes and create a new thread, investigation incoming

In any case, Intel never told us a 9900k would be a 160W+ chip. They won't tell you that the 10900k will chew up more than 125W . . . at least not in so many words. We'll have to find out for ourselves. Keep your eyes open, if this chip really interests you.

well, it is the same with Intel and AMD now for me
unless you know that Intel's TDP is rated at base clock. It is conservative: 3.6GHz is the base clock of the 9900K and its TDP is 95W, yet the web shows Handbrake running at 95W at 4.1GHz, so Intel must be assuming some hefty workload for the rating
AMD's 105W doesn't automatically tell me that 142W is the peak sustained power
with either Intel or AMD, you must have technical knowledge to understand what the number means
IMO there should be a standard; right now the whole TDP thing is a big fat mess (a simplified sketch of Intel's scheme below)
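For illustration, here is a very simplified model of Intel's scheme (the PL2 and tau values are my assumptions, not Intel's published numbers, and the real mechanism uses a moving average of power rather than a hard cutoff):

```python
# Simplified Intel-style power limiting: PL1 = TDP is the sustained limit,
# PL2 is the short-term turbo limit, tau is the turbo budget window.
# PL2 and tau here are illustrative assumptions, not official figures.
PL1, PL2, TAU = 95.0, 210.0, 28.0  # watts, watts, seconds

def allowed_package_power(t_seconds: float) -> float:
    """Allowed draw t seconds into a heavy load: PL2 until the turbo
    budget expires, then the chip clamps down toward base clock at PL1."""
    return PL2 if t_seconds < TAU else PL1

for t in (1, 10, 28, 120):
    print(f"t={t:>3}s -> {allowed_package_power(t):.0f} W")
```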
what I am really interested in is whether the 10900K with the new 3080 Ti will widen the gap over the 3900X or not; my bet is yes, but we will see
however, it is Intel's last chance; without delivering something new, it will only be a sidegrade from the 3900X
 

jpiniero

Lifer
Oct 1, 2010
14,510
5,159
136

Intel might be giving the i7 TVB to 5.3 GHz too now.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
your heatpipes only get saturated when you don't remove the excess heat

Wrong.

Heatpipes have maximum effective heat flux based on physical dimensions:

[chart: maximum heat transport vs. heatpipe dimensions]

Science!

You cannot push enough air through the fins to make the attached heatpipes perform any better beyond a certain point. I was pushing over 300 CFM through the fins of an NH-D15. That is insane. There are also limits on how many heatpipes you can cram into the base of an HSF and still have them pick up heat effectively.
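To put rough numbers on it (the per-pipe wattages below are my own ballpark assumptions for sintered copper pipes, not figures from the chart):

```python
# Ballpark transport limits per heatpipe; illustrative assumptions only.
QMAX_PER_PIPE_W = {6: 35.0, 8: 60.0}  # pipe diameter (mm) -> approx. watts

def pipe_transport_limit(n_pipes: int, diameter_mm: int) -> float:
    """Upper bound on heat the pipes can move, no matter the airflow."""
    return n_pipes * QMAX_PER_PIPE_W[diameter_mm]

# An NH-D15-class tower has six 6 mm pipes:
print(pipe_transport_limit(6, 6))  # ~210 W: more CFM can't raise this ceiling
```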

I see it as a gaming chip
what else? everything else is done better by the current AMD lineup

We'll see how it shakes out, but right now it's looking like a 9900k with two extra cores and an interesting ST boost feature that won't work in a lot of games that are multithreaded.

you still don't get it; I am not saying the CPU itself consumes more, I am saying that my 3900X system's wall power is about 15-20W higher than expected and calculated

If it's system power draw rather than package power draw, it's completely irrelevant to the discussion at hand, since at that point you're observing an issue with the platform rather than the CPU itself. A 9900k doesn't chew up ~65W extra through VRMs or a chipset. It's all dissipated by the CPU package, meaning your heatsink/AiO/custom water has to deal with it somehow.

well, it is the same with Intel and AMD now for me

If you don't want to pay attention to some other company telling you exactly what the package power for their CPUs will be at default, then so be it. I'll remind you again . . . Intel didn't send out any of their engineers on Twitter to tell us "Psst, the 9900k is really a 160-165W CPU, don't believe those reviews with 200W+ power consumption, those are buggy UEFI revisions that will get patched out after reviews are finished". Which really would have saved us a lot of time and trouble back when the 9900k first hit the streets. It took AT redoing their power testing:

[link: AnandTech's revised 9900K power testing]

to confirm what some users had already figured out: most Z390 systems (in particular) were not pulling over 200W like the first AT review showed in its power consumption testing. Tom's still has a power consumption test up that shows a UEFI revision allowing ridiculous power draw from the 9900k:

[link: Tom's Hardware 9900K power consumption test]

204W? Yow. But most 9900ks just don't do that in the real world. Not that you would know from reading that review. Intel definitely could have cleared some things up by saying, "hey, that's not really how MCE implementations should look on commercially-available boards". Or something.
 
  • Like
Reactions: IEC

ondma

Platinum Member
Mar 18, 2018
2,718
1,278
136
In any case, Intel never told us a 9900k would be a 160W+ chip. They won't tell you that the 10900k will chew up more than 125W . . . at least not in so many words. We'll have to find out for ourselves. Keep your eyes open, if this chip really interests you.
How long, and how many times, has it been reiterated in these threads that TDP is *NOT* power consumption??
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
How long, and how many times, has it been reiterated in these threads that TDP is *NOT* power consumption??

How long and how many times will it be reiterated that the distinction is pointless? The TDP number has no value to the consumer. It doesn't tell you how much power the chip consumes. It doesn't tell you what cooling solution you should buy for the CPU (since it doesn't ship with one, at least in the case of the 9900k; presumably, the 10900k will be the same way).
 
  • Like
Reactions: IEC

ondma

Platinum Member
Mar 18, 2018
2,718
1,278
136
How long and how many times will it be reiterated that the distinction is pointless? The TDP number has no value to the consumer. It doesn't tell you how much power the chip consumes. It doesn't tell you what cooling solution you should buy for the CPU (since it doesn't ship with one, at least in the case of the 9900k; presumably, the 10900k will be the same way).
Exactly. Then why all the bashing and hand-wringing because Intel CPUs exceed the TDP?
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136

Intel might be giving the i7 TVB to 5.3 GHz too now.
All this does is just one thing: it makes every future 10nm and 7nm Intel CPU look even worse.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
Exactly. Then why all the bashing and hand-wringing because Intel CPUs exceed the TDP?
Because they don't just exceed the TDP. They exceed the TDP by a ridiculous amount.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
How does this make 7nm CPUs look bad??
Because clock speeds will be nowhere near the current levels. The relative performance increase will be less than what people with less knowledge of what's happened will count on, after so many years without a node shrink.
I don't mean it in a bad way or with ill wishes toward Intel; what they've achieved with 14nm is actually very neat.
 

geegee83

Junior Member
Jul 5, 2006
23
13
66
Because clock speeds will be nowhere near the current levels. The relative performance increase will be less than what people with less knowledge of what's happened will count on, after so many years without a node shrink.
I don't mean it in a bad way or with ill wishes toward Intel; what they've achieved with 14nm is actually very neat.

What frequencies should we expect in 10nm/7nm?

Is there some kind of hard limit coming?
 
  • Love
Reactions: Zucker2k

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
Because clock speeds will be nowhere near the current levels. The relative performance increase will be less than what people with less knowledge of what's happened will count on, after so many years without a node shrink.
I don't mean it in a bad way or with ill wishes toward Intel; what they've achieved with 14nm is actually very neat.
What a ridiculous statement to make. Intel should sit on performance now in order to make their future chips look better. That's just silly!
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
Wrong.

Heatpipes have maximum effective heat flux based on physical dimensions:

[chart: maximum heat transport vs. heatpipe dimensions]

Science!

You cannot push enough air through the fins to make the attached heatpipes perform any better beyond a certain point. I was pushing over 300 CFM through the fins of an NH-D15. That is insane. There are also limits on how many heatpipes you can cram into the base of an HSF and still have them pick up heat effectively.
you still look at it from the CPU's perspective; I look at it from the bottleneck perspective, since we are talking about sustained MT AVX2 load, which is exactly what the 10900K is not designed for
so let's make a calculation, some basic stuff
the NH-D15 has a max airflow of 140 m³/h, which at normal relative humidity (50%) is a mass flow of around 170 kg/h
removing 200W (J/s) of heat for 1 hour equals 720,000 J/h, or 720 kJ/h
so I add 720 kJ to 170 kg of air, which means 4.3 kJ/kg, and that means the steady-flow temperature rises from a 22°C ambient intake to around 26°C at the outtake, without VRM, chipset, etc.
even adding them, you are right that the extreme amount of airflow has to be in balance with the case airflow, which needs to be higher than the airflow through the CPU cooler
that amount of airflow is not the heat-removal bottleneck
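the same arithmetic as a quick script (assuming air's specific heat of ~1.005 kJ/kg·K):

```python
# Back-of-the-envelope: temperature rise of the air carrying 200 W away.
CP_AIR = 1.005                  # specific heat of air, kJ/(kg*K)
mass_flow_kg_h = 170.0          # from ~140 m^3/h through the NH-D15
heat_w = 200.0                  # CPU heat to remove, J/s

heat_kj_h = heat_w * 3600 / 1000                    # 720 kJ/h
delta_t = heat_kj_h / (mass_flow_kg_h * CP_AIR)     # ~4.2 K
print(f"exhaust air: {22.0 + delta_t:.1f} C")       # 22 C intake -> ~26 C out
```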
heat flow:
  1. silicon to the heat spreader
  2. heat spreader to thermal compound
  3. thermal compound to the cooler base
  4. cooler base to the heatpipes
  5. heatpipes to the cooler fins
  6. cooler fins to the flowing air
  7. air flow out of the case - this is often underperforming
the bottleneck is points 1 to 4, unless you want to pull something like 300W out of the CPU alone, where no air can help you
if you look at the Ryzens: the 2700X's Wraith Prism can handle its 105W TDP at pretty much the same power as my 3900X draws, while it can't handle the 3900X
you will hit the temperature limit much sooner than the cooler's limit
with the same heat flow, the temperature has to rise into the low 90s to increase the temperature difference and thus compensate for the low surface area of Ryzen 3K, to achieve the same 142W of heat flow
the NH-D15 has exactly zero problems removing 200W as an absolute value when the rest of the environment supports it
so the TDP rating from our beloved CPU makers has exactly zero value as an exact number
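as a toy model of that temperature compensation (every thermal resistance below is an assumption for illustration, not a measurement):

```python
# T_die = T_ambient + Q * (sum of junction-to-ambient thermal resistances).
def die_temp(q_w, r_die_ihs, r_tim, r_cooler, t_ambient=22.0):
    """Die temperature for heat flow q_w through a resistance stack (K/W)."""
    return t_ambient + q_w * (r_die_ihs + r_tim + r_cooler)

# Same 142 W and the same cooler; a small 7nm chiplet concentrates the heat,
# so its die-to-IHS resistance is higher and the die must run hotter:
print(die_temp(142, 0.30, 0.05, 0.15))  # ~93 C: chiplet, the "low 90s"
print(die_temp(142, 0.15, 0.05, 0.15))  # ~72 C: larger monolithic die
```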
We'll see how it shakes out, but right now it's looking like a 9900k with two extra cores and an interesting ST boost feature that won't work in a lot of games that are multithreaded.
I guess it will; we will see with benches, and especially with the 3080 Ti

If it's system power draw rather than package power draw, it's completely irrelevant to the discussion at hand,
oh no, it is the only relevant one
if you care about power, then you care about the system power
it is like with a car: having a super efficient engine doesn't equal low fuel consumption, there are other components
so I am observing higher power consumption from my 3900X system than I expect, that is it

If you don't want to pay attention to some other company telling you exactly what the package power for their CPUs will be at default, then so be it.

you are still looking at Intel's absolute value; that is not what Intel means by their TDP
IMO Intel is solving the legal thing
you would be surprised what customers can claim as bad
Intel IMO made the decision to rate everything at base clock under a brutal load (the 95W 9900K has a 3.6GHz base clock while Handbrake runs at 4.1GHz), so pretty much nobody can claim that they don't fulfill their promise
and then, if you want more, do whatever your board/cooling/whatever can handle

but that is not the point
the 10900K can definitely use 200+W, but with a workload it is not designed for
if I buy a Porsche 911, I don't complain that it burns 40 l/100km while pulling a truck
the 10900K is a muscle-car sprinter, not a worker, the same as the 9900K
if I need a truck, I already have one, so the big fat fanboy screaming messages just show their incompetence
any CPU in history clocked and volted to its limit showed the best content-consuming performance but the worst content-creating perf/watt
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
What a ridiculous statement to make. Intel should sit on performance now in order to make their future chips look better. That's just silly!
I never said anything like that. Can you even read?
 

Ajay

Lifer
Jan 8, 2001
15,332
7,792
136
Because clock speeds will be nowhere near the current levels. The relative performance increase will be less than what people with less knowledge of what's happened will count on, after so many years without a node shrink.
I don't mean it in a bad way or with ill wishes toward Intel; what they've achieved with 14nm is actually very neat.
Okay. I expect a frequency regression with 7nm, but since Intel is being completely silent on that node, we don't have a clue yet.
 

RetroZombie

Senior member
Nov 5, 2019
464
386
96
The relative performance increase will be less than what people with less knowledge of what's happened will count on, after so many years without a node shrink.
What a ridiculous statement to make. Intel should sit on performance now in order to make their future chips look better. That's just silly!
The statement was a bit confusing.
I think what he means is that with zero improvements since 2015, Intel's new CPU release after 5 years might look like something astonishing, when in reality it will just be what it was supposed to be all along.

Just imagine a realistic 6% IPC improvement per year, compounded over 6 years:
1 year: 6%
2 years: 12%
3 years: 19%
4 years: 26%
5 years: 34%
6 years: 42%

The cumulative number will be very high, and I can already see some websites posting something like "Intel increases IPC by more than 40%", but in reality, after 5 years, that is just meh...
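That's just 6% compounded per year; a quick check of the list above:

```python
# Cumulative IPC gain after n years at ~6%/year: (1.06**n - 1).
for years in range(1, 7):
    print(f"{years} year(s): {(1.06 ** years - 1) * 100:.0f}%")
# 6%, 12%, 19%, 26%, 34%, 42% -- matching the list above
```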

I think that's the point lobz was trying to make; I might be wrong, but my point still stands...
 
  • Like
Reactions: lightmanek

RetroZombie

Senior member
Nov 5, 2019
464
386
96
if you care about power, then you care about the system power
But that's the whole point, isn't it.
If everyone fakes their TDP, then I might buy a PSU whose rated power will not be enough to run everything, and the PSU makers already fake their specs.

I can't have a washing machine saying it needs 2000 watts while actually using 5000 watts, with the house's main fuse always cutting the power because of that.
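That's why I'd size the PSU from real peak draw, not the labels; a rough sketch (every wattage below is an assumption for illustration):

```python
# Naive PSU sizing from worst-case component draw plus headroom.
parts_w = {
    "CPU (real peak, not TDP)": 210,
    "GPU": 280,
    "board, RAM, drives, fans": 60,
}
HEADROOM = 1.3  # stay well below the PSU's rated load
print(f"recommended PSU: >= {sum(parts_w.values()) * HEADROOM:.0f} W")  # ~715 W
```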
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
The statement was a bit confusing.
I think what he means is that with zero improvements since 2015, Intel's new CPU release after 5 years might look like something astonishing, when in reality it will just be what it was supposed to be all along.

Just imagine a realistic 6% IPC improvement per year, compounded over 6 years:
1 year: 6%
2 years: 12%
3 years: 19%
4 years: 26%
5 years: 34%
6 years: 42%

The cumulative number will be very high, and I can already see some websites posting something like "Intel increases IPC by more than 40%", but in reality, after 5 years, that is just meh...

I think that's the point lobz was trying to make; I might be wrong, but my point still stands...
Something like this. What I really meant was: even now, Whiskey Lake makes Ice Lake look pointless. I still don't have a clue how "I think what Intel achieved with 14nm is very neat" can mean, for Zucker2k, "Intel should not innovate".
All I'm saying is that every step forward 14nm makes, some 2-3 years after it shouldn't really exist anymore outside of chipsets, is going to add more disappointment in future products, and that's unfortunate.
 
  • Like
Reactions: spursindonesia

ondma

Platinum Member
Mar 18, 2018
2,718
1,278
136
But that's the whole point, isn't it.
If everyone fakes their TDP, then I might buy a PSU whose rated power will not be enough to run everything, and the PSU makers already fake their specs.

I can't have a washing machine saying it needs 2000 watts while actually using 5000 watts, with the house's main fuse always cutting the power because of that.
You really think a user who pays $500.00 for a CPU and another grand or two for the rest of the system is stupid enough to think the CPU, overclocked, only uses 95 watts? And as others have said, the main use case for the 10900k will be gaming, so the power usage will not be close to the panic numbers being bandied about here for something like AVX2 Prime95.
 

RetroZombie

Senior member
Nov 5, 2019
464
386
96
And as others have said, the main use case for the 10900k will be gaming, so the power usage will not be close to the panic numbers being bandied about here for something like AVX2 Prime95
I'm glad you brought that up.

So let me see: you expect that in the future, software, games or not, will never fully use all cores, and that AVX2 will never get properly utilized, is that it?
And that everyone uses the computer the same way, like mono-tasking?
 
  • Like
Reactions: scannall