Intel Comet Lake Thread


scannall

Golden Member
Jan 1, 2012
1,946
1,638
136
Exactly. Then why all the bashing and hand-wringing because Intel CPUs exceed the TDP?
It would be nice to know actual numbers so you can plan for appropriate cooling, for example. Or to decide to buy something else because the required cooling will be too noisy. Is it too much to ask for good information when making a purchase decision?
 
  • Like
Reactions: RetroZombie

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
I'm glad you brought that up.

So let me see: you expect that future software, games or not, will never fully use all cores, and that AVX2 will never get properly utilized, is that it?
And that everyone uses their computer the same way, strictly monotasking?
Especially with a $500 CPU, right?
 

ondma

Platinum Member
Mar 18, 2018
2,720
1,280
136
It would be nice to know actual numbers so you can plan for appropriate cooling, for example. Or to decide to buy something else because the required cooling will be too noisy. Is it too much to ask for good information when making a purchase decision?
Maybe read some reviews from reliable sources? Naw, that would be too difficult.
I agree the chip will use a lot of power, but this sudden hysteria that it exceeds the TDP is just absurd. TDP has never been an accurate measure of maximum power use for a "K" chip, or even for AMD CPUs for that matter.
 

scannall

Golden Member
Jan 1, 2012
1,946
1,638
136
Maybe read some reviews from reliable sources? Naw, that would be too difficult.
I agree the chip will use a lot of power, but this sudden hysteria that it exceeds the TDP is just absurd. TDP has never been an accurate measure of maximum power use for a "K" chip, or even for AMD CPUs for that matter.
Because having to rely on third-party reviewers for accurate numbers is so much better than getting them from the manufacturer... It was only a few years ago that you would get fairly accurate TDP numbers from them. They'd push a little higher for bursts, but nothing way out of line. That's all gone out the window now in favor of marketing.
 

coercitiv

Diamond Member
Jan 24, 2014
6,187
11,855
136
TDP has never been an accurate measure of maximum power use for a "K" chip, or even for AMD CPUs for that matter.
You seem confused. Why are you using the term "maximum power use" in relation to TDP?

Are you at all familiar with the way Intel defines TDP and, more importantly, the way they have made sure their CPUs abide by that TDP through power management ever since Sandy Bridge?
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
Exactly. Then why all the bashing and hand-wringing because Intel CPUs exceed the TDP?

Why are you ignoring the fact that the 10900k could easily consume ~210W or more without even being overclocked? Pushing that much power through a single socket requires serious cooling. The last company to try selling such a desktop CPU in a consumer (read: non-HEDT) socket was pilloried for its ridiculous power consumption, especially when it was not the fastest CPU on the market.

you still look at it from the CPU perspective

Of course I am. We're not discussing the platform or "bottlenecks", we're talking about the serious issue of a consumer CPU requiring an AiO minimum without even being overclocked. Since when was that a good idea?

we are talking about sustained MT AVX2 load, which is exactly what the 10900K is not designed for

Intel created the AVX2 instruction set, and the 10900k will be the fastest consumer desktop CPU in AVX2 workloads that Intel has ever released. How is this CPU not designed for MT workloads using the best SIMD Intel has to offer in the consumer desktop space?

so let's make a calculation, some basic stuff

I'm sorry, but if all you're doing is calculating how much heat air can pick up given a set delta between inlet and outlet temperature without approaching heat transfer equations, then we aren't having a real discussion about the limitations of HSFs.
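
For reference, here is the kind of calculation I mean, as a minimal sketch with my own illustrative numbers, nothing more: the bulk-air side always produces a generous-looking wattage, and it still tells you nothing about actually getting the heat into that air.

```python
# Minimal sketch (illustrative numbers, not any cooler's spec): how much
# heat a given airflow can carry off at a fixed inlet/outlet delta-T.
# Q = rho * V_dot * c_p * dT -- the easy part. Fin efficiency, heatpipe
# limits, and spreading resistance are where HSFs actually hit the wall.

RHO_AIR = 1.2     # air density, kg/m^3 (approx., sea level, ~20 C)
CP_AIR = 1005.0   # specific heat of air, J/(kg*K)

def air_heat_pickup_watts(cfm: float, delta_t_k: float) -> float:
    """Watts a flow of `cfm` (cubic feet per minute) absorbs over `delta_t_k` kelvin."""
    v_dot = cfm * 0.000471947  # CFM -> m^3/s
    return RHO_AIR * v_dot * CP_AIR * delta_t_k

# One ~60 CFM case fan with a 10 K inlet-to-outlet rise:
print(f"{air_heat_pickup_watts(60.0, 10.0):.0f} W")  # ~341 W on paper
```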


And we haven't even dealt with the issue of heatpipe efficiency or limitations (see my last post). The job of heatpipes is to deliver heat to fins, and that's a very complicated job indeed. Usually this stuff is easier to handle via numerical analysis. It's still not a walk in the park though.

the bottleneck is points 1 to 4 unless you want to pull out like 300W from the CPU alone, where no air can help you

There are some HSFs on the market that can handle a heat load of ~250-300W. Can you guess which platform hosts those heatsinks, and why they work? Don't assume that "no air can help you". Ultimately all cooling is air cooling . . . eventually. Unless you're using a heatsink/radiator buried in the ground or something.

if you look at the Ryzens, the Wraith Prism can handle the 2700X at 105W TDP, drawing pretty much the same power as my 3900X, while it can't handle the 3900X

Say what? The Wraith Prism will support stock operation of a 3900X. It will run hot and not OC as well as a $1000 custom water cooling solution, but if all you want is stock operation, it works.

the NH-D15 has exactly zero problems removing 200W as an absolute value, when the rest of the environment supports it

I'm calling shens. I owned and used one, and pimped it out to the fullest extent possible. Well okay I didn't put 3 IndustrialPPC fans on it, which would have been ridiculous (and would possibly have interfered with RAM). A stock NH-D15 is good for maybe 160-180W. The only air cooler I can think of for LGA1151 or the upcoming LGA1200 (which should support all the same coolers as LGA1151) that could cool such a heat load would be that IceGiant thermosiphon. Those aren't even on the market yet that I know of.

so the TDP rating of our beloved CPU makers has exactly zero value as an exact number

Now you're on to something.

if you care about power, then you care about the system power

Since when did I say anything about system power? I have repeatedly stated that the issue at hand is the amount of heat that will be dissipated by the 10900k CPU package.

there are other components

. . . that have separate apparatuses for cooling that can be addressed individually. And that should not serve as a distraction from the fact that Intel is going to sell a chip that can dissipate over 200W out-of-the-box!

so pretty much nobody can claim that they don't fulfill their promise

Intel releases TDP/PL1/PL2 design documents that have almost no resemblance to CPU behavior in "real world" implementations of Intel's consumer desktop platform.
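
For anyone unfamiliar, the scheme those documents describe looks roughly like this (a simplified sketch with assumed 10900k-class values, not Intel's spec; real silicon tracks an exponentially weighted average over tau, and board vendors routinely override all three numbers, which is exactly the problem):

```python
# Simplified model of Intel's documented PL1/PL2/tau behavior.
# The limit values below are assumptions for a 10900k-class part, not
# official Intel specs, and shipping motherboards often ignore them.

PL1 = 125.0  # W, sustained limit -- this is the advertised "TDP"
PL2 = 250.0  # W, short-duration burst limit
TAU = 56.0   # s, nominal burst window

def allowed_package_power(avg_power_over_tau: float) -> float:
    """Power budget given the trailing average package power over tau seconds."""
    if avg_power_over_tau < PL1:
        return PL2   # turbo budget available: may burst up to PL2
    return PL1       # budget exhausted: clamp back to PL1 ("TDP")

print(allowed_package_power(90.0))   # 250.0 -> bursting
print(allowed_package_power(130.0))  # 125.0 -> clamped to "TDP"
```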

10900K is a muscle car sprinter, not a worker, the same as 9900K

Again, that's blatantly false. That Intel can't produce a better "worker" on their consumer desktop platform is Intel's fault . . . not the fact that it's a "muscle car sprinter".
 
Last edited:
  • Love
Reactions: lobz

TheGiant

Senior member
Jun 12, 2017
748
353
106
Why are you ignoring the fact that the 10900k could easily consume ~210W or more without even being overclocked? Pushing that much power through a single socket requires serious cooling. The last company to try selling such a desktop CPU in a consumer (read: non-HEDT) socket was pilloried for its ridiculous power consumption, especially when it was not the fastest CPU on the market.
I am not.
I am just describing my own setup:
gaming, it will consume like 120W with occasional spikes
for Handbrake I would change the power profile (underclock/undervolt) and sit on the optimal perf/whole-system-power curve

Of course I am. We're not discussing the platform or "bottlenecks", we're talking about the serious issue of a consumer CPU requiring an AiO minimum without even being overclocked. Since when was that a good idea?
well, that is the translation of the Cinebench winners
MT is overrated for desktop
are we seriously expecting a top home gaming system to be challenged as a workhorse?
the 10900K isn't a competitor to the 3900X
understand why Intel makes money while AMD has the better product... because it doesn't always have it

Intel created the AVX2 instruction set, and the 10900k will be the fastest consumer desktop CPU in AVX2 workloads that Intel has ever released. How is this CPU not designed for MT workloads using the best SIMD Intel has to offer in the consumer desktop space?
sure, it's good
and why should I pull a truck with a full-throttle Porsche 911 when I don't need to?

I'm sorry, but if all you're doing is calculating how much heat air can pick up given a set delta between inlet and outlet temperature without approaching heat transfer equations, then we aren't having a real discussion about the limitations of HSFs.
exactly
that is my experience:
people buying a top-notch air cooler while forgetting about the air...
And we haven't even dealt with the issue of heatpipe efficiency or limitations (see my last post). The job of heatpipes is to deliver heat to fins, and that's a very complicated job indeed. Usually this stuff is easier to handle via numerical analysis. It's still not a walk in the park though.
for me it is, I do CFD... a walk in the park it is not when your system has a chemical reaction, especially with phase change
but that is not a discussion for here

There are some HSFs on the market that can handle a heat load of ~250-300W. Can you guess which platform hosts those heatsinks, and why they work? Don't assume that "no air can help you". Ultimately all cooling is air cooling . . . eventually. Unless you're using a heatsink/radiator buried in the ground or something.
sure they can, when the air meets the criteria, which again many people forget
water just transfers the heat outside the case, and that solves the situation

Say what? The Wraith Prism will support stock operation of a 3900X. It will run hot and not OC as well as a $1000 custom water cooling solution, but if all you want is stock operation, it works.
it doesn't, sorry; high 80s and low 90s isn't working
the heat flux density of the 3900X is just too high
it's a 0/1 decision: either you buy a 3rd-party cooler or you don't
I'm calling shens. I owned and used one, and pimped it out to the fullest extent possible. Well okay I didn't put 3 IndustrialPPC fans on it, which would have been ridiculous (and would possibly have interfered with RAM). A stock NH-D15 is good for maybe 160-180W. The only air cooler I can think of for LGA1151 or the upcoming LGA1200 (which should support all the same coolers as LGA1151) that could cool such a heat load would be that IceGiant thermosiphon. Those aren't even on the market yet that I know of.
I have one, and IMO it will cool even 250W, when the flow outside is in balance and the CPU surface is "big enough", which with smaller nodes becomes impossible

Now you're on to something.
Since when did I say anything about system power? I have repeatedly stated that the issue at hand is the amount of heat that will be dissipated by the 10900k CPU package.
I am referring to the misconception about power consumption
power means system power, not CPU power
CPU power is only important if you can't cool it while doing its designed job
CPU package power of the 10900K will be 130W IMO while gaming
that is the primary job
muscle car
not everyone is happy winning the Cinebench internetz with a 550 EUR CPU benchmarking a 4K EUR SW license; as I said, if I need a truck I have one


. . . that have separate apparatuses for cooling that can be addressed individually. And that should not serve as a distraction from the fact that Intel is going to sell a chip that can dissipate over 200W out-of-the-box!
sure it can
and there will be the stupid ones who push it there
we are here on the AnandTech forums
who the f. here will buy the 10900K as a workhorse, when the 3950X is out, when TR3x is out?
I wouldn't even bother looking at the 10900K when considering rendering or MT stuff as the primary usage
Intel releases TDP/PL1/PL2 design documents that have almost no resemblance to CPU behavior in "real world" implementations of Intel's consumer desktop platform.
that is THE translation
the so-called productivity pretenders are quoting the max power output but are not considering the use case
I can configure my i5 6600K to consume 150W and then translate that as: 4c/4t with that much power is so bad
don't get me wrong, the current Intel implementation for the average Joe leads to 200W+, but we are not average Joes here
Again, that's blatantly false. That Intel can't produce a better "worker" on their consumer desktop platform is Intel's fault . . . not the fact that it's a "muscle car sprinter".
ok
I think your statement is a reflection of the current mood here on the tech forums
the results will tell, not the other way around
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
the 10900K isn't a competitor to the 3900X
understand why Intel makes money while AMD has the better product... because it doesn't
:laughing: :laughing: Well that is just plain wrong. Intel does not make money because the 9900K is a better product - even for gaming. Understand that Intel makes their money from everything other than the 9900K. Worst example you could have used, since the 9900K is barely in any OEM products, which is where Intel actually sells, and where Intel makes a lot more money than AMD. You know, the segment where in the short term it doesn't make a difference whether you have a product that's decent, very good, or complete garbage. DIY? Sure, the 9900K sells there, but every AMD CPU sells better, some even much better.
Question is, do you understand why?
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
:laughing: :laughing: Well that is just plain wrong. Intel does not make money because the 9900K is a better product - even for gaming. Understand that Intel makes their money from everything other than the 9900K. Worst example you could have used, since the 9900K is barely in any OEM products, which is where Intel actually sells, and where Intel makes a lot more money than AMD. You know, the segment where in the short term it doesn't make a difference whether you have a product that's decent, very good, or complete garbage. DIY? Sure, the 9900K sells there, but every AMD CPU sells better, some even much better.
Question is, do you understand why?
do you understand you are an extremist person?
the "fighting against" reply-hero style?
I said Intel makes money when they have a worse product than AMD for WHAT, not for whom
get some sleep, stop trying to fight, and present your own content; don't just say I am wrong, say how it is right
so: a combination of product/use case/price/....
and you translated it into fighting with some 9900K? WTF
AnandTech forums should do better than this
stop using "fighting" and "who" and start using "what" and "for what"
I am sorry, with very high probability I won't be able to respond; I'm going with 3 kids for a week of skiing
and not in the Porsche 911 (as much as I like it)
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
do you understand you are an extremist person?
the "fighting against" reply-hero style?
I said Intel makes money when they have a worse product than AMD for WHAT, not for whom
get some sleep, stop trying to fight, and present your own content; don't just say I am wrong, say how it is right
so: a combination of product/use case/price/....
and you translated it into fighting with some 9900K? WTF
AnandTech forums should do better than this
stop using "fighting" and "who" and start using "what" and "for what"
I am sorry, with very high probability I won't be able to respond; I'm going with 3 kids for a week of skiing
and not in the Porsche 911 (as much as I like it)
Your reply is golden. It says everything about you and how seriously I should take what you say, without me needing to say anything more. Thanks :) happy skiing
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
I never said anything like that. Can you even read?
I was trying to be polite with that reply, but since you insist on defending that anti-Intel knee-jerk reply, why don't you explain this:
All this does is just one thing: it makes every future 10nm and 7nm Intel CPU look even worse.
Sorry, this is not even silly, it is dumb! Intel giving its customers performance today does not matter to you. You'd rather they hold off so that some future chip looks better against what they've already released. You should just go ahead and admit your frustration that Intel is going to be increasing its single/few-core lead on the desktop against your beloved AMD. You're not fooling anyone around here.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
I was trying to be polite with that reply, but since you insist on defending that anti-Intel knee-jerk reply, why don't you explain this:

Sorry, this is not even silly, it is dumb! Intel giving its customers performance today does not matter to you. You'd rather they hold off so that some future chip looks better against what they've already released. You should just go ahead and admit your frustration that Intel is going to be increasing its single/few-core lead on the desktop against your beloved AMD. You're not fooling anyone around here.
Which lead? If you mean strictly mobile and up to 4 cores, then why not? I mean, why not just wait and see? Even then, your post will be just a reaction to an assumption you made and nothing else. If not, you're just making a fool out of yourself. Feel free to contradict me with actual facts / numbers. Can't wait.

The best part is "my beloved AMD". I have a 4790K and a 1070; it was by far the best CPU/GPU combo to get for ~$300 / 400 when I upgraded my rig. What exactly is your problem? I sympathized with AMD back then, and mainly I've just felt sorry for them, but sorrow is not something you should take into account when you make a monetary decision. Did you know?
 
Last edited:
  • Like
Reactions: spursindonesia

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
well, that is the translation of the Cinebench winners
MT is overrated for desktop

It wasn't when Intel was winning the benchmarks. When the 4770k stomped all over every chip any competitor could make in MT, do you think people here said, "well that's not important, since it's MT"? Of course they didn't.

are we seriously expecting a top home gaming system to be challenged as a workhorse?

We did when Intel had the best "workhorse" CPUs for a consumer socket. Why are the standards different now?

it doesn't, sorry; high 80s and low 90s isn't working

It meets the manufacturer spec. I don't like it either (no OC headroom, high temps), but I'm not pigheaded enough to think it's non-functional.

CPU power is only important if you can't cool it while doing its designed job
CPU package power of the 10900K will be 130W IMO while gaming

CPU package power should be lower than that, since the 10900k will engage about as many cores as the 9900k in the vast majority of current titles out there (and the vast majority of future titles in the next few years). 9900k is sub-100W in a lot of games.

that is the primary job
muscle car
not everyone is happy winning the Cinebench internetz with a 550 EUR CPU benchmarking a 4K EUR SW license; as I said, if I need a truck I have one

CPUs are not cars and trucks. They are expected to do anything you might need them to do.

who the f. here will buy the 10900K as a workhorse, when the 3950X is out, when TR3x is out?
I wouldn't even bother looking at the 10900K when considering rendering or MT stuff as the primary usage

Then why is Intel even bothering with a CPU that is just a 9900k with two extra cores? You have just negated its entire reason for existence. Are you perhaps now seeing why people might be critical of Intel's decision to move forward with Comet Lake?

but we are not average Joes here

Most people who buy Intel CPUs will not go into the UEFI to change CPU settings.
 

ondma

Platinum Member
Mar 18, 2018
2,720
1,280
136
It wasn't when Intel was winning the benchmarks. When the 4770k stomped all over every chip any competitor could make in MT, do you think people here said, "well that's not important, since it's MT"? Of course they didn't.



We did when Intel had the best "workhorse" CPUs for a consumer socket. Why are the standards different now?



It meets the manufacturer spec. I don't like it either (no OC headroom, high temps), but I'm not pigheaded enough to think it's non-functional.



CPU package power should be lower than that, since the 10900k will engage about as many cores as the 9900k in the vast majority of current titles out there (and the vast majority of future titles in the next few years). 9900k is sub-100W in a lot of games.



CPUs are not cars and trucks. They are expected to do anything you might need them to do.



Then why is Intel even bothering with a CPU that is just a 9900k with two extra cores? You have just negated its entire reason for existence. Are you perhaps now seeing why people might be critical of Intel's decision to move forward with Comet Lake?



Most people who buy Intel CPUs will not go into the UEFI to change CPU settings.
If the 10900k can reach 5 GHz all-core (like the 9900k), it could well be faster in future games than the 9900k. I expect CPU requirements, including cores/threads, to increase with the new consoles.
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
If the 10900k can reach 5 GHz all-core (like the 9900k), it could well be faster in future games than the 9900k. I expect CPU requirements, including cores/threads, to increase with the new consoles.

The consoles are going to use 8c/16t parts. Why would console ports require more cores than that? Also, I've noticed that even console ports from 8c/8t machines (PS4/Xbox 1) do not always use 8 threads on PC . . .
 

JasonLD

Senior member
Aug 22, 2017
485
445
136
The consoles are going to use 8c/16t parts. Why would console ports require more cores than that? Also, I've noticed that even console ports from 8c/8t machines (PS4/Xbox 1) do not always use 8 threads on PC . . .

1 core is always reserved for the OS and background processing, so 7 will be the highest number of cores/threads games can utilize on those consoles.
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
1 core is always reserved for the OS and background processing, so 7 will be the highest number of cores/threads games can utilize on those consoles.

That was the strategy with the previous gen. Not sure if they'll do the same thing on PS5 etc. but we'll see.
 

ondma

Platinum Member
Mar 18, 2018
2,720
1,280
136
The consoles are going to use 8c/16t parts. Why would console ports require more cores than that? Also, I've noticed that even console ports from 8c/8t machines (PS4/Xbox 1) do not always use 8 threads on PC . . .
Well, look at current games. I am not only talking about cores/threads, but total CPU power (throughput, if you will). The current consoles have 8 anemic cores, but on the PC they can easily utilize six or even eight much more powerful cores. Why would you think that PC ports from a much more powerful console CPU could not utilize 10 or even more cores?
 

JasonLD

Senior member
Aug 22, 2017
485
445
136
That was the strategy with the previous gen. Not sure if they'll do the same thing on PS5 etc. but we'll see.

They will most likely do the same thing this gen and will likely reserve 1-2 cores for the OS and other background functions. They are doing this to ensure every console will have the same amount of CPU resources available for games regardless of how many apps are running in the background.
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
Well, look at current games. I am not only talking about cores/threads, but total CPU power (throughput, if you will). The current consoles have 8 anemic cores, but on the PC they can easily utilize six or even eight much more powerful cores. Why would you think that PC ports from a much more powerful console CPU could not utilize 10 or even more cores?

That's not how threading works. In general, games' ability to use multiple cores is baked into the game engine. You would have to redesign the engine from the ground up to make it spawn more worker threads that carry out meaningful amounts of work. Either that, or you would have to have a game engine designed to use as many threads as you specify in your game files, which (to my knowledge) does not yet exist.
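
To illustrate what "baked into the engine" means, here is a deliberately hypothetical sketch (not from any real engine): the worker topology is a design-time constant, so a port does not automatically scale to whatever core count the host happens to have.

```python
# Hypothetical sketch of a design-time-fixed job system. Real engines are
# far more sophisticated, but the point stands: worker count is an
# architectural constant, not a function of the host CPU.
import os
from concurrent.futures import ThreadPoolExecutor

ENGINE_WORKERS = 6  # chosen when the engine was architected

pool = ThreadPoolExecutor(max_workers=ENGINE_WORKERS)

def run_frame_jobs():
    # Work is split into exactly ENGINE_WORKERS chunks every frame,
    # regardless of how many cores os.cpu_count() reports.
    futures = [pool.submit(lambda i=i: i * i) for i in range(ENGINE_WORKERS)]
    return [f.result() for f in futures]

run_frame_jobs()
print(f"host cores: {os.cpu_count()}, engine workers: {ENGINE_WORKERS}")
pool.shutdown(wait=True)
```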

edit:

Just doing some brief observations in DMC5 (a fairly recent port of a PS4/Xbox1 game), it looks like the game only utilizes 6 threads, based on core utilization reported in HWiNFO64. Assuming @JasonLD is correct, I would expect the next-gen consoles to utilize 12-14 threads max, and maybe not that many.
 
Last edited:

coercitiv

Diamond Member
Jan 24, 2014
6,187
11,855
136
The current consoles have 8 anemic cores, but on the PC they can easily utilize six or even eight much more powerful cores. Why would you think that PC ports from a much more powerful console CPU could not utilize 10 or even more cores?
Are console ports running the same graphical details and frame rates as the originals? Some graphic settings do have quite an impact on CPU usage. Combine that with higher frame rates and computing needs can go through the roof fast.
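
Back-of-the-envelope, with generic numbers just to illustrate the scaling: the per-frame CPU budget shrinks inversely with the target frame rate, before any settings changes are even counted.

```python
# Per-frame CPU time budget at various target frame rates (illustrative).
for fps in (30, 60, 144):
    print(f"{fps:>3} fps -> {1000.0 / fps:.1f} ms per frame for ALL CPU work")
# 30 fps -> 33.3 ms; 60 fps -> 16.7 ms; 144 fps -> 6.9 ms
```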
 
  • Like
Reactions: mopardude87

mopardude87

Diamond Member
Oct 22, 2018
3,348
1,575
96
Are console ports running the same graphical details and frame rates as the originals? Some graphic settings do have quite an impact on CPU usage. Combine that with higher frame rates and computing needs can go through the roof fast.

The only counterargument which even I could apply here is that it's gonna be all about the resolution. My i5 8400T goes from being about maxed out all the time at 1080p in COD WW2, with my 1080 Ti severely bottlenecked at its 140-180 or so fps, to the GPU becoming the bottleneck at 4K with its 55-80 or so frame rate, while that 8400T looks VERY bored.

Hoping Comet Lake arrives soon with something juicy to compete with AMD so I could perhaps forget the 3900X purchase, but yikes, still on 14nm, their 10-core is absolutely going to need water, isn't it? I haven't followed Comet much since I heard about 14nm.