
Intel Comet Lake Thread


moinmoin

Golden Member
Jun 1, 2017
1,402
1,204
106
I meant the 3000 series, though the point remains valid. The 9900k, overclocked, is faster than any Ryzen CPU released to date in games. This includes the 16-core model.
Which is not what DrMrLordX was talking about:
I am not confident that Comet Lake-S will be a faster gaming processor than the 9900k or 9900ks. As such, I do not think it will be faster than Zen3 either.
The future Comet Lake-S may well not be a faster gaming processor than the 9900k or 9900ks. But Zen 3 based Ryzen 4000 pretty likely will be faster than Zen 2 based Ryzen 3000 in gaming. So it's likely Comet Lake-S will be slower in gaming than Ryzen 4000.
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
19,785
7,161
136
I meant the 3000 series, though the point remains valid. The 9900k, overclocked, is faster than any Ryzen CPU released to date in games. This includes the 16-core model.
By the hair on its chinny, chin chin! On average, like what, 1%? It sure is not much faster. And from what I read, it's actually smoother, with better 0.1% lows.
 

jpiniero

Diamond Member
Oct 1, 2010
7,453
949
126
By the hair on its chinny, chin chin! On average, like what, 1%? It sure is not much faster. And from what I read, it's actually smoother, with better 0.1% lows.
Depends on the game and the GPU. Could be as high as 15-20%.
 

TheGiant

Senior member
Jun 12, 2017
639
243
86
If you're just gaming (or running other very bursty, latency-demanding workloads), which, let's face it, is really the main if not only reason you'd be buying Comet Lake over Zen 3, the sustained power consumption will not come anywhere close to the limits
This.
I didn't buy a 9900K because of its power draw, though not many seem to care.
The 9900K/10900K is presented as a power hog, which it is with motherboard makers' default settings, but as you said, while gaming you won't reach that power, or at least not consistently.
If a 10C Comet Lake had been out when I bought my Ryzen 3900X, it would have been a 50/50 buying decision.
I don't need 10C at 5 GHz for 300 W; I can have 10C at 4.3 GHz for 1xx W.
I am not confident that Comet Lake-S will be a faster gaming processor than the 9900k or 9900ks. As such, I do not think it will be faster than Zen3 either.
Well, I am pretty confident the 10C Comet Lake-S will claim a sub-10% gain, just as the 9900K claimed over the 8700K, even in titles you wouldn't consider thread limited.
Note that the TDP of the 9900k is listed as 95W, while the majority of Z390 boards cause it to operate as a ~160W CPU. It wasn't just a few loose cannons. The "loose cannons" had it operating as a 210W CPU.
TDP again? Oh come on, my 3900X has a 105 W TDP, and at defaults it pushes wall power to 240 W while running Handbrake.
TDP in its current form must disappear; it means nothing.

If Intel delivers a 6C/12T i5 with an iGPU for 200 EUR, that's a problem for AMD and their profits.
This is the year of change.
By the end of the year AMD will deliver Zen 3 based Ryzen 4000 and Intel will deliver a big fat nothing on desktop.
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
19,785
7,161
136
That's what I mean, on average with 3200 memory it should be 15-20% faster unless you are GPU limited.
Against the 2700X, it's only 6% faster here: https://www.tomshardware.com/reviews/intel-core-i9-9900k-9th-gen-cpu,5847-13.html
Now I am going to see if I can find the same kind of average for the 3700X or 3900X; obviously it will be much closer.

Edit: I found three different sites saying it's 5% slower in games on average, with 15% the highest difference (Kingdom Come: Deliverance, never heard of it).
 

jpiniero

Diamond Member
Oct 1, 2010
7,453
949
126

Did you miss the Zen 2 launch?

Back in July 2019, TechPowerUp had a 7.4% difference at 720p, less than 5% at 1080p, and less than 3% at 1440p, all running on an EVGA GeForce RTX 2080 Ti FTW3 Ultra with DDR4-3200 CL14.
See the "unless you are GPU limited" part.

I would simply add to the 'OMG Gaming' comments. If you are using a mid range graphics card, or say 4k gaming on a high end card the difference is zero. Pick the parts that best fill your use case, and fit your build budget.
You would think that CPU limited benchmarks would give you an idea of how well the CPU will hold up over time. But since the new consoles have such a big CPU performance jump, it's hard to predict.
 

scannall

Golden Member
Jan 1, 2012
1,621
963
136
You would think that CPU limited benchmarks would give you an idea of how well the CPU will hold up over time. But since the new consoles have such a big CPU performance jump, it's hard to predict.
If you are building something today to last, say, the next 4 years, it doesn't make a lot of sense to overspend on the CPU, leaving less of your budget for the GPU. If your budget is unlimited, then sure, get a 9900KS and a couple of 2080 Tis. If your budget is $850, for example, then a 9900KS, or for that matter anything more expensive than an AMD 3600, makes no sense.

The good thing is now there are actual choices to be made on how to get the best performance for your tasks, and in the budget you have to work with.
 

ondma

Senior member
Mar 18, 2018
920
188
86
I would simply add to the 'OMG Gaming' comments. If you are using a mid range graphics card, or say 4k gaming on a high end card the difference is zero. Pick the parts that best fill your use case, and fit your build budget.
I very seriously doubt anyone is going to buy a 10900k to pair with a midrange gpu.
 

scannall

Golden Member
Jan 1, 2012
1,621
963
136
I very seriously doubt anyone is going to buy a 10900k to pair with a midrange gpu.
I've seen it more than a few times from the, um... less tech literate. All they saw was "This is the best gaming CPU! Buy today!", not understanding it's just one part of a system.
 

tamz_msc

Platinum Member
Jan 5, 2017
2,443
2,112
106
Gamers Nexus charts, which include CPU-limited high-refresh-rate gaming, have the 9900K OC delivering much more than 10 percent higher frame rates than a 3900X in a number of games.
 

DrMrLordX

Lifer
Apr 27, 2000
14,607
3,584
136
I meant the 3000 series, though the point remains valid. The 9900k, overclocked, is faster than any Ryzen CPU released to date in games. This includes the 16-core model.
I was discussing Zen3, not Zen2/Ryzen 3000.

This.
I didn't buy a 9900K because of its power draw, though not many seem to care.
The 9900K/10900K is presented as a power hog, which it is with motherboard makers' default settings, but as you said, while gaming you won't reach that power, or at least not consistently.
You'll care when you can't cool it. See below.

If a 10C Comet Lake had been out when I bought my Ryzen 3900X, it would have been a 50/50 buying decision.
Well it wasn't. The earliest Comet Lake-S could have hit the streets was Dec 2019. Instead we got the 9900KS.

I don't need 10C at 5 GHz for 300 W; I can have 10C at 4.3 GHz for 1xx W.
. . . why? Do you really imagine that a 4.3 GHz 10c would be that much better than the 9900k which usually sits at 4.7 GHz with 8c? Even in perfect MT scenarios, the hypothetical gains would be minuscule (14% throughput), and you'd lose ST performance. Maybe you could get around that by twiddling the UEFI a bit to enforce 125W strictly to get some ST performance back. The last thing Intel has in its favor right now is gaming performance, and you'd lose that by deliberately running your chip at a lower clock speed than what you could sustain with a product released in 2018.
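For what it's worth, the perfect-scaling arithmetic above can be sketched like this. It's a rough cores-times-clock proxy that deliberately ignores IPC differences, memory bandwidth, and boost behavior:

```python
# Rough MT throughput proxy: cores x clock, assuming perfect scaling.
# This ignores IPC differences, memory bandwidth, and boost behavior.
def relative_throughput(cores_a, ghz_a, cores_b, ghz_b):
    """Throughput of config A relative to config B under perfect scaling."""
    return (cores_a * ghz_a) / (cores_b * ghz_b)

# Hypothetical 10c @ 4.3 GHz vs. a 9900k-style 8c @ 4.7 GHz all-core:
gain = relative_throughput(10, 4.3, 8, 4.7)
print(f"{(gain - 1) * 100:.0f}% more throughput")  # prints "14% more throughput"
```

Even under that best case, the 10c/4.3 GHz config only buys about 14% more raw throughput while giving up single-thread clocks.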

Well, I am pretty confident the 10C Comet Lake-S will claim a sub-10% gain, just as the 9900K claimed over the 8700K, even in titles you wouldn't consider thread limited.
Why? I'm not. It'll have some extra L3 cache, but diminishing returns and all that.

TDP again?
Yes. Until Intel and/or board OEMs come clean about how much power the chip will really burn by default, we have to use reviews and user experience to inform us as to how hot the thing will get when not limited by an inadequate cooler. A 9900k can pull 160W or more without changing anything. A 10900k? We don't know. It'll be "up to" 300W, but again, I doubt that more than a few review boards will inflict that behavior. Maybe if you're a Day 1 buyer with a UEFI meant for reviewers, you'll get that. But it's going to be more than 125W (listed TDP), and probably more than the 160W we get from the 9900k today. 210W would not surprise me, and that's only if they keep MT clocks @ 4.7 GHz using default settings. If they try for higher clocks, expect that value to go up.

It takes a lot of cooling capacity to handle a 210W CPU, even if it isn't prone to hotspotting. I was barely able to handle a load that high with an NH-D15 and really loud fans. This thing will be essentially AiO/custom water or bust.

If Intel delivers a 6C/12T i5 with an iGPU for 200 EUR, that's a problem for AMD and their profits.
Who cares about AMD? It'll be a problem for Intel and their profits.
 

TheGiant

Senior member
Jun 12, 2017
639
243
86
You'll care when you can't cool it. See below.
Well, you can cool it.
Heat flux density makes all the difference, and 14 nm isn't as dense as the Ryzen 3000 chips.
My 3900X's default cooler is crap for a 105 W TDP CPU, whereas on the 2700X it was OK.
So the conclusion is: either way, you need an upper-mid-tier or better case with good ventilation or water cooling to handle any modern high-performance desktop CPU, be it Ryzens or Cores.

Well it wasn't. The earliest Comet Lake-S could have hit the streets was Dec 2019. Instead we got the 9900KS.
Better than nothing, but for us it is not good enough, that much is clear.

. . . why? Do you really imagine that a 4.3 GHz 10c would be that much better than the 9900k which usually sits at 4.7 GHz with 8c? Even in perfect MT scenarios, the hypothetical gains would be minuscule (14% throughput), and you'd lose ST performance. Maybe you could get around that by twiddling the UEFI a bit to enforce 125W strictly to get some ST performance back. The last thing Intel has in its favor right now is gaming performance, and you'd lose that by deliberately running your chip at a lower clock speed than what you could sustain with a product released in 2018.
Maybe I didn't write it clearly.
I don't need the full MT throughput of the 3900X. Mine is now undervolted; I lost 7% of Handbrake performance, and my wall power went down from 245 W to 195 W.
So if I were to buy a 10900K, I would probably enable full power while gaming (which won't hit the limit anyway) to satisfy my 144 Hz min-FPS gaming needs.
For Handbrake I would underclock and undervolt it to the efficient point on the perf/power curve.
Even at 4 GHz the MT throughput is enough for my desktop needs.

Why? I'm not. It'll have some extra L3 cache, but diminishing returns and all that.
Definitely. I am just saying I thought the same about the 9900K vs the 8700K.

Yes. Until Intel and/or board OEMs come clean about how much power the chip will really burn by default, we have to use reviews and user experience to inform us as to how hot the thing will get when not limited by an inadequate cooler. A 9900k can pull 160W or more without changing anything. A 10900k? We don't know. It'll be "up to" 300W, but again, I doubt that more than a few review boards will inflict that behavior. Maybe if you're a Day 1 buyer with a UEFI meant for reviewers, you'll get that. But it's going to be more than 125W (listed TDP), and probably more than the 160W we get from the 9900k today. 210W would not surprise me, and that's only if they keep MT clocks @ 4.7 GHz using default settings. If they try for higher clocks, expect that value to go up.

It takes a lot of cooling capacity to handle a 210W CPU, even if it isn't prone to hotspotting. I was barely able to handle a load that high with an NH-D15 and really loud fans. This thing will be essentially AiO/custom water or bust.
Definitely, but to me as an enthusiast it looks more predictable.
In high-end gaming builds, the CPU takes a smaller share of the power than the GPU.
Let's be clear, the same applies to AMD.
I didn't expect my 105 W TDP 3900X on a B450 board to draw 245 W from the wall (at defaults, not undervolted) with pretty much the same components as my 6600K system.
Since the first Ryzen 3000 reviews I've had a bug in my head that I can't explain, and it's about expected power draw from the wall.
My 6600K draws 106 W at the wall while running Handbrake (undervolted), and the monitoring tool shows 49 W package power.
So: pure CPU power 49 W, VRM loss 13%, PSU loss 14% (low power means lower efficiency) => 49 / (0.87 * 0.86) = 66 W, meaning all the other components, with all the losses, consume about 40 W while the machine does a pure CPU job (Handbrake).
With pretty much the same components (I tried swapping the RAM and GPU between the systems, and the wall-power difference was 4 W), you would expect that if your system draws 245 W from the wall, the CPU/chipset/VRM combo burns the rest:
245 - 45 = 200 W.
Let's calculate: 200 * 0.87 (VRM) * 0.9 (better, more modern PSU) = 157 W.
But HWiNFO tells me 143 W.
I see this difference in all the internet reviews: based on the reported CPU power, the wall draw of Ryzen systems should be something like 15 W lower than it actually is.
15 W is not something you can cool passively without a massive heatsink.
I suspect the monitoring tools don't tell us everything. The Ryzen-based Surface laptop with the same core count supposedly burned less power according to the tools, yet the result was lower battery life.
There can be only one explanation for me: the CPU is efficient, the rest not so much.
Or I have an error in my calculations.
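A minimal sketch of the reconciliation I'm attempting, using my own assumed efficiency figures (87% VRM, 90% PSU) and the ~45 W rest-of-system estimate from above; none of these are measured values:

```python
# Back-of-envelope check: what package power does a wall reading imply,
# given assumed VRM and PSU efficiencies? All figures are estimates.
def implied_package_power(wall_w, other_w, vrm_eff, psu_eff):
    """CPU package power implied by wall power, minus other components."""
    return (wall_w - other_w) * vrm_eff * psu_eff

# 3900X under Handbrake: 245 W at the wall, ~45 W for the rest of the system.
implied = implied_package_power(245, 45, 0.87, 0.90)
reported = 143  # package power as shown by HWiNFO
print(f"implied {implied:.0f} W vs reported {reported} W "
      f"-> gap {implied - reported:.0f} W")  # gap of roughly 14 W
```

That mid-teens-watt gap between implied and reported package power is exactly the discrepancy I keep seeing in reviews.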
I am writing this because TDP, Intel's and AMD's alike, tells us nothing unless you understand the tech spec and the rules behind it (Intel's is defined at base clock; AMD's, I don't know).
However, when it comes to pure task energy, the new Ryzens are better; when it comes to a standard corporate workload (lots of idle, some medium loads), Intel's Cores do better IMO.
Who cares about AMD? It'll be a problem for Intel and their profits.
I do.
Intel needs a serious punch in the face, and there is no other market player to deliver it.
It will definitely hurt Intel's profits, but it will also hurt AMD's market-share gains.
 

DrMrLordX

Lifer
Apr 27, 2000
14,607
3,584
136
So the conclusion is: either way, you need an upper-mid-tier or better case with good ventilation or water cooling to handle any modern high-performance desktop CPU, be it Ryzens or Cores.
You can cool <insertrandomMatissechiphere> with a Wraith Prism or (even better) something like a Dark Rock Pro 4 or NH-D15, no problem. It won't get you an elite all-core OC, but the returns on high-capacity ambient cooling on those chips are just not there. Believe me, I threw everything I could at this 3900x and didn't get much to show for it. Maybe 100-200 MHz of all-core OC over someone with an NH-D15? Good for me, but not really necessary.

However

I can honestly tell you that a similar cooler will struggle with 200W or more, which is a mark the 10900k will easily eclipse in productivity workloads. For gaming, it'll probably use the same amount of power as a 9900k and produce the same performance, because games won't use the extra two cores.

Even at 4 GHz the MT throughput is enough for my desktop needs.
If that is true, then what you need is a 9900k with MCE enabled. It can easily hit MT clocks of 4.7 GHz and deliver similar performance as a 10900k @ 4 GHz. Not really sure where you're going with that.

But HWiNFO tells me 143 W.
The chip itself isn't pulling more than 142W, which means that's the most you'll have to handle with your HSF. And AMD flatly told everyone publicly that that's the max package power draw for any of the 105W TDP chips. You can do in-depth power measurements at the component level and you'll find that AMD is pretty much telling the truth. Sorry if your napkin math doesn't add up. Go bug Der8auer if you don't believe me.

15 W is not something you can cool passively without a massive heatsink.
Here's a hint: that power isn't being dissipated by the CPU or anything in the CPU package. Those power limits are real.

I do.
Intel needs a serious punch in the face, and there is no other market player to deliver it.
Don't worry. Intel is doing a good job of punching themselves in the face.
 

TheGiant

Senior member
Jun 12, 2017
639
243
86
You can cool <insertrandomMatissechiphere> with a Wraith Prism or (even better) something like a Dark Rock Pro 4 or NH-D15, no problem. It won't get you an elite all-core OC, but the returns on high-capacity ambient cooling on those chips are just not there. Believe me, I threw everything I could at this 3900x and didn't get much to show for it. Maybe 100-200 MHz of all-core OC over someone with an NH-D15? Good for me, but not really necessary.
But this is it: 200 MHz is the difference between the 12C 3900X and the 16C 3950X, and they have the same power rating.
At this level of the power curve we're playing with tens of MHz.

However

I can honestly tell you that a similar cooler will struggle with 200W or more, which is a mark the 10900k will easily eclipse in productivity workloads. For gaming, it'll probably use the same amount of power as a 9900k and produce the same performance, because games won't use the extra two cores.
Again, if you can cool a 3900X, you can cool a 10900K, provided you have a mid-to-upper-tier case with sufficient airflow.
Heat flux density is the key.

If that is true, then what you need is a 9900k with MCE enabled. It can easily hit MT clocks of 4.7 GHz and deliver similar performance as a 10900k @ 4 GHz. Not really sure where you're going with that.
No.
The 9900K (probably, it has to be confirmed) can't deliver the gaming min FPS I want, whereas the 10900K can, and it can also be configured as an efficient MT CPU.
I give power where I need it.
Gaming is where I need it (and while gaming it doesn't consume that much), while for MT it can be made more power efficient. So as I said, it's 50/50: the 3900X is less gaming performance but better task energy per job, and the 10900K will be more gaming performance but worse task energy per job.
But once you mix in the other components, the differences shrink.

The chip itself isn't pulling more than 142W, which means that's the most you'll have to handle with your HSF. And AMD flatly told everyone publicly that that's the max package power draw for any of the 105W TDP chips. You can do in-depth power measurements at the component level and you'll find that AMD is pretty much telling the truth. Sorry if your napkin math doesn't add up. Go bug Der8auer if you don't believe me.
Napkin math? That's it?
So instead of explaining how it actually adds up, you have this?
No, sorry sir, I don't need to come to AnandTech to find benchmark reposters and Cinebench runners; I can find those elsewhere.
If you have the correct calculation for how a 142 W CPU can result in 240 W from the wall, please explain.
I cannot find the answer. This is the only website I found that presents CPU-only AND whole-system power for the same workload: https://www.hardwareluxx.de/index.php/artikel/hardware/prozessoren/50163-amds-ryzen-7-3700x-und-ryzen-9-3900x-im-test.html?start=12
And it shows the same thing I found: there are about 15 W out there somewhere.
As the absolute power goes down, the difference gets more visible, like the 8700K consuming 16 W more from the CPU alone than the 3700X yet showing the same system power.
I cannot find the source of that power.
So AMD doesn't lie about the CPU power, but I suspect they are not telling us the whole story.
System power is what matters, not just the CPU.
 

krumme

Diamond Member
Oct 9, 2009
5,868
1,452
136
Gamers Nexus charts, which include CPU-limited high-refresh-rate gaming, have the 9900K OC delivering much more than 10 percent higher frame rates than a 3900X in a number of games.
How relevant will that be in just two years' time?
With the new consoles, will we still be primarily latency limited at the high end, or will it tilt even more toward throughput?
Testing a 2080 Ti at 1080p on current-gen CPUs doesn't give us an answer to that.
My bet is that the next BF titles will take their toll on CPU throughput. Already a 6C/12T AMD or Intel chip is throughput limited; I am surprised it's like that when we haven't even changed consoles yet.
Hopefully we have some interesting new games in front of us.
 

DrMrLordX

Lifer
Apr 27, 2000
14,607
3,584
136
Again, if you can cool a 3900X, you can cool a 10900K, provided you have a mid-to-upper-tier case with sufficient airflow.
Heat flux density is the key.
And I'm telling you, you're wrong. You can cool a 3900x with a stock NH-D15, and it'll handle stock operation just fine, even in stuff like Blender. You won't be able to handle a 10900k in an AVX2 workload (Blender) with an NH-D15. It'll throttle. Maybe you won't care since you only seem to want one for games . . . and sadly, the 10900k will offer zero real advantage there, most-likely.

No.
The 9900K (probably, it has to be confirmed) can't deliver the gaming min FPS I want, whereas the 10900K can, and it can also be configured as an efficient MT CPU.
I give power where I need it.
If the 9900k can't do it, then the 10900k probably won't, either. Nothing will. The 9900k still has the highest minfps of any x86 CPU on the market in CPU-limited scenarios (1080p, high fps). The 10900k doesn't look to change that. Maybe the limited boost range of 5.3 GHz will come into play, but it looks like it won't on any modern software (even games) that use too many threads.

Napkin math? That's it?
So instead of explaining how it actually adds up, you have this?
Yeah, sorry, your kludge math is "napkin math" since you didn't go to the trouble of getting out a multimeter and doing socket voltage/power testing. You can't run down software testing and then turn around and use wall power measurements and estimates to try to pretend that you're more accurate. But this is a 10900k thread so if you really want to fight over the socket power draw of an AM4 system, you can hit up the usual threads. Not gonna keep this going here.

Bottom line is, Intel never told anyone exactly how much power the 9900k was really going to use, and now from testing/experience, we know. It's gonna be the same way with the 10900k. You can try to draw false equivalences with some other CPU manufacturer to try to justify the problem (never mind that quibbling over 15W that isn't even being dissipated from the CPU package is ridiculous). If we want to guess at how much the 10900k will really use, you can take my previous posts as an idea, or we can do some simple "napkin math" based on clues Intel has given us. They were nice enough to list the 10900k's TDP (125W). The 9900k has a TDP of 95W. So maybe it's safe to assume that real power draw for the 10900k will be ~31.5% higher than the 9900k on that basis; sadly, the 8700k and 9700k have the same TDP, so that estimate may not be accurate at all. But if the 10900k is pulling 31.5% more power than a 9900k, that would put it somewhere around 215W in "real world" use, assuming mobo OEMs treat it the same way that they treated the 9900k. And I'll repeat: you do not want to cool a 215W workload with any HSF on the market. Unless you like bouncing off temp limits. Some people are okay with that.
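The TDP-ratio guess above can be written out as a quick sketch. Both the 160 W baseline (the commonly observed MCE-enabled 9900k draw) and the assumption that real draw scales with listed TDP are guesses, not anything Intel has published:

```python
# Guesswork: scale the 9900k's observed real-world draw by the ratio of
# listed TDPs (125 W for the 10900k vs. 95 W for the 9900k).
def scaled_draw(observed_w, tdp_new, tdp_old):
    """Extrapolate real-world power draw by TDP ratio."""
    return observed_w * (tdp_new / tdp_old)

ratio = 125 / 95                      # ~1.316, i.e. ~31.6% higher TDP
estimate = scaled_draw(160, 125, 95)  # 160 W = typical MCE-enabled 9900k draw
print(f"+{(ratio - 1) * 100:.1f}% -> ~{estimate:.0f} W")  # ~+31.6%, ~211 W
```

Which lands in the same 210-215 W ballpark either way, and that is more than any air cooler handles comfortably.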
 

TheGiant

Senior member
Jun 12, 2017
639
243
86
And I'm telling you, you're wrong. You can cool a 3900x with a stock NH-D15, and it'll handle stock operation just fine, even in stuff like Blender. You won't be able to handle a 10900k in an AVX2 workload (Blender) with an NH-D15. It'll throttle. Maybe you won't care since you only seem to want one for games . . . and sadly, the 10900k will offer zero real advantage there, most-likely.
About games, as I said, you may be right; we will see.
About the cooling, I think you are wrong.
There are two keys: the cooler and the case airflow.
If your case can handle the gaming power of the CPU (while gaming, 9900K and 3900X systems consume about the same between the CPU and the other components) plus a ~300 W 2080 Ti or so, you can handle the CPU power of a 10900K.
The heat flux density of Ryzen 3000 is higher than that of Skylake 14 nm.
Of course, a fully loaded 10900K plus a fully loaded GPU is a different story.
But I don't know why we are talking about Blender. If I need an MT powerhouse, I already have one, and neither the 3900X nor the 10900K is one.

If the 9900k can't do it, then the 10900k probably won't, either. Nothing will. The 9900k still has the highest minfps of any x86 CPU on the market in CPU-limited scenarios (1080p, high fps). The 10900k doesn't look to change that. Maybe the limited boost range of 5.3 GHz will come into play, but it looks like it won't on any modern software (even games) that use too many threads.
As I said, I thought the same about the 8700K vs the 9900K, and it turned out differently.

Yeah, sorry, your kludge math is "napkin math" since you didn't go to the trouble of getting out a multimeter and doing socket voltage/power testing. You can't run down software testing and then turn around and use wall power measurements and estimates to try to pretend that you're more accurate. But this is a 10900k thread so if you really want to fight over the socket power draw of an AM4 system, you can hit up the usual threads. Not gonna keep this going here.
I am sorry, man, I am not interested in your attempts to prove me wrong, nor in any other measurements.
I am asking how it adds up.
So if you don't have anything, just don't say anything.
You are right about the derailing; I am sorry, this is the Comet Lake thread, not the AM4 one.
I will make a new one.

Bottom line is, Intel never told anyone exactly how much power the 9900k was really going to use, and now from testing/experience, we know. It's gonna be the same way with the 10900k. You can try to draw false equivalences with some other CPU manufacturer to try to justify the problem (never mind that quibbling over 15W that isn't even being dissipated from the CPU package is ridiculous). If we want to guess at how much the 10900k will really use, you can take my previous posts as an idea, or we can do some simple "napkin math" based on clues Intel has given us. They were nice enough to list the 10900k's TDP (125W). The 9900k has a TDP of 95W. So maybe it's safe to assume that real power draw for the 10900k will be ~31.5% higher than the 9900k on that basis; sadly, the 8700k and 9700k have the same TDP, so that estimate may not be accurate at all. But if the 10900k is pulling 31.5% more power than a 9900k, that would put it somewhere around 215W in "real world" use, assuming mobo OEMs treat it the same way that they treated the 9900k. And I'll repeat: you do not want to cool a 215W workload with any HSF on the market. Unless you like bouncing off temp limits. Some people are okay with that.
I am not justifying anything. I couldn't care less about internet hero discussions. But I don't like double standards.
That's about it.
 
