Question Alder Lake - Official Thread


deasd

Senior member
Dec 31, 2013
517
746
136
The really bad power scaling of the Celeron G6900 is evidence that something other than the GC cores is gulping up gobs of power. Is it possible that they fused off the other two GC cores in the die in such a way that they cannot function but still draw power during active usage of the functional cores?

I suspect silicon binning plays a big role here: the best silicon became the high-end models while the worse silicon went to the low end. Pentium and Celeron are the victims, and the i3s aren't good either if you look at the 12100F tested alongside them (higher power than the 10100F, which is also 4C/8T).
 
  • Like
Reactions: igor_kavinski

Exist50

Platinum Member
Aug 18, 2016
2,445
3,043
136
What is YOUR problem is more the question. I was commenting on what idiot would pay $200 more for the same CPU that uses even more power than the 12900K.
It’s not like we’ve never seen such products before. AMD’s entire FX 9000 series being the most egregious example, but to a lesser extent also chips like the Coffee Lake 8086k. And it’s very common on the GPU side. Don’t see what all the fuss is about, especially since it can at least claim to be the absolute best CPU.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,554
14,510
136
It’s not like we’ve never seen such products before. AMD’s entire FX 9000 series being the most egregious example, but to a lesser extent also chips like the Coffee Lake 8086k. And it’s very common on the GPU side. Don’t see what all the fuss is about, especially since it can at least claim to be the absolute best CPU.
The best Intel makes... Otherwise it's debatable.
 
  • Like
Reactions: Drazick

Abwx

Lifer
Apr 2, 2011
10,947
3,457
136
In MT it won't, unless they increase all 16 cores' frequencies by more than 9%, and of course we won't discuss power drain...

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
In MT it won't, unless they increase all 16 cores' frequencies by more than 9%, and of course we won't discuss power drain...


It trades blows with the 5950X; you can get different aggregates/geomeans depending on the benchmarks and apps used to come up with those rankings.

For example: THG https://cdn.mos.cms.futurecdn.net/VucudWo8bsQAWkjhMjNy2f-970-80.png.webp
and TPU: https://www.techpowerup.com/review/intel-core-i9-12900k-alder-lake-12th-gen/23.html

I think the 12900K (and by extension, the 12900KS) is competitive enough with the 5950X to basically call it a wash in MT, but it edges it in gaming and moderately threaded workloads, so overall it is technically 'the best' CPU in terms of performance even if the 5950X might edge it in some MT workloads. Call it a split decision, if I can use a boxing analogy. 2/3... it ain't a KO, but it's enough to get the win.

Now, is it worth the premium for a 12900KS? Probably not if you're after price/performance. But neither is an RTX 3090 vs a 3080 Ti, right? It's all relative. $200 means nothing to someone spending $5K on an insane gaming PC where only the 'best of the best' will do.
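As a rough illustration of how much the benchmark mix swings those aggregates, here is a quick sketch; the relative scores are made up for the example, not taken from either review:

from math import prod

def geomean(xs):
    # geometric mean, the usual way reviews aggregate relative scores
    return prod(xs) ** (1 / len(xs))

# hypothetical 12900K scores relative to a 5950X baseline of 1.00
scores = {
    "rendering":   0.92,  # heavy MT, favours the 16 big cores
    "encoding":    0.97,
    "compression": 1.01,
    "compiling":   1.05,
    "gaming":      1.10,  # lightly threaded, favours ST speed
    "office":      1.08,
}

mt_heavy = ["rendering", "encoding", "compression"]
print(round(geomean([scores[k] for k in mt_heavy]), 3))  # ~0.966 -> 5950X ahead
print(round(geomean(list(scores.values())), 3))          # ~1.02 -> 12900K ahead

Same chip, same numbers, and the 'winner' flips with the mix, which is basically the THG vs TPU situation.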
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,554
14,510
136
It trades blows with the 5950X; you can get different aggregates/geomeans depending on the benchmarks and apps used to come up with those rankings.

For example: THG https://cdn.mos.cms.futurecdn.net/VucudWo8bsQAWkjhMjNy2f-970-80.png.webp
and TPU: https://www.techpowerup.com/review/intel-core-i9-12900k-alder-lake-12th-gen/23.html

I think the 12900K (and by extension, the 12900KS) is competitive enough with the 5950X to basically call it a wash in MT, but it edges it in gaming and moderately threaded workloads, so overall it is technically 'the best' CPU in terms of performance even if the 5950X might edge it in some MT workloads. Call it a split decision, if I can use a boxing analogy. 2/3... it ain't a KO, but it's enough to get the win.

Now, is it worth the premium for a 12900KS? Probably not if you're after price/performance. But neither is an RTX 3090 vs a 3080 Ti, right? It's all relative. $200 means nothing to someone spending $5K on an insane gaming PC where only the 'best of the best' will do.
I totally agree, except for one thing everyone keeps forgetting: it does that at almost twice the power draw. So that's another point that makes it arguable whether the 12900K or KS is the best. It's the best at sucking power and generating heat; that's not arguable.
 
  • Like
Reactions: Drazick

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
I totally agree, except for one thing everyone keeps forgetting: it does that at almost twice the power draw. So that's another point that makes it arguable whether the 12900K or KS is the best. It's the best at sucking power and generating heat; that's not arguable.
Yeah, it's definitely not an efficient CPU, due to Intel pushing the voltage/frequency curve to (IMO) crazy levels to, well... 'get the win'. It's a win-at-all-costs strategy, probably dictated by the marketing team more than the engineering team.

Personally I would undervolt a 12900K to maintain close-to-stock clocks at more sane power levels. Intel even does this on their own chips: the all-core clock on a 12900 non-K is only 100MHz lower than the 12900K (4.9GHz vs 5.0GHz), but it has a PL1 of approx. 200W instead of 241W.

Though to be fair, I doubt those building these multi-thousand-dollar gaming rigs care too much about power efficiency. After all, an RTX 3090 is 350W+ and the rumoured TDP of a 3090 Ti is 450W! So an extra 150W on the CPU is hardly going to make them blink. Again, it's all relative. At the bleeding edge it's a totally different market, where efficiency and value for the dollar aren't high on the list of priorities.
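For what it's worth, on Linux you can at least read (and, with root, lower) those PL1/PL2 limits through the powercap interface. A minimal sketch, assuming the package shows up as intel-rapl:0 (the numbering varies), and keeping in mind the actual undervolting still has to happen in the BIOS or vendor tools:

from pathlib import Path

PKG = Path("/sys/class/powercap/intel-rapl:0")   # CPU package domain

def read_limits():
    limits = {}
    for c in ("constraint_0", "constraint_1"):   # long_term (PL1), short_term (PL2)
        name = (PKG / f"{c}_name").read_text().strip()
        limits[name] = int((PKG / f"{c}_power_limit_uw").read_text()) / 1_000_000
    return limits

def set_pl1(watts):
    # root only; board firmware may clamp or override this
    (PKG / "constraint_0_power_limit_uw").write_text(str(int(watts * 1_000_000)))

print(read_limits())   # e.g. {'long_term': 241.0, 'short_term': 241.0}
# set_pl1(200)         # cap sustained package power near the non-K level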
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,554
14,510
136
Yeah, it's definitely not an efficient CPU, due to Intel pushing the voltage/frequency curve to (IMO) crazy levels to, well... 'get the win'. It's a win-at-all-costs strategy, probably dictated by the marketing team more than the engineering team.

Personally I would undervolt a 12900K to maintain close-to-stock clocks at more sane power levels. Intel even does this on their own chips: the all-core clock on a 12900 non-K is only 100MHz lower than the 12900K (4.9GHz vs 5.0GHz), but it has a PL1 of approx. 200W instead of 241W.

Though to be fair, I doubt those building these multi-thousand-dollar gaming rigs care too much about power efficiency. After all, an RTX 3090 is 350W+ and the rumoured TDP of a 3090 Ti is 450W! So an extra 150W on the CPU is hardly going to make them blink. Again, it's all relative. At the bleeding edge it's a totally different market, where efficiency and value for the dollar aren't high on the list of priorities.
My 3080 Ti is rated at 400 watts from the factory! (EVGA GeForce RTX 3080 Ti FTW3 ULTRA GAMING.) I downed the wattage to 250, and for what I do (F@H) it's almost the same performance. And again, I agree. And for those who think I hate Intel, check the build thread where I recommended a 12700K for someone's gaming box.
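If anyone wants to script that kind of GPU cap rather than set it by hand, a sketch like this works on cards that expose it through nvidia-smi; setting the limit needs admin rights, the allowed range is board-specific, and it resets on reboot or driver reload:

import subprocess

def query_power(gpu=0):
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu),
         "--query-gpu=power.draw,power.limit,power.max_limit",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()   # e.g. "231.40 W, 400.00 W, 450.00 W"

def set_power_limit(watts, gpu=0):
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)], check=True)

print(query_power())
# set_power_limit(250)   # the ~250 W cap described above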
 
  • Like
Reactions: Drazick

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
My 3080 Ti is rated at 400 watts from the factory! (EVGA GeForce RTX 3080 Ti FTW3 ULTRA GAMING.) I downed the wattage to 250, and for what I do (F@H) it's almost the same performance. And again, I agree. And for those who think I hate Intel, check the build thread where I recommended a 12700K for someone's gaming box.

I can relate to that. My 3070 Ti runs at 320W stock, but I altered the voltage curve so that it only peaks at 210W now, and it still basically performs at stock levels, only ~3% off or thereabouts.

I don't think you hate Intel, Mark, but I think you value efficiency more than most DIY enthusiasts (understandable given what you do, FWIW), so I can see why you would favour a 5950X over a 12900K even though they perform similarly and are in the same price ballpark.
 
  • Like
Reactions: controlflow

Abwx

Lifer
Apr 2, 2011
10,947
3,457
136
It trades blows with the 5950X; you can get different aggregates/geomeans depending on the benchmarks and apps used to come up with those rankings.

For example: THG https://cdn.mos.cms.futurecdn.net/VucudWo8bsQAWkjhMjNy2f-970-80.png.webp
and TPU: https://www.techpowerup.com/review/intel-core-i9-12900k-alder-lake-12th-gen/23.html

I think the 12900K (and by extension, the 12900KS) is competitive enough with the 5950X to basically call it a wash in MT, but it edges it in gaming and moderately threaded workloads, so overall it is technically 'the best' CPU in terms of performance even if the 5950X might edge it in some MT workloads. Call it a split decision, if I can use a boxing analogy. 2/3... it ain't a KO, but it's enough to get the win.

Now, is it worth the premium for a 12900KS? Probably not if you're after price/performance. But neither is an RTX 3090 vs a 3080 Ti, right? It's all relative. $200 means nothing to someone spending $5K on an insane gaming PC where only the 'best of the best' will do.

For MT it loses more often than it wins, otherwise there wouldn't be that 9% difference at ComputerBase; TPU uses lightly threaded benches in their average and that's why they display only a 3% difference.

Back to system power: TPU states 179W for the 5950X and 297W for the 12900K, and that's in Cinebench. I wouldn't call such a CPU competitive, or else the FX-9590 was more than competitive against an i7-3770K.
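Just to put numbers on it, using the figures above (TPU's whole-system draw in Cinebench) and the disputed 3-9% MT gap:

power_5950x, power_12900k = 179, 297            # W at the wall, Cinebench MT

for gap in (0.03, 0.09):                        # 12900K trailing by 3% or 9%
    perf_5950x, perf_12900k = 1.00, 1.00 - gap  # arbitrary units
    ratio = (perf_5950x / power_5950x) / (perf_12900k / power_12900k)
    print(f"{gap:.0%} gap: 5950X does ~{ratio:.2f}x the work per watt")
# -> roughly 1.71x and 1.82x, i.e. 70-80% better perf/W at system level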
 
Last edited:

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
For MT it loses more often than it wins, otherwise there wouldn't be that 9% difference at ComputerBase; TPU uses lightly threaded benches in their average and that's why they display only a 3% difference.

Back to system power: TPU states 179W for the 5950X and 297W for the 12900K, and that's in Cinebench. I wouldn't call such a CPU competitive, or else the FX-9590 was more than competitive against an i7-3770K.

Again, if you are like Mark and value efficiency, then there is no argument to be had: the 5950X wins that category. The 12900K, and especially the 12900KS, uses a brute-force approach to get competitive MT performance...

I totally get your point, but surely you can acknowledge that there is a place for the 12900K/S where performance per watt isn't high on the agenda.
 

Abwx

Lifer
Apr 2, 2011
10,947
3,457
136
Again, if you are like Mark and value efficiency, then there is no argument to be had: the 5950X wins that category. The 12900K, and especially the 12900KS, uses a brute-force approach to get competitive MT performance...

I totally get your point, but surely you can acknowledge that there is a place for the 12900K/S where performance per watt isn't high on the agenda.

Possibly for games, since only the P cores will be put to work.

In MT that's a different story; even in compiling, where it is deemed very good (when 8 cores are enough), it doesn't hold a candle. What if I have several projects to compile and I launch several instances?
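As a concrete version of that scenario (the project paths and the jobs split are made up), launching several independent builds at once looks like this, and it's exactly the kind of load where every extra full core counts:

import subprocess
from concurrent.futures import ThreadPoolExecutor

projects = ["projA", "projB", "projC"]   # hypothetical source trees
jobs_each = 8                            # e.g. split 24 threads across 3 builds

def build(path):
    # one "instance": an independent parallel make in its own directory
    result = subprocess.run(["make", f"-j{jobs_each}", "-C", path],
                            capture_output=True, text=True)
    return path, result.returncode

with ThreadPoolExecutor(max_workers=len(projects)) as pool:
    for path, rc in pool.map(build, projects):
        print(path, "ok" if rc == 0 else f"failed ({rc})")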
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,554
14,510
136
Possibly for games, since only the P cores will be put to work.

In MT that's a different story; even in compiling, where it is deemed very good (when 8 cores are enough), it doesn't hold a candle. What if I have several projects to compile and I launch several instances?
Personally, I think the Alder Lake cores (P cores) are good for gaming and LIGHT multitasking, but for anything serious, the E cores make them a joke. For true multi-core use, all cores should be equal IMO.
 
  • Like
Reactions: Drazick

DrMrLordX

Lifer
Apr 27, 2000
21,629
10,841
136
It’s not like we’ve never seen such products before.

9900KS anyone?

What if I have several projects to compile and I launch several instances?

There are some workloads where having duplicate P cores instead of a bank of E cores will provide you with consistency, at the very least. Power users who want more cores to do more at once aren't necessarily going to be happy with the "let's just add more Gracemont" approach. If Intel had a healthy offering of HEDT parts, then the only issue would be one of total platform cost. But they don't even have that right now.
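A crude way to get that consistency today is to fence the work onto the P cores yourself. A sketch below; Linux-only (os.sched_setaffinity), and the CPU numbering is an assumption for an 8P+8E part that should be checked with lscpu --all --extended first:

import os

# assumed layout: logical CPUs 0-15 = P cores (with HT), 16-23 = E cores
P_CORES = set(range(0, 16))
E_CORES = set(range(16, 24))

def pin_current_process(cpus):
    os.sched_setaffinity(0, cpus)   # 0 = the calling process

pin_current_process(P_CORES)        # keep this workload off the E cores
print(sorted(os.sched_getaffinity(0)))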
 
  • Like
Reactions: Joe NYC

MadRat

Lifer
Oct 14, 1999
11,910
238
106
I'd think you'd want to rotate through the P cores for low-priority tasks and ramp up the number of active E cores for higher-priority work as the system demands. Unfortunately it doesn't sound like there is any performance boost between the two; it's more a binary question: if a P core cannot do it, then it has to route to an E core. So does that mean everything starts on a P core?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
The really bad power scaling of the Celeron G6900 is evidence that something other than the GC cores is gulping up gobs of power. Is it possible that they fused off the other two GC cores in the die in such a way that they cannot function but still draw power during active usage of the functional cores?

Probably just terrible sample and/or binning.

Heck, now I am thinking... even the Pentium Gold might suffer against a potential octa-core Pentium Silver chip.

I don't think they'll call the 8-core part Pentium Silver anymore. Also, the TDP range is definitely getting extended beyond 10W if that's the case.
 

dark zero

Platinum Member
Jun 2, 2015
2,655
138
106
Probably just terrible sample and/or binning.



I don't think they'll call the 8-core part Pentium Silver anymore. Also, the TDP range is definitely getting extended beyond 10W if that's the case.
I have to agree. That's because I see the Atom brand returning. So I'm expecting this lineup:
- Pentium "Silver" being the octa-core chips with a 15W TDP (potentially)
- Celeron "Silver" being quad-core with higher clock speeds and a 10W TDP
- the Atom brand being quad-core with lower clock speeds but going below a 6W TDP

Seems that Intel is planning something and now I am intrigued.
Also wondering if there might be a Raptor Lake-N with more cores.

Meanwhile... what happened to the full quad cores without HT? Why didn't those become the Pentium Gold chips, leaving the Celeron "Gold" with the dual cores of any kind?
 

Timorous

Golden Member
Oct 27, 2008
1,611
2,764
136
Surely if you are looking at a 12900K or a 5950X for MT workloads, you will choose based on what is better for your intended workloads rather than on a geomean of several disparate workloads.

Also, the 5950X scales pretty well with more power in MT workloads, so if you allow it more power headroom you do get a good performance boost.
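For reference, the stock headroom is modest to begin with: on AM4 the socket power limit (PPT) is 1.35x the rated TDP, which is what PBO raises.

tdp = 105                       # W, 5950X rated TDP
stock_ppt = round(tdp * 1.35)   # AM4 stock package power limit
print(stock_ppt)                # -> 142 W before any PBO adjustment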
 

DrMrLordX

Lifer
Apr 27, 2000
21,629
10,841
136
Also, the 5950X scales pretty well with more power in MT workloads

To a point. It's still on N7, and N7 CPUs hit a wall eventually. My 3900X hit a wall at around 4.4 GHz, for example, where power draw wasn't the enemy so much as out-of-control hotspot temperatures. Plus there's the whole unsafe voltage thing (which may be less of an issue on Vermeer; I honestly haven't researched it).

I think with a 12900K you can just put heavy watercooling on it and ramp up power to the moon. With a 5950X I would honestly be surprised to see anyone pushing more than 200-220W through it.