Discussion Intel Meteor, Arrow, Lunar & Panther Lakes Discussion Threads


Tigerick

Senior member
Apr 1, 2022

With Hot Chips 34 starting this week, Intel will unveil technical details of the upcoming Meteor Lake (MTL) and Arrow Lake (ARL), the next-generation platforms after Raptor Lake. Both MTL and ARL represent a new direction in which Intel moves to multiple chiplets combined into one SoC platform.

MTL also introduces a new compute tile built on the Intel 4 process, Intel's first node to use EUV lithography. Intel expects to ship MTL mobile SoCs in 2023.

ARL will come after MTL, so Intel should be shipping it in 2024, according to Intel's roadmap. The ARL compute tile will be manufactured on the Intel 20A process, Intel's first to use GAA transistors, branded RibbonFET.



LNL-MX.png

Intel Core Ultra 100 - Meteor Lake

INTEL-CORE-100-ULTRA-METEOR-LAKE-OFFCIAL-SLIDE-2.jpg

As reported by Tom's Hardware, TSMC will manufacture the I/O, SoC, and GPU tiles. That means Intel will manufacture only the compute (CPU) tile and the Foveros base tile. (Notably, Intel calls the I/O tile an 'I/O Expander,' hence the IOE moniker.)



Clockspeed.png
 

Attachments: PantherLake.png, LNL.png

511

Diamond Member
Jul 12, 2024
The part that should concern us is that Intel has chosen the lowest-performance, smallest possible CPU die for 18A only. It would be one thing if this were the very first part they'd ever done on 18A, but isn't Panther Lake, which is also supposed to use 18A, supposed to have been in full-rate production and on general sale for a good 6+ months by that point? Either they have very little confidence that 18A can hit the desktop performance targets they need, or they will be massively capacity constrained and will only make whatever can reach high volume on that in-house node.
This is on 18A-P, not 18A.
 

Magio

Member
May 13, 2024
LOL
when Intel changes products from 18A/AP to N2 - 18A is sh**
when Intel gives up foundry due to money problem - doesn't matter
Money problems, technical problems, what's the difference if the end result is that Intel Foundry is screwed in either case?
 
  • Like
Reactions: OneEng2

Josh128

Golden Member
Oct 14, 2022
18A / 18A-P are obviously shat, either perf-wise, capacity-wise, or in readiness compared to N2, or else Intel would be putting more on it. Panther Lake for laptops is likely going to eat up most of the limited capacity, so it makes sense that fewer desktop designs (lower margin per unit area) are done on it, especially if some performance is lacking.

Meanwhile, former Intel CEO Craig Barrett is calling for Trump to force US chipmakers to invest in a USA "fab consortium" to keep Intel fabs viable. Should make for a terrible hit on stock prices for all big hardware producers should this come to pass.

1754936586131.png
 

OneEng2

Senior member
Sep 19, 2022
18A is regular EUV. 18AP should be too.

14A will be the High NA node... Unless it gets canned.
Intel jumped all over High-NA equipment too. It just seems like Intel was desperate to reclaim process leadership by spending the most money on the most cutting-edge, next-gen equipment and processes.

It's also like they completely forgot how badly 10nm went when they tried a similar thing (arguably 10nm was considerably less risky than 18A, and certainly less risky than 14A).

We are supposed to see PTL this year. It is still possible that BSPDN and GAA deliver a very good PTL processor... albeit at lower clocks... but then laptops generally run at lower clocks anyway.

We will see soon enough.
 

DrMrLordX

Lifer
Apr 27, 2000
18A / 18A-P are obviously shat

18AP might be okay. It's gonna be late to the game, though.

Meanwhile, former Intel CEO Craig Barrett is calling for Trump to force US chipmakers to invest in a USA "fab consortium" to keep Intel fabs viable. Should make for a terrible hit on stock prices for all big hardware producers should this come to pass.

Probably the wrong thread for this but it's interesting news nevertheless.
 

Jan Olšan

Senior member
Jan 12, 2017
18A / 18A-P are obviously shat, either perf-wise, capacity-wise, or in readiness compared to N2, or else Intel would be putting more on it. Panther Lake for laptops is likely going to eat up most of the limited capacity, so it makes sense that fewer desktop designs (lower margin per unit area) are done on it, especially if some performance is lacking.

Meanwhile, former Intel CEO Craig Barrett is calling for Trump to force US chipmakers to invest in a USA "fab consortium" to keep Intel fabs viable. Should make for a terrible hit on stock prices for all big hardware producers should this come to pass.

View attachment 128609
If there is an industry worth saving with bailouts and public spending support, this is it, though.

All those "stargate" idiocies that they boast about are pretty stupid investments if they are willing to let Intel go the way of GloFo at the same time, because somehow everyone throws billions at any stupid idea but this one. Retaining the ability to develop and manufacture leading-edge silicon? No thanks, not important.
 

Josh128

Golden Member
Oct 14, 2022
If there is an industry worth saving with bailouts and public spending support, this is it, though.

All those "stargate" idiocies that they boast about are pretty stupid investments if they are willing to let Intel go the way of GloFo at the same time, because somehow everyone throws billions at any stupid idea but this one. Retaining the ability to develop and manufacture leading-edge silicon? No thanks, not important.
I agree that the fabs, provided they can keep up with TSMC, are significant to homeland security and defense; however, just throwing money at the problem doesn't guarantee a solution. Intel has to be able to produce state-of-the-art silicon at reasonable yields or it's a complete waste.

I can even see some type of taxpayer-funded, General Motors-style bailout (they got $49 billion) to get them right-footed, but coercing publicly owned companies into buying into it is authoritarian, communist stuff. The "shakedown" of Nvidia and AMD for 15% of their China sales is unprecedented in the previously "free market" US and basically amounts to shameful bribery.

The problem with Intel's fabs isn't just a matter of cash injection; it's the cutting edge of physics and materials science, and making progress there as well. It would be awesome if Intel came out with a High-NA EUV-powered 14A process that just knocks everyone's socks off, but if that doesn't happen, I'd be very curious to see what the end result will be.
 

OneEng2

Senior member
Sep 19, 2022
The problem with Intel's fabs isn't just a matter of cash injection; it's the cutting edge of physics and materials science, and making progress there as well. It would be awesome if Intel came out with a High-NA EUV-powered 14A process that just knocks everyone's socks off, but if that doesn't happen, I'd be very curious to see what the end result will be.
Intel has been getting bloodied on the "cutting edge", both with 18A and previously with 10nm.

I think they need significantly more conservative management to correct this bad trend.
 

LightningZ71

Platinum Member
Mar 10, 2017
Even in a world where Intel gets enough traffic on 18A and enough interest in 14A to finish developing it, I don't see how they are going to get to anything beyond that. The development costs and timescale for each successive node seem to be rising at an accelerating rate, and Intel, even with modestly improved fortunes, seems incapable of affording that on any timeline you pick. They will never command enough of a price premium over TSMC for as long as TSMC exists as a going concern, unless they uncover a breakthrough of epic proportions or someone writes them a blank check.
 
  • Like
Reactions: OneEng2 and oak8292

oak8292

Member
Sep 14, 2016
Even in a world where Intel gets enough traffic on 18A and enough interest in 14A to finish developing it, I don't see how they are going to get to anything beyond that. The development costs and timescale for each successive node seem to be rising at an accelerating rate, and Intel, even with modestly improved fortunes, seems incapable of affording that on any timeline you pick. They will never command enough of a price premium over TSMC for as long as TSMC exists as a going concern, unless they uncover a breakthrough of epic proportions or someone writes them a blank check.
This isn’t normal capitalism. This is innovation until the last man is standing. ASML already got there with lithography and TSMC is getting there with logic transistor manufacturing.

These are global markets with global demand and the country that ends up with the ‘last man standing’ appears to have an advantage. However both ASML and TSMC actually have global supply chains. A collapse of global supply and demand in a ‘war’ will kill both.

The semiconductor industry is ‘fragile’ and the bull in the China shop could easily break it and set the industry back.

The U.S. ‘invented’ the semi industry and still has an outsized presence in it. However, the demand is global, and it is increasingly driven by consumer demand versus military or commercial demand. TSMC moved to the front of the class essentially on smartphones. Mobile drove nodes from 28nm down to 7nm until AMD moved to TSMC, and mobile wafer volume meant that ‘trailing’ customers like AMD and Nvidia could get low-cost wafers.

It appears that Nvidia and AI may be a stronger driver going forward, but they may still need mobile for ‘pipe cleaning’ new nodes.

Intel was essentially a classic case of the ‘innovator's dilemma’. They held on to the margins they had in their old business and did not disrupt themselves. Mobile processors were low cost, and they did not want Atom disrupting the Core business or ARM disrupting x86. When ARM and TSMC took the low-end growth markets, Intel was in trouble. Intel knew it and reacted with both contra revenue and Intel Foundry 1.0, but they did not expect to fail as spectacularly as they did on 10nm.
 
  • Like
Reactions: igor_kavinski

DrMrLordX

Lifer
Apr 27, 2000
This is innovation until the last man is standing. ASML already got there with lithography and TSMC is getting there with logic transistor manufacturing.
Only for now. Eventually the limits of silicon will be reached, and that will give multiple parties the opportunity to "catch up". Silicon manufacturing (or whatever replaces silicon) will become a commodity business since everyone will be able to do it with enough up-front investment. Many of those costs will go down over time.
 

oak8292

Member
Sep 14, 2016
Only for now. Eventually the limits of silicon will be reached, and that will give multiple parties the opportunity to "catch up". Silicon manufacturing (or whatever replaces silicon) will become a commodity business since everyone will be able to do it with enough up-front investment. Many of those costs will go down over time.
The ‘end game’ will be interesting. We may already be close. It seems like energy is driving node progression now more than transistor costs. The GAA will provide a decent bump in energy reduction but CFET may not do much and the costs may be higher than the value.

High NA EUV may be a technology without a home if CFET happens and it can back off on linear scaling to go 3D stacking like NAND did.

If GAA N2/N16/14 nm are the ‘end’ it will still be hard to compete on price with the leader. The leader will already have depreciated assets with high volume of wafers. Trailing fabs will still need subsidies to be price competitive.
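For a rough sense of why a leader with written-off equipment is so hard to undercut, here is a purely hypothetical back-of-the-envelope sketch (all figures are invented for illustration only, not actual TSMC or Intel numbers):

```python
# Toy illustration of the "depreciated fab" cost advantage.
# All numbers are hypothetical and chosen only to show the shape of the argument;
# real fab cost accounting is far more complicated.

def cost_per_wafer(capex_usd, depreciation_years, wafers_per_month, variable_cost_usd):
    """Amortized capital cost per wafer plus variable cost (materials, labor, energy)."""
    wafers_over_depreciation = wafers_per_month * 12 * depreciation_years
    amortized_capex = capex_usd / wafers_over_depreciation if wafers_over_depreciation else 0.0
    return amortized_capex + variable_cost_usd

# A new leading-edge fab still writing off its equipment (hypothetical figures):
new_fab = cost_per_wafer(capex_usd=20e9, depreciation_years=5,
                         wafers_per_month=40_000, variable_cost_usd=6_000)

# The same fab once the tools are fully depreciated: only variable costs remain.
old_fab = cost_per_wafer(capex_usd=0, depreciation_years=5,
                         wafers_per_month=40_000, variable_cost_usd=6_000)

print(f"new fab:               ~${new_fab:,.0f} per wafer")
print(f"fully depreciated fab: ~${old_fab:,.0f} per wafer")
```

With these made-up inputs, the amortized capital charge more than doubles the per-wafer cost of the new fab relative to the depreciated one, which is roughly the gap a trailing fab would need subsidies to bridge.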
 
  • Like
Reactions: BorisTheBlade82

Doug S

Diamond Member
Feb 8, 2020
The ‘end game’ will be interesting. We may already be close. It seems like energy is driving node progression now more than transistor costs. The GAA will provide a decent bump in energy reduction but CFET may not do much and the costs may be higher than the value.

High NA EUV may be a technology without a home if CFET happens and it can back off on linear scaling to go 3D stacking like NAND did.

If GAA N2/N16/14 nm are the ‘end’ it will still be hard to compete on price with the leader. The leader will already have depreciated assets with high volume of wafers. Trailing fabs will still need subsidies to be price competitive.

You're assuming there isn't anything else that can change the cost per wafer. ASML's EUV strategy is clearly not sustainable: it already isn't clear that High-NA EUV is a win versus double-patterned standard EUV, and it may not be inserted until quad patterning would otherwise be necessary. I can't see Hyper-NA tools ever entering mass production unless they're fundamentally redesigned to break the cost spiral. There are potential alternatives, none of which are ready for prime time and most of which never will be: nanoimprint, DSA, FEL, e-beam. Eventually one of those will make it, and ASML's technology will be quickly rendered obsolete.

If ASML isn't researching ways to disrupt itself, it's gonna be disrupted. If nothing else, then eventually by China, which is probably investing heavily in all of these but, from what I've read, seems to view FEL as the best strategy in the long run. I recently read of some research that could massively reduce the size and cost of the linear accelerator needed to power the FEL, so that instead of having to share one accelerator between all the fabs on a campus, each fab, or perhaps each machine, would have its own. That would fundamentally alter the economics of a potential FEL solution.

Anyway, more complicated transistors like CFET don't depend on drawing finer lines; the enabling technology is deposition/etch based. With BSPDN allowing relaxation of M0 while still scaling, we're at least getting a temporary reprieve on that front. A laser- or e-beam-based solution might someday be able to replace some of the etching steps, who knows.
 

OneEng2

Senior member
Sep 19, 2022
Even in a world where Intel gets enough traffic on 18A and enough interest in 14A to finish developing it, I don't see how they are going to get to anything beyond that. The development costs and timescale for each successive node seem to be rising at an accelerating rate, and Intel, even with modestly improved fortunes, seems incapable of affording that on any timeline you pick. They will never command enough of a price premium over TSMC for as long as TSMC exists as a going concern, unless they uncover a breakthrough of epic proportions or someone writes them a blank check.
... and this is the strategy hole that Intel fell into, IMO. Intel made its mark (IMO) by maintaining process superiority and vertical integration. As you point out, this model has been running out of gas for well over a decade now. Intel just refused to pay attention and, in fact, doubled and tripled down on the model.
Only for now. Eventually the limits of silicon will be reached, and that will give multiple parties the opportunity to "catch up". Silicon manufacturing (or whatever replaces silicon) will become a commodity business since everyone will be able to do it with enough up-front investment. Many of those costs will go down over time.
Now, that's an interesting point. When the diminishing returns are forced by physics, perhaps the innovation will turn to less expensive ways to reach the same goal.... and the low price leader model will win out over the expensive innovator model?
 

DavidC1

Golden Member
Dec 29, 2023
Intel was essentially a classic case of the ‘innovator's dilemma’. They held on to the margins they had in their old business and did not disrupt themselves. Mobile processors were low cost, and they did not want Atom disrupting the Core business or ARM disrupting x86. When ARM and TSMC took the low-end growth markets, Intel was in trouble. Intel knew it and reacted with both contra revenue and Intel Foundry 1.0, but they did not expect to fail as spectacularly as they did on 10nm.
Intel failed BECAUSE of the Innovator's Dilemma. They should have organically shifted to cheaper, lower-power processors over time, so vendors could simply choose them, rather than Intel reacting as if a train had suddenly hit their operation. They resisted such transitions every single time. Also, the contra-revenue strategy was stupid, not just because of the enormous money spent, but because after doing all that they gave up anyway. Why start it at all, then?

Also, the Foundry 1.0 failures are entirely on them. That was Otellini's doing, but they weren't catering to customers, easy EDA flows, or porting at all. Why would anyone use Intel in that case? Foundry 1.0 was basically a hopeful side business, icing on the cake.
 

DavidC1

Golden Member
Dec 29, 2023
I agree that the fabs, provided they can keep up with TSMC, are significant to homeland security and defense; however, just throwing money at the problem doesn't guarantee a solution. Intel has to be able to produce state-of-the-art silicon at reasonable yields or it's a complete waste.

I can even see some type of taxpayer-funded, General Motors-style bailout (they got $49 billion) to get them right-footed, but coercing publicly owned companies into buying into it is authoritarian, communist stuff. The "shakedown" of Nvidia and AMD for 15% of their China sales is unprecedented in the previously "free market" US and basically amounts to shameful bribery.

The problem with Intel's fabs isn't just a matter of cash injection; it's the cutting edge of physics and materials science, and making progress there as well. It would be awesome if Intel came out with a High-NA EUV-powered 14A process that just knocks everyone's socks off, but if that doesn't happen, I'd be very curious to see what the end result will be.
Even in a world where Intel gets enough traffic on 18A and enough interest in 14A to finish developing it, I don't see how they are going to get to anything beyond that. The development costs and timescale for each successive node seem to be rising at an accelerating rate, and Intel, even with modestly improved fortunes, seems incapable of affording that on any timeline you pick. They will never command enough of a price premium over TSMC for as long as TSMC exists as a going concern, unless they uncover a breakthrough of epic proportions or someone writes them a blank check.
We're arguing about the wrong thing. There's all this talk of external customers, but unless the Mag 7 companies put a substantial portion of their silicon on Intel, the truth is that the biggest customer for Intel Foundry is still... Intel itself.

Then we must ask the question: why isn't Intel Product using their own foundries? Why is Nova Lake putting more than 50% of its silicon on external nodes? Forget the taxpayer-funded subsidies or forcing Apple/Nvidia to use Intel; that's not just stupid, it's brain-dead. Make Intel use Intel foundries FIRST before pushing those ideas.
 
  • Like
Reactions: Josh128

DavidC1

Golden Member
Dec 29, 2023
They can't afford to properly build out. The last node they did that for was 10nm and friends.
Then how would it ever be profitable? And what's the point of external customers? It's the same stupid question. If they cannot even serve themselves, who are big but not as big as TSMC, then they are done.

No, the question I'm asking is why people are missing the obvious: Intel is still a huge user of silicon, so why is 90% of Nova Lake's compute on TSMC, while people cling to the "muh external customers!!!" thinking, when most likely those external orders are, at least in the near term, only a fraction of what is lost by Intel itself going external.

If we're going to force anything, let's force INTEL to use their own foundries!

Maybe, MAYBE, the reason behind this schizophrenic thinking is simpler than we think. Actions rather than words, right? Maybe Intel's plan is for Foundry to die, or for the company to split, and everything else is a distraction.
 

jpiniero

Lifer
Oct 1, 2010
Then how would it ever be profitable?

Like I said, the only way it was going to work would be for Intel to get their 99% CPU Server market share back, and if not possible, to dump the fabs. Otherwise you'd have the Foundry killing the entire company.

That's where we are now.
 

DavidC1

Golden Member
Dec 29, 2023
Like I said, the only way it was going to work would be for Intel to get their 99% CPU Server market share back, and if not possible, to dump the fabs. Otherwise you'd have the Foundry killing the entire company.

That's where we are now.
Did you miss the discussion?

We're arguing about forcing Apple/Nvidia to use Intel, or about bailing Intel out.

How about we do something simpler and more obvious instead, like having Intel use their own foundries? If 18A or 14A is good enough to be worth the hassle of moving to Intel, why isn't Intel itself using it?

The original intention behind Intel going external was to streamline their roadmap. But apparently the company is in such a desperate situation that we're talking about potential bailouts or, heck, forcing others to use Intel. How about we force Intel to use Intel? We used to have crappy Intel designs propped up by a superior Intel process, right? How about forcing superior Intel designs to prop up a crappy Intel process, for the sake of "national security" or whatever other insane reason?
 

DrMrLordX

Lifer
Apr 27, 2000
The ‘end game’ will be interesting. We may already be close. It seems like energy is driving node progression now more than transistor costs. The GAA will provide a decent bump in energy reduction but CFET may not do much and the costs may be higher than the value.

High NA EUV may be a technology without a home if CFET happens and it can back off on linear scaling to go 3D stacking like NAND did.

If GAA N2/N16/14 nm are the ‘end’ it will still be hard to compete on price with the leader. The leader will already have depreciated assets with high volume of wafers. Trailing fabs will still need subsidies to be price competitive.

Now, that's an interesting point. When the diminishing returns are forced by physics, perhaps the innovation will turn to less expensive ways to reach the same goal.... and the low price leader model will win out over the expensive innovator model?

It will take time, maybe even a few decades, but eventually all the tools necessary will decline in price as patent encumbrance falls away and um, creative technology sharing (which always happens) chips away at the hegemony of those with the first-mover advantage. It's still going to take specialized supply chains and expert knowledge to build and operate a fab, but it won't necessarily take $10 billion in 2025 dollars to get started. It might be to the advantage of everyone that is not TSMC to encourage processes that bring about this conclusion.

Like having Intel use their own foundries? If 18A or 14A is good enough to be worth the hassle of moving to Intel, why isn't Intel itself using it?

14A doesn't exist yet so... anyway, as far as 18A is concerned, it may not be that good.

The original intention behind Intel going external was to streamline their roadmap. But apparently the company is in such a desperate situation that we're talking about potential bailouts or, heck, forcing others to use Intel. How about we force Intel to use Intel? We used to have crappy Intel designs propped up by a superior Intel process, right? How about forcing superior Intel designs to prop up a crappy Intel process, for the sake of "national security" or whatever other insane reason?

At this point, it's not clear who cares what happens to the design side of Intel except for Intel (and maybe Barrett). Eyes are mostly on the fabs elsewhere.