
Discussion Intel Meteor, Arrow, Lunar & Panther Lakes + WCL Discussion Threads


Tigerick

Senior member
Wildcat Lake (WCL) Specs

Intel Wildcat Lake (WCL) is an upcoming mobile SoC replacing Raptor Lake-U. WCL consists of 2 tiles: a compute tile and a PCD tile. The compute tile is a true single die consisting of CPU, GPU and NPU, fabbed on the 18A process. Last time I checked, the PCD tile is fabbed on TSMC's N6 process. The two are connected through UCIe rather than D2D, a first from Intel. Expect a launch in Q1 2026.

| | Intel Raptor Lake U | Intel Wildcat Lake 15W | Intel Lunar Lake | Intel Panther Lake 4+0+4 |
|---|---|---|---|---|
| Launch Date | Q1-2024 | Q2-2026 | Q3-2024 | Q1-2026 |
| Model | Intel 150U | Intel Core 7 360 | Core Ultra 7 268V | Core Ultra 7 365 |
| Dies | 2 | 2 | 2 | 3 |
| Node | Intel 7 + ? | Intel 18A + TSMC N6 | TSMC N3B + N6 | Intel 18A + Intel 3 + TSMC N6 |
| CPU | 2 P-core + 8 E-core | 2 P-core + 4 LP E-core | 4 P-core + 4 LP E-core | 4 P-core + 4 LP E-core |
| Threads | 12 | 6 | 8 | 8 |
| Max CPU Clock | 5.4 GHz | 4.8 GHz | 5 GHz | 4.8 GHz |
| L3 Cache | 12 MB | 6 MB | 12 MB | 12 MB |
| TDP | 15-55 W | 15-35 W | 17-37 W | 25-55 W |
| Memory | 128-bit LPDDR5-5200 | 64-bit LPDDR5x-7467 | 128-bit LPDDR5x-8533 | 128-bit LPDDR5x-7467 |
| Max Size | 96 GB | 48 GB | 32 GB | 128 GB |
| Bandwidth | 83 GB/s | 60 GB/s | 136 GB/s | 120 GB/s |
| GPU | Intel Graphics | Intel Graphics | Arc 140V | Intel Graphics |
| Ray Tracing | No | No | Yes | Yes |
| EU / Xe | 96 EU | 2 Xe | 8 Xe | 4 Xe |
| Max GPU Clock | 1.3 GHz | 2.6 GHz | 2 GHz | 2.5 GHz |
| NPU | GNA 3.0 | 17 TOPS | 48 TOPS | 49 TOPS |
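As a sanity check on the bandwidth row: peak theoretical bandwidth is just bus width times transfer rate. A quick sketch using the memory configs from the table (the function name is mine, not from any spec):

```python
# Peak DRAM bandwidth = bus width (bits) / 8 * transfer rate (MT/s).
def peak_bandwidth_gbs(bus_bits: int, mts: int) -> float:
    """Peak theoretical bandwidth in GB/s (1 GB/s = 1000 MB/s)."""
    return bus_bits / 8 * mts / 1000

configs = {
    "Raptor Lake U (128-bit LPDDR5-5200)":  (128, 5200),
    "Wildcat Lake (64-bit LPDDR5x-7467)":   (64, 7467),
    "Lunar Lake (128-bit LPDDR5x-8533)":    (128, 8533),
    "Panther Lake (128-bit LPDDR5x-7467)":  (128, 7467),
}

for name, (bits, mts) in configs.items():
    print(f"{name}: {peak_bandwidth_gbs(bits, mts):.1f} GB/s")
# Raptor Lake U -> 83.2, Wildcat Lake -> 59.7, Lunar Lake -> 136.5,
# Panther Lake -> 119.5 -- matching the (rounded) table figures.
```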






PPT1.jpg
PPT2.jpg
PPT3.jpg



With Hot Chips 34 starting this week, Intel will unveil technical details of the upcoming Meteor Lake (MTL) and Arrow Lake (ARL), the next-generation platforms after Raptor Lake. Both MTL and ARL represent a new direction in which Intel moves to multiple chiplets combined into one SoC platform.

MTL also introduces a new compute tile built on the Intel 4 process, Intel's first based on EUV lithography. Intel expects to ship MTL mobile SoCs in 2023.

ARL will come after MTL, so Intel should be shipping it in 2024; that is what Intel's roadmap tells us. The ARL compute tile will be manufactured on the Intel 20A process, Intel's first to use GAA transistors, which it calls RibbonFET.



LNL-MX.png
 

Attachments

  • PantherLake.png (283.5 KB · Views: 24,049)
  • LNL.png (881.8 KB · Views: 25,534)
  • INTEL-CORE-100-ULTRA-METEOR-LAKE-OFFCIAL-SLIDE-2.jpg (181.4 KB · Views: 72,443)
  • Clockspeed.png (611.8 KB · Views: 72,329)
Last edited:
In the 90s and early 2000s Intel was king. They owned the market and ended up creating the most open platform by accident. They do help in open source because it helps their business and attracts more clients.

If Intel truly cared about openness, x86 would be free like RISC-V. But they won't, as that's basically their core business.
So yeah, what exactly are AMD, Apple, and NVIDIA doing to contribute back to the personal computer and the industry in general? I'd always considered Apple to be the king of closed ecosystems, but NVIDIA's giving them good competition.
Apple is closed cause it’s a vertically integrated company, not many PC companies can afford that model. Nvidia is closed because they want to monopolise their products and IP. AMD is pretty open and uses open standards to counter Nvidia BS.

In terms of consumer products I’d say AMD and Apple contributed a lot. We got cheap Multicore processors because of AMD and we got efficient laptops because of Apple with M1 which led to Lunar Lake.

Now in 2026, Intel is far from its early-2000s dominance, when every single company depended on Intel for everything computer-related. Nowadays, the whole industry is very different.


Intel Invented DRAM.
No. That was Robert Dennard, who worked at IBM. Intel did commercialise it first, but that's far from inventing it.
 
In the 90s and early 2000s Intel was king. They owned the market and ended up creating the most open platform by accident. They do help in open source because it helps their business and attracts more clients.

If Intel truly cared about openness, x86 would be free like RISC-V. But they won't, as that's basically their core business.
No question that, same as any business, Intel ventures into uncharted territory are made with the intent of receiving returns on that investment. The key difference up until sometime in the Otellini era was that the business goal was to improve their product and market... not the stock price. With respect to the open platform being an accident I'd disagree - developing an open platform was their best option to gain market share from the closed platforms.

In terms of consumer products I’d say AMD and Apple contributed a lot. We got cheap Multicore processors because of AMD and we got efficient laptops because of Apple with M1 which led to Lunar Lake.

Now in 2026, Intel is far from its early-2000s dominance, when every single company depended on Intel for everything computer-related. Nowadays, the whole industry is very different.
Heh, AMD couldn't compete with Intel in single-thread, so they added more cores. Yes, AMD did coerce Intel into adding more cores to its consumer line in order not to fall behind in Cinebench. At the time the additional cores were useless for the majority of applications, but they were celebrated by the anti-Intel crowd as the best thing ever. Now, of course, that the tables have turned, we hear the opposite argument: that 6 or 8 cores are all anyone needs.

Meanwhile, I'd blame Apple for giving Microsoft the idea of modern connected standby, the bane of mobile PCs. I find the poor standby consumption that resulted, compared to the weeks of S3 we used to get, far, far more annoying.

Completely agree that the entire industry is very different, and in my opinion far worse off for it.
 
A 2 year foray into a new memory interface standard when there initially were no alternatives is hardly an attempt to force the industry... really just a result of Intel attempting to constantly improve their offerings and hence trying something new and unproven.
If RDRAM had been standardized, a single company (RAMBUS, Inc) would have had the entire industry by the balls, and AMD would have been paying licensing fees to produce chipsets that were not covered by the x86 cross-licensing agreement. Intel would have been able to tacnuke them at any time by having RAMBUS, Inc. kill their license.

Intel was not "more than happy" to dump RDRAM. They were forced to by off-brands like SiS producing dual DDR chipsets and humiliating Intel.
 
In the 90s and early 2000s Intel was king. They owned the market and ended up creating the most open platform by accident. They do help in open source because it helps their business and attracts more clients.

If Intel truly cared about openness, x86 would be free like RISC-V. But they won't, as that's basically their core business.
Just because they haven't open-sourced x86 doesn't mean they don't care about openness lmao. Intel is a top contributor to open source.
The issue is that Intel execs have noted that their competitors are leeching off their efforts, and Intel doesn't have the money to sustain it, so they are reducing their OSS effort.
 
If Intel truly cared about openness, x86 would be free like RISC-V. But they won't, as that's basically their core business.

Apple is closed cause it’s a vertically integrated company, not many PC companies can afford that model. Nvidia is closed because they want to monopolise their products and IP. AMD is pretty open and uses open standards to counter Nvidia BS.

People forget that the PC's "openness" is a historical accident. IBM didn't intend for it to be open, but they had to develop the PC on the cheap because IBM's C suite believed in big iron not desktop toys and gave the PC group a very small budget to work with.

So they cut corners by using industry standard parts already on the market, and made a last minute deal with Microsoft for DOS which ceded far too much control to them. By the time they decided "hey this thing is big" and wanted to take back control with their Microchannel architecture, companies who had been licensing the BIOS from IBM to build clones had built up a business they didn't want to give up, or pay IBM big bucks to license Microchannel. They were able to reverse engineer the IBM BIOS, and IBM had no way to prevent Microsoft from licensing DOS to those clones.

Had IBM maintained enough control in their deal with Microsoft that they could dictate who could and could not license DOS then the PC market would have been as closed as the Mac.
 
If RDRAM had been standardized, a single company (RAMBUS, Inc) would have had the entire industry by the balls, and AMD would have been paying licensing fees to produce chipsets that were not covered by the x86 cross-licensing agreement. Intel would have been able to tacnuke them at any time by having RAMBUS, Inc. kill their license.

Intel was not "more than happy" to dump RDRAM. They were forced to by off-brands like SiS producing dual DDR chipsets and humiliating Intel.
Heh, and why exactly would Intel have willingly handed that amount of control to Rambus? I'll happily vilify Rambus as they completely deserve it. Intel meanwhile... well, they really shouldn't have fallen for the promises of the snake oil salesman. (Here's this really great new memory interface that we want you to use that does x y and z so much better than what's currently available! Just sign this contract where you have to use it in all products you release from 1999-2001 and we'll throw in a million warrants for Rambus at $10 a share! It's a win win win!)

Anyway, if you want further insight into what actually happened some amount of information came out during the course of the Rambus v Micron & Hynix antitrust trial. A few fun tidbits from such are present in this roundtable discussion with trial counsel from each of the three parties - https://competition.scholasticahq.c...oomed-by-market-collusion-or-deficiencies.pdf Including one brief excerpt from the Micron counsel below.

Ken Nissly already provided a view of what Intel’s reasons were. There were
documents as far back as 1998, for example, where Mr. Mooring, Rambus’ president,
talked to some of Intel’s chief executives and reported back that Intel had already decided
that RDRAM would have no shot at being the next generation memory to be used in
2002 and beyond.

Intel reached this conclusion because it realized that Rambus was using Intel “as
a club” to coerce the DRAM manufacturers into expensive and unfavorable licensing
agreements. As a result, Rambus was generating a lot of ill will in the industry toward
Intel. Intel also realized that RDRAM had major technical and developmental problems
that Rambus’ engineers simply did not have the experience or expertise to solve; instead,
Rambus was happy to let Intel do all of the hard work and get a free ride.

Intel also believed, rightly, that Rambus had misrepresented the cost of RDRAM.
When convincing Intel to support RDRAM, Rambus had promised that the cost
increase of moving from SDRAM to RDRAM would be about 5%. However, by
RDRAM’s delayed introduction in late 1999, the cost increase was much higher, so
Rambus was losing credibility.
 
Heh, and why exactly would Intel have willingly handed that amount of control to Rambus?

To eliminate any competition from AMD, Cyrix/Centaur, etc. Plus to hamper any platforms dependent on JEDEC standards by forcing memory manufacturers to choose between DRAM and RDRAM. It's ancient history now, but let's not pretend that Intel has been a champion of open standards (particularly not open hardware standards).
 
To eliminate any competition from AMD, Cyrix/Centaur, etc. Plus to hamper any platforms dependent on JEDEC standards by forcing memory manufacturers to choose between DRAM and RDRAM. It's ancient history now, but let's not pretend that Intel has been a champion of open standards (particularly not open hardware standards).
Sure, feel free to ignore the many open hardware standards which Intel developed and championed. Instead, pick out a single instance of Intel deciding to try licensing a closed standard with superior specifications to the alternatives available at the time, and ascribe malicious intent to it, despite the fact that internal e-mails and testimony which came out in the course of the Rambus lawsuits document that Intel was just as frustrated by Rambus' deception and tactics as the rest of the industry. They were simply locked in by contract and by large investments in design and manufacturing of the platform. What were they going to do exactly? Not release the Pentium 4 because Rambus sucked?

I think some guys forget the first Slot A mainboards from e.g. Asus coming in white unbranded boxes because they feared Intel…
Heh, I had one of those. It fell victim to the capacitor plague of the time after a year or so. I don't believe I've once claimed that Intel didn't protect the market it created vigorously. I'm mostly just saying that I prefer the generous overlord which Intel of the time was to the current crop of AMD and NVIDIA who are eagerly chasing the AI money fountain.
 
Heh, I had one of those. It fell victim to the capacitor plague of the time after a year or so. I don't believe I've once claimed that Intel didn't protect the market it created vigorously. I'm mostly just saying that I prefer the generous overlord which Intel of the time was to the current crop of AMD and NVIDIA who are eagerly chasing the AI money fountain.
Intel would do the same thing if they had a presence in AI
 
Sure, feel free to ignore the many open hardware standards which Intel developed and championed.
In light of the fact that they almost single-handedly destroyed silicon foundry competition in the United States AND tried to wipe out all competition in the PC/workstation/server market, yes, I think I will.
 
In light of the fact that they almost single-handedly destroyed silicon foundry competition in the United States AND tried to wipe out all competition in the PC/workstation/server market, yes, I think I will.
How is Intel responsible for destroying silicon foundry competition? If Intel was simply that good at fabs, it's not their fault.
 
How is Intel responsible for destroying silicon foundry competition? If Intel was simply that good at fabs, it's not their fault.
I might have agreed with you once upon a time, but watching them beg for CHIPS Act money later cast things in a different light. People probably wouldn't be complaining as much if they were still at the top of their game.
 
I might have agreed with you once upon a time, but watching them beg for CHIPS Act money later cast things in a different light. People probably wouldn't be complaining as much if they were still at the top of their game.
Back then the leading edge didn't cost as much as it does now, and every country subsidizes its foundries; it's near impossible to run a leading-edge foundry without government support.
Intel did invest the promised amounts in fabs, so they should have been handed the money they were due.
 
Intel attempted to close up the PC ecosystem by running their competition out of business and forcing OEMs/ODMs away from DRAM and into RDRAM. Fortunately, they failed.
Intel went with RDRAM starting in 1999 because moving from parallel to serial interfaces was viewed as the future at the time, and it promised to be faster than the SDR SDRAM of the day. The first DDR SDRAM specification was formalized in mid-2000, and I believe AMD's 760 chipset was the first to support it, in Q4 2000. Upon observing the actual performance of RDRAM, Intel was quite happy to drop it in favor of DDR SDRAM starting with the 845 chipset in Q3 2001. A 2 year foray into a new memory interface standard when there initially were no alternatives is hardly an attempt to force the industry... really just a result of Intel attempting to constantly improve their offerings and hence trying something new and unproven.
Guys, you are treating Intel as a homogeneous entity, like a single person, as we generally tend to.

Their leadership and management changed, and so did the way they do things. The Intel of the 1980s is not the Intel of the '90s and '00s, let alone of today. If I meet a friend from 20 years ago, they are still the same person; a company from 20 years ago with different leaders isn't the same company.
I believe Intel cores are disabled at the fabric level, per segment or per ring stop. The E-cores share one ring stop and a shared L2, so you can only disable the whole stop, which means all 4 cores and the L2 cache go together. P-cores have their own ring stops, so they can be disabled individually.

Nova Lake P-cores come in clusters, so you would likely only be able to disable 2 at a time.
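The granularity described above can be sketched as a toy model (illustrative only; the `RingStop` name and the 2P+4E topology are my assumptions, not Intel documentation):

```python
# Toy model of ring-stop-level core harvesting as described above
# (illustrative only; names and topology are assumptions).
from dataclasses import dataclass

@dataclass
class RingStop:
    cores: list          # core IDs hanging off this stop
    shared_l2: bool      # True for an E-core cluster with shared L2
    enabled: bool = True

    def disable(self):
        # Disabling a stop takes every attached core (and its L2) with it.
        self.enabled = False

# 2 P-core stops (1 core each) + 1 E-core cluster stop (4 cores, shared L2)
ring = [
    RingStop(cores=["P0"], shared_l2=False),
    RingStop(cores=["P1"], shared_l2=False),
    RingStop(cores=["E0", "E1", "E2", "E3"], shared_l2=True),
]

def active_cores(stops):
    return [c for s in stops if s.enabled for c in s.cores]

ring[2].disable()          # E-cluster stop off: all 4 E-cores go at once
print(active_cores(ring))  # ['P0', 'P1']
```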
It's their own chip and they fabricate the thing. They can disable or enable anything they wish on that chip. In the 486, Pat Gelsinger had his own signature etched into the die.

If they wanted, they could integrate hardware that plays a fart sound every time you boot up a computer with a Celeron chip.
If Intel truly cared about openness, x86 would be free like RISC-V. But they won't, as that's basically their core business.
Companies change. And the bigger they get the more monopolistic tendencies they have. If/when RISC-V gets big enough they will do the same.

Also, it makes sense, since the primary purpose of a company is to make money. So a "benevolent" company like Steam and a "malicious" one like Nvidia have the same purpose. If the "benevolent" one goes bankrupt, then all the benevolence is worth exactly zero. You must be profitable first.

I would also argue Intel could afford to be open because back then it still had hundreds of times its market left to grow. Now they are on the opposite trend, if anything. So while they can do open source, they need to manage it so they don't lose more revenue.

The ones making most money now are the ones pissing off people(Mag 7). Remember how irrelevant all this talk is when the leaders are Epstein people. None of these companies are on your side.
 
Last edited:
Companies change. And the bigger they get the more monopolistic tendencies they have. If/when RISC-V gets big enough they will do the same.
Opening x86 can't be done by Intel alone.

@511

If there was going to be public money for foundries, then there should have been money for the remaining legacy foundries along with IP sharing agreements. But I've gone through that in other threads and we're veering off-topic.
 
Opening x86 can't be done by Intel alone.

@511

If there was going to be public money for foundries, then there should have been money for the remaining legacy foundries along with IP sharing agreements. But I've gone through that in other threads and we're veering off-topic.
I agree with you that there should be money for other foundries; we can discuss it in the foundry thread.
 