Discussion Intel Meteor, Arrow, Lunar & Panther Lakes + WCL Discussion Threads

Page 944 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Tigerick

Senior member
Apr 1, 2022
939
850
106
Wildcat Lake (WCL) Specs

Intel Wildcat Lake (WCL) is an upcoming mobile SoC replacing Raptor Lake-U. WCL consists of two tiles: a compute tile and a PCD tile. The compute tile is a true single die containing the CPU, GPU, and NPU, fabbed on the 18A process. Last time I checked, the PCD tile is fabbed on TSMC's N6 process. The tiles are connected through UCIe rather than Intel's D2D interconnect, a first for Intel. Expecting a launch in Q1 2026.

|               | Intel Raptor Lake-U | Intel Wildcat Lake   | Intel Lunar Lake     | Intel Panther Lake 4+4+4      |
|---------------|---------------------|----------------------|----------------------|-------------------------------|
| Launch Date   | Q1-2024             | Q2-2026              | Q3-2024              | Q1-2026                       |
| Model         | Intel 150U          | Intel Core 7         | Core Ultra 7 268V    | Core Ultra 7 365              |
| Dies          | 2                   | 2                    | 2                    | 3                             |
| Node          | Intel 7 + ?         | Intel 18A + TSMC N6  | TSMC N3B + N6        | Intel 18A + Intel 3 + TSMC N6 |
| CPU           | 2 P + 8 E           | 2 P + 4 LP-E         | 4 P + 4 LP-E         | 4 P + 4 LP-E                  |
| Threads       | 12                  | 6                    | 8                    | 8                             |
| CPU Max Clock | 5.4 GHz             | ?                    | 5 GHz                | 4.8 GHz                       |
| L3 Cache      | 12 MB               | ?                    | 12 MB                | 12 MB                         |
| TDP           | 15-55 W             | 15 W?                | 17-37 W              | 25-55 W                       |
| Memory        | 128-bit LPDDR5-5200 | 64-bit LPDDR5        | 128-bit LPDDR5X-8533 | 128-bit LPDDR5X-7467          |
| Max Memory    | 96 GB               | ?                    | 32 GB                | 128 GB                        |
| Bandwidth     | ?                   | ?                    | 136 GB/s             | ?                             |
| GPU           | Intel Graphics      | Intel Graphics       | Arc 140V             | Intel Graphics                |
| Ray Tracing   | No                  | No                   | Yes                  | Yes                           |
| EU / Xe       | 96 EU               | 2 Xe                 | 8 Xe                 | 4 Xe                          |
| GPU Max Clock | 1.3 GHz             | ?                    | 2 GHz                | 2.5 GHz                       |
| NPU           | GNA 3.0             | 18 TOPS              | 48 TOPS              | 49 TOPS                       |
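The bandwidth figures follow directly from bus width and transfer rate. A minimal sketch of the arithmetic (peak theoretical numbers only; sustained real-world bandwidth is lower, and the LPDDR5-6400 speed used for Wildcat Lake below is an assumed example, not a confirmed spec):

```python
def peak_bandwidth_gbs(bus_bits: int, mts: int) -> float:
    """Peak theoretical DRAM bandwidth in GB/s: (bus width in bytes) x (MT/s) / 1000."""
    return bus_bits / 8 * mts / 1000

# Lunar Lake: 128-bit LPDDR5X-8533 -> 136.5 GB/s (the table's ~136 GB/s figure)
print(peak_bandwidth_gbs(128, 8533))  # 136.528
# Panther Lake: 128-bit LPDDR5X-7467 -> ~119.5 GB/s
print(peak_bandwidth_gbs(128, 7467))  # 119.472
# Wildcat Lake: 64-bit bus; if it shipped with LPDDR5-6400, that would give 51.2 GB/s
print(peak_bandwidth_gbs(64, 6400))   # 51.2
```

The halved bus width is why WCL sits a tier below the others regardless of which LPDDR5 speed bin it ends up with.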






PPT1.jpg
PPT2.jpg
PPT3.jpg



As Hot Chips 34 starts this week, Intel will unveil technical details of the upcoming Meteor Lake (MTL) and Arrow Lake (ARL), the new-generation platforms after Raptor Lake. Both MTL and ARL represent a new direction in which Intel moves to multiple chiplets combined into one SoC platform.

MTL also introduces a new compute tile based on the Intel 4 process, Intel's first built on EUV lithography. Intel expects to ship the MTL mobile SoC in 2023.

ARL will come after MTL, so Intel should be shipping it in 2024; that is what Intel's roadmap tells us. The ARL compute tile will be manufactured on the Intel 20A process, Intel's first to use GAA transistors, which it calls RibbonFET.



LNL-MX.png
 

Attachments

  • PantherLake.png (283.5 KB · Views: 24,041)
  • LNL.png (881.8 KB · Views: 25,529)
  • INTEL-CORE-100-ULTRA-METEOR-LAKE-OFFCIAL-SLIDE-2.jpg (181.4 KB · Views: 72,437)
  • Clockspeed.png (611.8 KB · Views: 72,324)
Last edited:

Khato

Golden Member
Jul 15, 2001
1,350
436
136
I bought my wife the OLED version with the AMD processor for $550 for Christmas. It has been amazing.... And the screen is something to behold 😊.
The OLED screens are indeed beautiful... for a short period of time, before the eye strain and headaches resulting from their horrid PWM kick in. I know it doesn't affect the majority of the population; it's just aggravating that they keep introducing new technologies that reintroduce the same known problem that was fixed previously. The same thing is present with many LED lights on the market: despite 120 Hz flicker being a known problem since the original magnetic-ballast fluorescent drivers, there are plenty of LED lights that don't perform adequate rectification and have the exact same 120 Hz flicker. When LCD monitors transitioned to LED backlights they introduced the same flicker problem, then slowly fixed it to the point that flicker-free became a selling point. Now with OLED there's the exact same flicker problem again. Some OLED TVs and monitors have started to address it to a potentially acceptable extent, but that's not yet the case for phones and laptops.

Sorry, I know that's more than a bit off topic, but I can't resist the urge to grumble about that particular subject.
 
  • Like
Reactions: OneEng2

DavidC1

Platinum Member
Dec 29, 2023
2,093
3,215
106
The mainstream Panther Lake chip is the Ultra 5 338H. I've never noticed real-life advantages from going above the 5-class chips; that's where you get the really good balance between performance and cost. Below that, the chips are too neutered, so you might as well go for something like Wildcat Lake; above that, the price doesn't justify the theoretical advantage, which is even smaller in the real world. Anything above the 5 class is for people for whom money doesn't matter, so they might as well get the best for bragging rights.

I've never noticed a bit of faster performance on my XPS i7. But the manufacturers force an upsell by bundling the faster chips with higher storage and RAM.
The OLED screens are indeed beautiful... for a short period of time, before the eye strain and headaches resulting from their horrid PWM kick in. I know it doesn't affect the majority of the population; it's just aggravating that they keep introducing new technologies that reintroduce the same known problem that was fixed previously. The same thing is present with many LED lights on the market: despite 120 Hz flicker being a known problem since the original magnetic-ballast fluorescent drivers, there are plenty of LED lights that don't perform adequate rectification and have the exact same 120 Hz flicker.
You can avoid flicker if you go with adapter-powered ones. It's the direct-AC bulbs like the screw-ins that have such issues, and the overhead LED lights as well. Flicker is a lesser-known issue, so many people ignore it. EMF effects on the body are even more hidden knowledge for modern electronics.

Incandescents also flicker, because they are connected directly to AC.
I don't! Lunar Lake was a very good processor for its market IMO.

Software compatibility trumps EVEN battery life!
Lunar Lake is close enough that Qualcomm chips don't matter. Pre-Lunar Lake, the 50% battery-life advantage may have moved more people toward WoA. Actually, in some cases Lunar Lake is better in battery life. If it weren't close enough, that couldn't happen.
I bought my wife the OLED version with the AMD processor for $550 for Christmas. It has been amazing.... And the screen is something to behold 😊.
I'm not a fan of the Yoga design anymore. I like the form factor, but they made the power switches in a way that breaks after a few years. I would also never get OLED because of burn-in. The thing is, my XPS 12's screen looked amazing... all for a month. Then you get used to it and it's like "meh". All having good stuff seems to do is make everything else look worse. So if the effect is short-term, battery life is potentially worse, and degradation is basically a ticking time bomb, why would I get OLED? Maybe if you are a graphics editor who needs actually correct color.
 
Last edited:

jdubs03

Golden Member
Oct 1, 2013
1,437
1,006
136
Apple is on top except when you have to run Windows, then it is last. Snapdragon can't really run Windows either.

So it's back to AMD and Intel if you want to be on Windows.
If only we had Boot Camp like in the olden days. It still has Parallels, though.
 

DavidC1

Platinum Member
Dec 29, 2023
2,093
3,215
106
If only we had Boot Camp like in the olden days. It still has Parallels, though.
There are many user-specific cases where you are reluctant to move to another platform just because of that. There are Excel files I need where nothing except the full version of Excel works; even Excel Online doesn't work. And of course transferring files and dealing with reinstallation is a hassle too. It disrupts your average workflow.

Microsoft is complacent for a reason. They know this.
 

Khato

Golden Member
Jul 15, 2001
1,350
436
136
Meh, current ones ship with OK enough rates.
Now, SDC gen1 OLEDs had 60 Hz PWM and those were PAINFUL.
Eh, I completely disagree that 240 Hz is adequate with current modulation. Even 1 kHz can be problematic if it's 100%-amplitude, low-duty-cycle modulation. Note that at 1 kHz the effect can no longer be visually observed, but the eye strain and headaches can still occur.

You can avoid flicker if you go with adapter-powered ones. It's the direct-AC bulbs like the screw-ins that have such issues, and the overhead LED lights as well. Flicker is a lesser-known issue, so many people ignore it. EMF effects on the body are even more hidden knowledge for modern electronics.

Incandescents also flicker, because they are connected directly to AC.
Heh, I've found plenty of adapter-powered ones with bad 120 Hz 'sawtooth' waveforms. Conversely, the Philips and Feit screw-in bulbs I can buy cheap at Home Depot show maybe a 1-2% amplitude variation and hence are just fine. I never measured an incandescent, but I'd expect a similarly low amplitude variation, if any, due to the different light-generation mechanism. (Edit: Yes, the primary reason I have an oscilloscope is to objectively measure light-output waveforms with a PIN photodiode... the applications for other electronics hobby projects are a bonus, of course.)
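The figure of merit being measured here can be made concrete. A minimal sketch of "percent flicker" (modulation depth), the standard metric used in IEEE 1789-style flicker assessments; both waveforms below are synthetic illustrations, not measurements:

```python
import math

def percent_flicker(samples):
    """Percent flicker (modulation depth): 100 * (max - min) / (max + min)."""
    hi, lo = max(samples), min(samples)
    return 100 * (hi - lo) / (hi + lo)

# Full-depth, 20%-duty-cycle PWM (typical of OLED brightness dimming):
# the light fully turns off each cycle, so depth is 100% at ANY frequency.
pwm = [1.0 if (i % 100) < 20 else 0.0 for i in range(10000)]
print(percent_flicker(pwm))  # 100.0

# A bulb with ~1% sinusoidal ripple at 120 Hz (like the well-rectified
# screw-in bulbs mentioned above), sampled over one second:
ripple = [1.0 + 0.01 * math.sin(2 * math.pi * 120 * t / 10000)
          for t in range(10000)]
print(round(percent_flicker(ripple), 1))  # 1.0
```

This is why frequency alone doesn't settle the argument: the PWM waveform stays at 100% modulation depth whether it runs at 240 Hz or 1 kHz, while a well-filtered DC driver sits near 1% regardless.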

I wouldn't be surprised if many more people are sensitive to the issue than realize it; they just associate eye strain and headaches with screen time without realizing it's only an issue because of the shortcomings of said screen.

Anyway, I'll see if I can refrain from further off-topic ranting. It's only slightly on-topic because every decent PTL laptop announced thus far has what's likely to be a problematic OLED display. Even with LNL there were only a few premium options that had quality IPS displays.
 
Last edited:

DavidC1

Platinum Member
Dec 29, 2023
2,093
3,215
106
Eh, I completely disagree that 240 Hz is adequate with current modulation. Even 1 kHz can be problematic if it's 100%-amplitude, low-duty-cycle modulation. Note that at 1 kHz the effect can no longer be visually observed, but the eye strain and headaches can still occur.
Yeah, I think they tested it up to several kHz and found effects, though more on an unconscious level. Human senses are relative. Decades ago, 60 Hz was enough; now people notice 144 Hz. Because they've been used to it for so long, anything lower is noticed.
Heh, I've found plenty of adapter-powered ones with bad 120 Hz 'sawtooth' waveforms. Conversely, the Philips and Feit screw-in bulbs I can buy cheap at Home Depot show maybe a 1-2% amplitude variation and hence are just fine. I never measured an incandescent, but I'd expect a similarly low amplitude variation, if any, due to the different light-generation mechanism. I wouldn't be surprised if many more people are sensitive to the issue than realize it; they just associate eye strain and headaches with screen time without realizing it's only an issue because of the shortcomings of said screen.
They probably brush it off as general health issues.

If you use a fast-capture mode on your camera/phone, you can see the effect. Incandescents do flicker. Maybe the effect is much less noticed because the color generation is natural, whereas LEDs have high blue-light content. I think the big problem is the trend toward lower and lower cost; there's just nothing to do except sacrifice quality.
Anyway, I'll see if I can refrain from further off-topic ranting. It's only slightly on-topic because every decent PTL laptop announced thus far has what's likely to be a problematic OLED display. Even with LNL there were only a few premium options that had quality IPS displays.
I don't like the OLED trend. I view them as similar to CFLs: a transitional technology that's iffy at best for most people. Incandescents crushed CFLs in color quality, and LEDs crush them in everything: color quality, efficiency, toxic materials.
 
  • Like
Reactions: Khato

Doug S

Diamond Member
Feb 8, 2020
3,789
6,714
136
Apple is on top except when you have to run Windows, then it is last. Snapdragon can't really run Windows either.

So it's back to AMD and Intel if you want to be on Windows.


This is why I keep repeating again and again that Apple does not consider Intel, AMD, and Qualcomm their competition, and vice versa. People decide whether to go Mac or Windows as their FIRST decision; only then do they consider hardware alternatives. If you need Windows, Apple is out of the running. If you want a Mac, the rest are out of the running.

Only if you are truly OS-agnostic and willing to go either way do you have the flexibility to decide between them all, but that's not a high percentage of buyers, and other factors like price or the breadth of configuration options tailored to your exact needs may knock Apple out. (The A-series MacBook may help Apple with those looking primarily at price, especially if it hits the lower range of speculated price points.)
 

Doug S

Diamond Member
Feb 8, 2020
3,789
6,714
136
If only we had Boot Camp like in the olden days. It still has Parallels, though.

Booting native made sense when it was x86; it doesn't when it is ARM, because you're forced to compromise on compatibility in a way that's out of Apple's control. Parallels makes sense since it handles the "I want a Mac but still have to run a few Windows-only apps" case; those Windows-only apps will in many cases also be x86-only.
 

adroc_thurston

Diamond Member
Jul 2, 2023
8,197
10,939
106
honestly Strix halo is very disappointing here
It's an RTRT bench (FFHW/latency-hiding test).
DF is dumb as usual and completely ignores the usage model of the devices in question.
1600p testing across a range of upscaled and native games would've been much, much more interesting (that's what laptops using StxH and PTL ship at).
 

ToTTenTranz

Senior member
Feb 4, 2021
893
1,490
136
honestly Strix halo is very disappointing here

Mostly because Alex Battaglia only tested RT, because that's all he's about.

Usually I'd say it's a stupid choice for testing iGPUs for laptops and handhelds, but then again the B390 is doing great.
I guess 2026 is the year RT became a reality for affordable integrated GPUs in the PC space. It just didn't come from AMD, nor is it coming anytime soon.
 
  • Like
Reactions: Tlh97 and Krteq

ToTTenTranz

Senior member
Feb 4, 2021
893
1,490
136
PC gets AT3/4 APUs doe.
These will be pretty good, and also built off commodity dice.
They're also at least a year and a half away, though.
People aren't going to hold their breath waiting for Medusa Halo / Premium if Panther Lake is available over a year before.

best of luck getting to gfx13, fellas.
You mean RDNA5 where AMD skimped on all the caches because "no one at AMD cares about gaming"?
We're supposed to be looking forward to those?
 

DavidC1

Platinum Member
Dec 29, 2023
2,093
3,215
106
You mean RDNA5 where AMD skimped on all the caches because "no one at AMD cares about gaming"?
We're supposed to be looking forward to those?
Saying Canis is 2x over PTL seems like overstating it, as things don't scale linearly. At the same power it might be, say, 50% or 70%.
 
Last edited:

adroc_thurston

Diamond Member
Jul 2, 2023
8,197
10,939
106
They're also at least a year and a half away, though.
yeah but it counts.
People aren't going to hold their breath waiting for Medusa Halo / Premium if Panther Lake is available over a year before.
See, that's the thing: 12Xe SKUs are meme volume.
The PTL volume driver is a 4+0+4, 4Xe part that's only ever relevant past the Q2'26 ramp.
Very good BL but unremarkable otherwise.
You mean RDNA5 where AMD skimped on all the caches because "no one at AMD cares about gaming"?
They have vastly more esoteric ways to hide latency.
I mean, amdgcn could theoretically run Linux before, so why not make it an actual reality?
We're supposed to be looking forward to those?
Yes, why not? They have a huge team of gfx IP people.
Saying Canis is 2x over PTL seems like overstating it, as things don't scale linearly. At the same power it might be, say, 50% or 70%.
AMD has really, really nice IP teams (you can do some LinkedIn-fu to see how many core gfx R&D people AMD scavenged from NV and Intel).
It'll be fun.
 

branch_suggestion

Senior member
Aug 4, 2023
874
1,920
106
Also basically 2 years away. Intel can have Xe4.
Still well behind RDNA4 currently, and no, there will only be Xe3P by the time GFX13 is out.
Medusa Premium will show just how compromised AMD iGPUs have been; reality comes for us all eventually.
They're also at least a year and a half away, though.
People aren't going to hold their breath waiting for Medusa Halo / Premium if Panther Lake is available over a year before.
MDSP obliterates PTL in CPU/GPU perf; it's not even remotely a fair comparison.
I'd personally wait 2 years for that, given the slowing iteration rate in laptops.
You mean RDNA5 where AMD skimped on all the caches because "no one at AMD cares about gaming"?
We're supposed to be looking forward to those?
Just because total cache area is reduced doesn't mean the thing is cache-starved or anything.
It just means they've improved caching effectiveness big time, which allows for more shaders in a given die area without needing additional memory bandwidth; a good PPA and BOM win.
 
  • Like
Reactions: Tlh97