Discussion Intel Meteor, Arrow, Lunar & Panther Lakes + WCL Discussion Threads


Tigerick

Senior member
Apr 1, 2022
941
857
106
Wildcat Lake (WCL) Specs

Intel Wildcat Lake (WCL) is an upcoming mobile SoC replacing Raptor Lake-U. WCL consists of two tiles: a compute tile and a PCD tile. The compute tile is a true single die containing CPU, GPU and NPU, fabbed on the 18A process. Last time I checked, the PCD tile is fabbed on TSMC's N6 process. The tiles are connected through UCIe rather than D2D, a first for Intel. Expect a launch in Q1 2026.

| | Intel Raptor Lake-U | Intel Wildcat Lake (15W?) | Intel Lunar Lake | Intel Panther Lake 4+0+4 |
| --- | --- | --- | --- | --- |
| Launch Date | Q1-2024 | Q2-2026 | Q3-2024 | Q1-2026 |
| Model | Intel 150U | Intel Core 7 | Core Ultra 7 268V | Core Ultra 7 365 |
| Dies | 2 | 2 | 2 | 3 |
| Node | Intel 7 + ? | Intel 18A + TSMC N6 | TSMC N3B + N6 | Intel 18A + Intel 3 + TSMC N6 |
| CPU | 2 P-cores + 8 E-cores | 2 P-cores + 4 LP E-cores | 4 P-cores + 4 LP E-cores | 4 P-cores + 4 LP E-cores |
| Threads | 12 | 6 | 8 | 8 |
| Max CPU Clock | 5.4 GHz | ? | 5 GHz | 4.8 GHz |
| L3 Cache | 12 MB | ? | 12 MB | 12 MB |
| TDP | 15-55 W | 15 W? | 17-37 W | 25-55 W |
| Memory | 128-bit LPDDR5-5200 | 64-bit LPDDR5 | 128-bit LPDDR5X-8533 | 128-bit LPDDR5X-7467 |
| Max Memory | 96 GB | ? | 32 GB | 128 GB |
| Bandwidth | ? | ? | 136 GB/s | ? |
| GPU | Intel Graphics | Intel Graphics | Arc 140V | Intel Graphics |
| Ray Tracing | No | No | Yes | Yes |
| EU / Xe Cores | 96 EU | 2 Xe | 8 Xe | 4 Xe |
| Max GPU Clock | 1.3 GHz | ? | 2 GHz | 2.5 GHz |
| NPU | GNA 3.0 | 18 TOPS | 48 TOPS | 49 TOPS |
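As a quick cross-check of the bandwidth row: peak memory bandwidth is just bus width times transfer rate (my own arithmetic; figures taken from the table):

```python
# Peak memory bandwidth = bus width (bytes) * transfer rate (MT/s).
def bandwidth_gb_s(bus_bits, mt_per_s):
    """Theoretical peak bandwidth in GB/s (using 1 GB = 1000 MB)."""
    return bus_bits / 8 * mt_per_s / 1000

print(bandwidth_gb_s(128, 8533))  # ~136.5 GB/s, matching Lunar Lake's 136 GB/s cell
print(bandwidth_gb_s(128, 7467))  # ~119.5 GB/s for Panther Lake's LPDDR5X-7467
```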









With Hot Chips 34 starting this week, Intel will unveil technical information on the upcoming Meteor Lake (MTL) and Arrow Lake (ARL), the new generation of platforms after Raptor Lake. Both MTL and ARL represent the new direction in which Intel moves to multiple chiplets combined into one SoC platform.

MTL also introduces a new compute tile based on the Intel 4 process, which uses EUV lithography, a first for Intel. Intel expects to ship MTL mobile SoCs in 2023.

ARL will come after MTL, so Intel should be shipping it in 2024; that is what Intel's roadmap tells us. The ARL compute tile will be manufactured on the Intel 20A process, Intel's first use of GAA transistors, called RibbonFET.




Last edited:

adroc_thurston

Diamond Member
Jul 2, 2023
8,439
11,170
106
BTW, I meant Dell (rather than HP) not offering Strix Halo.
Dell barely started offering AMD in normal commercial.
Intel did other things, such as adding LP cores, adding an NPU (in order to stay in corporate PCs), and improving the iGPU.
that's nice. where's the CPU perf? this is 2026, not 2023. QCOM is breaching 4k GB6 1t.
Which makes it even more important that the design printed on the silicon is the most competitive, not stale.
no, it means you gotta run tf away from that market until you're out of breath.
 

mikegg

Platinum Member
Jan 30, 2010
2,110
653
136
I halved my CPU's clock speed in the BIOS and ran the test again. The LCP took 780 ms, which is only 7% longer than stock. So it looks like Chrome devtools CPU throttling is inaccurate, just not in the direction you expected.

As for GPU rendering, input, OS scheduling, etc., I don't see your point. We are discussing Intel laptop CPUs scoring lower in JavaScript benchmarks. They might also throttle the GPU or have bad input latency, but I've seen no reputable claims of that.
That's because the relationship between clock speed and performance isn't linear. Chrome dev tools literally divides JS performance by 4.

For GPU rendering, it’s relevant because we are talking about battery life while browsing here. Intel is going to throttle the entire chip, CPU and GPU.
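One way to see why halving the clock added only ~7% to the LCP: model the page load as a compute-bound portion (which scales with clock) plus a frequency-independent portion (memory stalls, network, paint). The split below is hypothetical, chosen only to reproduce the numbers reported above:

```python
def runtime_ms(freq_scale, compute_ms, fixed_ms):
    """Total time: the compute-bound part scales inversely with clock, the rest doesn't."""
    return compute_ms / freq_scale + fixed_ms

# Hypothetical split: 51 ms compute-bound, 678 ms frequency-independent.
stock = runtime_ms(1.0, 51, 678)    # 729 ms
halved = runtime_ms(0.5, 51, 678)   # 780 ms
print(f"{halved / stock - 1:.0%} longer")  # 7% longer, despite half the clock
```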
 

511

Diamond Member
Jul 12, 2024
5,394
4,816
106
It's better for Intel to keep working on an in-house Linux distro for their hardware, and to support and focus on it, rather than rely on Microsoft. Same for AMD; microslop has been making Windows bad.
 

511

Diamond Member
Jul 12, 2024
5,394
4,816
106
That's because the relationship between clock speed and performance isn't linear. Chrome dev tools literally divides JS performance by 4.

For GPU rendering, it’s relevant because we are talking about battery life while browsing here. Intel is going to throttle the entire chip, CPU and GPU.
Throttling is power-profile dependent: you don't get throttling in performance mode, you get it in balanced, so there is that.
 

mikegg

Platinum Member
Jan 30, 2010
2,110
653
136
Throttling is power-profile dependent: you don't get throttling in performance mode, you get it in balanced, so there is that.
Trace the replies back. No Youtuber ever tests battery life on performance mode. It's always the default mode.

PS. Windows does throttle on performance mode as well.
 

coercitiv

Diamond Member
Jan 24, 2014
7,470
17,850
136
PS. Windows does throttle on performance mode as well.
Let's do the data thing:

| Benchmark | X Elite: plugged in (Best Perf) | X Elite: battery (Best Perf) | X Elite: battery (Balanced) | Core Ultra 200V: plugged in (Best Perf) | Core Ultra 200V: battery (Best Perf) | Core Ultra 200V: battery (Balanced) | Ryzen AI Max+ Pro 395: plugged in (Best Perf) | Ryzen AI Max+ Pro 395: battery (Best Perf) | Ryzen AI Max+ Pro 395: battery (Balanced) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Geekbench 6: Single core (higher is better) | 2,379 | 2,283 | 2,309 | 2,721 | 2,699 | 1,599 | 2,837 | 2,647 | 2,224 |
| Geekbench 6: Multicore (higher is better) | 12,340 | 9,207 | 7,589 | 11,035 | 10,988 | 9,058 | 17,721 | 14,244 | 13,416 |
| Cinebench R23: Single core (higher is better) | 1,083 | 1,073 | 1,104 | 1,829 | 1,891 | 1,033 | 1,930 | 1,940 | 1,050 |
| Cinebench R23: Multicore (higher is better) | 10,130 | 11,102 | 9,941 | 8,431 | 7,952 | 7,875 | 29,469 | 24,208 | 19,733 |
| Handbrake video encoding (lower is better) | 06:50 | 11:30 | 08:09 | 07:26 | 08:50 | 08:44 | 02:36 | 03:27 | 03:37 |
| 3DMark Fire Strike (higher is better) | 5,800 | 5,795 | 4,965 | 8,462 | 7,412 | 7,787 | 23,459 | 16,151 | 15,593 |
| 3DMark Time Spy (higher is better) | 1,873 | 1,891 | 1,803 | 3,896 | 3,557 | 3,735 | 10,114 | 7,362 | 6,689 |

Let's take a look at GB6 Multicore, which is quite representative of a mixed workload running on a laptop (not ST, not massive MT either).

Performance retention on battery /w Balanced profile in GB6 MT:
Snapdragon X Elite: 61.5%
Intel Lunar Lake: 82%

Performance retention on battery /w Best Performance profile in GB6 MT:
Snapdragon X Elite: 74.6%
Intel Lunar Lake: 99.6%
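The retention figures can be reproduced directly from the GB6 MT numbers (a quick sketch; scores copied from the table above):

```python
# GB6 Multicore scores: (plugged in, on battery Best Perf, on battery Balanced).
scores = {
    "Snapdragon X Elite": (12340, 9207, 7589),
    "Intel Lunar Lake":   (11035, 10988, 9058),
}

def retention(on_battery, plugged_in):
    """Percentage of plugged-in performance retained on battery."""
    return round(100 * on_battery / plugged_in, 1)

for name, (plugged, batt_perf, batt_bal) in scores.items():
    print(name, retention(batt_perf, plugged), retention(batt_bal, plugged))
# Snapdragon X Elite: 74.6 / 61.5; Intel Lunar Lake: 99.6 / 82.1
```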


Intel has tried quite a few times in the past to dunk on AMD for the exact thing we're discussing here: they claimed their chips were superior because they offered more consistent performance on battery than AMD's. People didn't really care; users are willing to sacrifice some performance for stamina on battery, as long as performance is still adequate. (Ideally we want user choice. I like how macOS does this, with very good performance retention by default and additional profiles for better battery life and/or noise/thermals.)

The part that really sucks here is Windows' implementation of power profiles and power settings, which need to be changed in multiple places. One good thing Microsoft accomplished somewhat recently was to introduce separate "Plugged in" and "On battery" dropdowns for power modes; in the past there was only a "unified" selector that changed value depending on whether the unit was on battery or plugged in, which meant inexperienced reviewers did not even know which profile they were testing with.

Look at this, and prepare to facepalm:
(screenshot attached)
 
Last edited:

regen1

Senior member
Aug 28, 2025
351
430
96

Some U5 338H(B370/10Xe3) benchmarks from "Goldenpig Upgrade"

Gaming and Synthetics: (screenshots attached)

Cinebench: (screenshots attached)

Some "AI": (screenshot attached)
 
Last edited:

Covfefe

Member
Jul 23, 2025
101
179
76
That's because the relationship between clock speed and performance isn't linear. Chrome dev tools literally divides JS performance by 4.
I ran Speedometer stock and after throttling in the BIOS. I scored 39.6 and 21.2. Conveniently, that lines up almost exactly with the 54% that Panther Lake scores on battery.
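For what it's worth, the arithmetic checks out (a trivial sketch using the scores quoted above):

```python
# Quick check of the ratio claimed above (figures copied from the post).
stock, throttled = 39.6, 21.2
print(f"{throttled / stock:.1%}")  # 53.5%, in line with the ~54% on-battery figure
```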
For GPU rendering, it’s relevant because we are talking about battery life while browsing here. Intel is going to throttle the entire chip, CPU and GPU.
The fact that they're throttling the CPU in certain applications doesn't mean they're throttling the GPU. I'm sure they're power-limiting it, but so are AMD, Apple, and Qualcomm. There's no indication that Intel is throttling the GPU enough to make the experience choppy. If they were, reviewers would have mentioned it.
 
Last edited:

DavidC1

Platinum Member
Dec 29, 2023
2,138
3,273
106
Ultra 5 338H is the star of the show: 85% of the graphics performance, and still 50% faster than its predecessor.
Trace the replies back. No Youtuber ever tests battery life on performance mode. It's always the default mode.

PS. Windows does throttle on performance mode as well.
Balanced isn't necessarily slower than Performance. The Dell XPS 12 I had ran StarCraft II better on battery in Balanced than in Performance mode. Noticeably, actually: in the 25% range.
 
Last edited:
  • Like
Reactions: sgs_x86

DavidC1

Platinum Member
Dec 29, 2023
2,138
3,273
106
you mean 10?
I mean 8. There's a video comparing all those generations: 8.1, 10, and 11. Each is progressively more bloated, with more memory use and such.

Also, laptop battery life will vary depending on vendor, implementation, and tested settings too. I doubt the differences will be noticeable overall. I would bet the dual-screen implementation carries some overhead even when one screen is off. Laptops are finicky like that.
 

regen1

Senior member
Aug 28, 2025
351
430
96
Balanced isn't necessarily slower than Performance. The Dell XPS 12 I had ran StarCraft II better on battery in Balanced than in Performance mode. Noticeably, actually: in the 25% range.
For gaming, something similar was observed here, around the 13-minute mark:


(screenshot attached)

For this particular test it was more performant and quieter in "Balanced" mode.
 

DavidC1

Platinum Member
Dec 29, 2023
2,138
3,273
106
For this particular test it was more performant and quieter in "Balanced" mode.
I assume it's because of how Turbo Boost 2.0 (introduced with Sandy Bridge) works, which uses the unspent thermal headroom of the cooling system. Performance mode runs full throttle all the time, so the heatsink is on average warmer, while Balanced mode lets the SoC cool down, which has the reverse effect of making it faster.
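That mechanism can be sketched with a toy first-order thermal model (my own illustration; the constants are invented, not Intel's):

```python
# Toy model of Turbo Boost 2.0-style thermal headroom.
# All constants are made up for illustration; they are not Intel's.
HEATSINK_CAPACITY = 40.0       # J/K, thermal mass of the cooling system
COOLING = 0.5                  # W/K, heat shed per degree above ambient
T_LIMIT, AMBIENT = 95.0, 25.0  # degC

def sustained_watts():
    """Power the cooler can shed indefinitely while sitting at the thermal limit."""
    return COOLING * (T_LIMIT - AMBIENT)

def turbo_budget_joules(heatsink_temp):
    """Extra energy the heatsink can absorb before hitting the limit."""
    return HEATSINK_CAPACITY * (T_LIMIT - heatsink_temp)

# A profile that runs flat out keeps the heatsink at the limit: zero budget,
# so bursts are capped at the sustained level (35 W in this model).
print(sustained_watts(), turbo_budget_joules(T_LIMIT))
# A profile that lets the heatsink cool to 55 degC between bursts banks
# 1600 J of headroom, i.e. ~20 s at 80 W above the sustained level.
print(turbo_budget_joules(55.0))
```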
 

coercitiv

Diamond Member
Jan 24, 2014
7,470
17,850
136
No Youtuber ever tests battery life on performance mode. It's always the default mode.
The Dell XPS 12 I had ran Starcraft 2 better on battery Balanced than on Performance mode. Noticeably actually. In the 25% range.
Dell XPS 13 (X Elite, 1st gen) versus LNL, custom battery test with a developer workload, max screen brightness and, AFAIK, the Best Performance power mode (as claimed earlier in the video):

The Lunar Lake machine outlasted the X Elite model. Now, we could argue that we need to establish a proper baseline for such a test, such as the exact same display panel and battery, but as I already hinted in my previous post, even the power modes are not the same on many Windows devices.
 

mikegg

Platinum Member
Jan 30, 2010
2,110
653
136
Dell XPS 13 (X Elite, 1st gen) versus LNL, custom battery test with a developer workload, max screen brightness and, AFAIK, the Best Performance power mode (as claimed earlier in the video):

The Lunar Lake machine outlasted the X Elite model. Now, we could argue that we need to establish a proper baseline for such a test, such as the exact same display panel and battery, but as I already hinted in my previous post, even the power modes are not the same on many Windows devices.
These tests are meaningless without measuring the amount of work done.

The Qualcomm version could have done 50% more work before it ran out of battery. X Elite doesn't throttle (or not nearly as much as LNL) and it has much higher MT performance. So it could very well have done much more work than LNL in the same time.
 

DavidC1

Platinum Member
Dec 29, 2023
2,138
3,273
106
These tests are meaningless without measuring the amount of work done.

The Qualcomm version could have done 50% more work before it ran out of battery. X Elite doesn't throttle (or not nearly as much as LNL) and it has much higher MT performance. So it could very well have done much more work than LNL in the same time.
@coercitiv just showed you that Snapdragon throttles more than Lunar Lake in certain implementations.
 
  • Like
Reactions: coercitiv

DavidC1

Platinum Member
Dec 29, 2023
2,138
3,273
106
I would think at this point a separate GPU-oriented SKU for desktop with 12 Xe cores might have been interesting. Say a Core Ultra 5 245 had that for $430; that's $130 over the normal chip. On desktops, DDR5 reaches quite high clocks.
Lunar is better, interesting
I looked closer. The 388H is getting 7.7 with a 120 Hz display. Lunar Lake is getting 7.9, so that's a minor difference, not even 5%.
The efficiency comparison becomes interesting with lower TDP values for the Arc B390. The efficiency is almost at the level of the Apple M5 GPU even at 35 W, and this gets even better at 28 W and 20 W.
Same process, same efficiency. GPUs are embarrassingly parallel, so it's much easier to catch up; hence Apple doesn't get a decisive advantage in GPU versus competitors.
 
Last edited:

coercitiv

Diamond Member
Jan 24, 2014
7,470
17,850
136
X Elite doesn't throttle (or not nearly as much as LNL) and it has much higher MT performance.
I have already shown that the X Elite throttles in some laptop implementations. When using the Best Performance power mode, the Yoga Slim 7x retains only ~75% of its performance when switching to battery. We can also see the Balanced mode throttling the X Elite more than Lunar Lake.

A big part of this throttling is the OS and OEM custom implementation, so this will vary a lot from one device to another.