Discussion Qualcomm Snapdragon Thread


FlameTail

Platinum Member
Dec 15, 2021
2,356
1,276
106
Screenshot_20240424_193322_YouTube.jpg
What do you guys think about this launch speed advantage?
Screenshot_20240424_193633_YouTube.jpg
Look how the X Elite pulls higher than the X Plus at the top of the curve.
 

FlameTail

Platinum Member
Dec 15, 2021
2,356
1,276
106
Screenshot_20240424_193633_YouTube.jpg
Screenshot_20240424_193650_YouTube.jpg
X Plus' GPU consumes only 20W, about 33% less than X Elite's.

X Plus = 20W = 3.8 TFLOPS
X Elite = 30W = 4.6 TFLOPS

Clearly there has been a large frequency reduction: power consumption is cut by a third, while performance drops only ~17%.
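A quick sanity check of those percentages (a sketch in Python; the figures are the ones from the slides above):

```python
# GPU figures quoted above: power (W) and throughput (TFLOPS).
x_plus_w, x_plus_tflops = 20.0, 3.8
x_elite_w, x_elite_tflops = 30.0, 4.6

power_cut = 1 - x_plus_w / x_elite_w           # ~33% less power
perf_cut = 1 - x_plus_tflops / x_elite_tflops  # ~17% less performance

print(f"power: -{power_cut:.0%}, perf: -{perf_cut:.0%}")
print(f"TFLOPS/W: X Plus {x_plus_tflops / x_plus_w:.2f}, "
      f"X Elite {x_elite_tflops / x_elite_w:.2f}")
```

So per watt, the X Plus GPU actually comes out ahead of the X Elite's.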
 

FlameTail

Platinum Member
Dec 15, 2021
2,356
1,276
106
Screenshot_20240424_235727_Chrome.jpg
I have yet to confirm the authenticity of this slide, but it shows the X Elite consuming almost 70W in Cinebench 2024 MT, far higher than the 40W it draws in Geekbench 6 MT.
 

FlameTail

Platinum Member
Dec 15, 2021
2,356
1,276
106
Chips and Cheese just dropped an article reviewing the GPU in the Snapdragon 8cx Gen 3.


This is the first time they are reviewing a Snapdragon PC SoC. Enjoy!

PS: I hope they review the Snapdragon X Elite too, when it does arrive!

PS2: Damn, we really had a lot of Qualcomm news today!
 

ikjadoon

Member
Sep 4, 2006
121
175
126
Unsure if I missed the discussion on this, but Qualcomm interestingly ever so slightly lowered their memory speeds for the X Elite.

Compared to their October demos, however, there are a couple of important points to point out – areas where the chip specs have been downgraded. First and foremost, the peak dual core clockspeed on the chip (what Qualcomm calls Dual Core Boost) will only be 4.2GHz, instead of the 4.3GHz clockspeeds we saw in Qualcomm’s early demos. The LPDDR5X memory frequency has also taken an odd hit, with this chip topping out at LPDDR5X-8448, rather than the LPDDR5X-8533 data rate we saw last year. This is all of 85MHz or 1GB/second of memory bandwidth, but it’s an unexpected shift since 8448 isn’t a normal LPDDR5X speed grade.
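For what it's worth, that bandwidth delta roughly checks out, assuming the X Elite's 128-bit (8 x 16-bit) LPDDR5X bus (the bus width is my assumption, not from the quoted article):

```python
# LPDDR5X bandwidth at a given data rate, assuming a 128-bit bus
# (8 x 16-bit channels; bus width is an assumption, not from the article).
BUS_BITS = 128

def bandwidth_gbs(data_rate_mts: float) -> float:
    """Data rate in MT/s on a BUS_BITS-wide bus -> GB/s."""
    return data_rate_mts * 1e6 * BUS_BITS / 8 / 1e9

old = bandwidth_gbs(8533)  # original demo speed grade
new = bandwidth_gbs(8448)  # shipping speed grade
print(f"{old:.1f} -> {new:.1f} GB/s, delta {old - new:.2f} GB/s")
```

With that assumption the loss is on the order of 1.4 GB/s, the same ballpark as the article's "1GB/second" figure.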

The GB6.2 1T perf guidance has also been lowered, and Qualcomm is no longer mentioning its previous ~3.2K runs. These are all presumably in the 80W device TDP config.

October 23, 2023:
QCOM says Oryon hits 3227 pts (100%) - ? clocks on ?

October 30, 2023:
QCOM says SXE hits 2939 to 2979 pts (91% to 92%) - 4.3 GHz boost on Windows
QCOM device shows SXE hitting 3236 (100%) - 4.3 GHz boost on Linux

April 24, 2024:
QCOM says SXE hits 2850 to 2900 (88% to 90%) - 4.2 GHz boost on ?
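The percentages in that timeline can be reproduced against the original 3227-pt baseline (quick sketch):

```python
# Each claimed SXE score as a fraction of the original 3227-pt run.
baseline = 3227
runs = {
    "Oct 30, Windows (low)": 2939,
    "Oct 30, Windows (high)": 2979,
    "Oct 30, Linux": 3236,
    "Apr 24 (low)": 2850,
    "Apr 24 (high)": 2900,
}
ratios = {label: score / baseline for label, score in runs.items()}
for label, ratio in ratios.items():
    print(f"{label}: {ratio:.0%} of baseline")
```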

I'm all for limiting boost clocks, but it's funny that Qualcomm promoted this 3,227 score as beating its competitors, when the final shipping SoC will either be barely faster or may well lose:

Qualcomm-Snapdragon-X-Elite-vs-mac.jpg

Maybe only matching M2 Max, actually.

Qualcomm-Snapdragon-X-Elite-v-HX.jpg

Maybe actually quite a bit slower than a heavily boosted i9 Raptor Lake

Of course, the headline (much lower power at equal perf) is great, but that bottom chart is becoming more suspicious, particularly as the launch date moves closer and Qualcomm's internal benchmark scores keep getting lowered.

EDIT: 2024 typo
 
Last edited:

SpudLobby

Senior member
May 18, 2022
680
423
106
And this is why I have been saying all along that the X Elite G2 and its Pegasus core should not increase clock speeds, and should arguably decrease them. Pushing clocks up this hard is antithetical to a philosophy of efficient computing. In such a philosophy, performance gains should come from IPC increases, not clock-speed increases.
Clocks going up in line with node gains is fine; pushing further is not.
 

FlameTail

Platinum Member
Dec 15, 2021
2,356
1,276
106
During a Cinebench 2024 test, the M3 Pro on normal power mode used a max of 29w on the CPU & 17w on the GPU. While the 16-core M3 Max used a max of 54w on the CPU and 33w on the GPU.
The small M3 Pro with 11 CPU cores consumes around 24 watts (M2 Pro with 10 cores: around 27 watts); the M3 Pro with 12 CPU cores consumes about 27 watts (M2 Pro with 12 cores: around 34 watts). More performance with lower consumption naturally means significantly better efficiency, and the new M3 Pro is clearly the leader in this respect.

We have so far only been able to test the top-of-the-range version of the M3 Max with 16 CPU cores and its performance advantage over the M3 Pro and the old M2 Max is a whopping 54-56 %, meaning Apple has finally gotten rid of a major point of criticism that the old models had. Although its power consumption has also increased significantly (M3 Max up to 56 watts; M2 Max up to 36 watts), its efficiency is still slightly better than the M2 Pro/M2 Max — but the M3 Max didn't come close to the M3 Pro's efficiency in our multi-core tests.
So X Elite consumes more power than M3 Max in CB2024, while also delivering less performance.
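Put in relative terms (sketch; the ~70W figure is from the unverified slide above, the M3 Max numbers are the measured ones quoted here):

```python
# X Elite's rumored CB2024 power vs the M3 Max's measured CPU power.
# The 70 W figure comes from an unverified slide, so treat it as rough.
x_elite_w = 70
for m3_max_w in (54, 56):
    extra = x_elite_w / m3_max_w - 1
    print(f"vs {m3_max_w} W: {extra:.0%} more power")
```

So roughly 25-30% more power than the M3 Max's CPU, for less performance.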
 
Last edited:

SpudLobby

Senior member
May 18, 2022
680
423
106
ikjadoon said:
Unsure if I missed the discussion on this, but Qualcomm interestingly ever so slightly lowered their memory speeds for the X Elite. [...] Of course, the headline (much lower power at equal perf) is great, but that bottom chart is becoming more suspicious, particularly as the launch date moves closer and Qualcomm's internal benchmark scores keep getting lowered.

FlameTail said:
So X Elite consumes more power than M3 Max in CB2024, while also delivering less performance.
It’s the M3 Max. It has 4 more cores and is on N3.

Also, I would expect QC to do worse on CB24 power curves vs Intel and AMD than on GB6; it makes sense, since Cinebench is designed to benefit heavily from HT. GB5 MT would be more interesting.
 

coercitiv

Diamond Member
Jan 24, 2014
6,257
12,197
136
Of course, the headline (much lower power at equal perf) is great, but that bottom chart is becoming more suspicious, particularly when as the launch date moves closer, Qualcomm's internal benchmark scores keep getting lowered.
Making claims while the product is in the lab is easy. It gets increasingly complicated as the launch window narrows and the chips must match their specs in the wild. This is why people are raising eyebrows as QC continues its barrage of marketing slides that we're supposed to take at face value.
 


SpudLobby

Senior member
May 18, 2022
680
423
106
Apple does pretty well in CB2024;
The M3 Max (12P+4E) rivals the Ryzen 7945HX (16P/32T). Both do about ~1600 points.
If you would read, I addressed that. Apple has a CPU on a better node and with superior perf/W, and even Qualcomm beats AMD on a per-core basis by a wide margin, so that's not surprising. HT still gives AMD a significant boost in this app. Not hard.

Isn't GB5 deprecated?
Yeah, but the way they do MT in GB5 is different.
 

SpudLobby

Senior member
May 18, 2022
680
423
106
IMO the X Plus still looks fine. That 2400 GB6 ST score can be had for much less power than Hawk Point.

So real-world responsiveness can compete in the segments it's likely to target, and with vastly better battery life.

That said, it was still stupid to cut ST like this, but pricing and battery life will determine how much it hurts.