Discussion: Qualcomm Snapdragon Thread

BOMBSHELL EXCLUSIVE NEWS


Microsoft will announce the next gen Surface Pro and Surface Laptop with Snapdragon X Elite next month.
These won’t be available till June. What’s the point of announcing 3 months early?
 
To make it known to the people who want the ARM version, so that they can wait for it?

The interesting thing is the ARM version (X Elite) will be both more powerful and more efficient than the x86 version (Meteor Lake). Isn't that remarkable?
 
Yep, Meteor Lake is not at all efficient when compared to the X Elite.
 

New Chips and Cheese article just dropped!

All of the SPs share a L2 cache, which sees capacity doubled to 256 KB.

Whaaaat? The 8+ Gen 1 iGPU only has 256 KB of L2 cache? That is less than the minimum amount of L2 cache dictated by ARM for the Mali G710.


Edit: Reading further down the article, it seems Qualcomm makes up for the smaller L2 by utilising a big 2 MB slice of GMEM.
 
Overall, it seems Qualcomm prioritises graphics in their Adreno GPUs, with less priority given to compute.

That makes sense, as mobile GPUs don't need much GPU compute power. Gaming is probably the single most GPU intensive task for 99% of smartphone users.

But what does this bode for Qualcomm's Snapdragon X processors and their renewed foray into the PC market?

As Nvidia and AMD have demonstrated, Graphics and Compute are equally important for a PC-class GPU.
 
This is confirmed by benchmarking RDNA2 vs Adreno:

[Screenshots: graphics and OpenCL compute benchmark results, Adreno vs Xclipse (RDNA2)]

Adreno comes out on top in the graphics benchmark, but Xclipse RDNA comes out on top in the OpenCL compute test.
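
For anyone wondering what an OpenCL compute test actually exercises, here is a minimal sketch of one (this is not the Geekbench workload; it is just an arbitrary FMA kernel written with the pyopencl package). It hammers the ALUs through the compute queue rather than the rasterization pipeline, which is why a compute ranking can come out differently from a graphics benchmark.

Code:
# Minimal OpenCL compute micro-test (illustrative only, not the Geekbench kernel).
# Assumes pyopencl and an OpenCL driver for the GPU are installed.
import time
import numpy as np
import pyopencl as cl

N = 16_000_000
a = np.random.rand(N).astype(np.float32)
b = np.random.rand(N).astype(np.float32)

ctx = cl.create_some_context()      # pick the GPU device
queue = cl.CommandQueue(ctx)

prg = cl.Program(ctx, """
__kernel void fma_loop(__global const float *a,
                       __global const float *b,
                       __global float *out) {
    int i = get_global_id(0);
    float x = a[i];
    for (int k = 0; k < 64; k++)    // dependent FMAs so the test is ALU-bound, not a memcpy
        x = x * b[i] + 1.0f;
    out[i] = x;
}
""").build()

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

t0 = time.perf_counter()
prg.fma_loop(queue, (N,), None, a_buf, b_buf, out_buf)
queue.finish()
elapsed = time.perf_counter() - t0

# 64 iterations x 2 flops (mul + add) per element
print(f"~{N * 64 * 2 / elapsed / 1e9:.1f} GFLOPS (rough; includes launch overhead)")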
 
I fully expect Apple to ignore the existence of WoA; they are past the need for making things like Boot Camp, and Windows works very well virtualized.

I also fully expect OpenCore to try, and with time manage, to boot macOS on Qualcomm hardware.
And of course Apple will try to block them.
Lmao
 
90+% of WoA interest has little to do with Apple Silicon Macs emulating it, or even running it in a Boot Camp mode; that would require Apple to rebuild Metal for Windows or adapt DX12 to their GPUs, which is not happening, ergo emulation only.

The interest is about having something similar to Apple’s chips in principle — AKA, *better* than what AMD and Intel can give us. Most won’t even consider macOS or Apple’s marginal pricing anyways, it’s a separate market of customers mostly, so it’s not about competing with Apple so much as doing what they do in architectural focus, but for the Windows market which has no such thing.

Qualcomm could ruin that by charging too much, but it doesn’t change that the directional effect of having multiple vendors for chips is going to be more reasonable pricing. You see this in Dell’s SSD or LPDDR5 prices vs Apple’s — Apple is in their own world. This is also why many won’t buy what they have but would buy a better Windows laptop.

WOA for Macs doesn’t mean a thing.
 

Nvidia has already established itself as the king of Cloud AI.

But the battle for the kingship in Edge AI is still ongoing.

Edge AI refers to the AI that runs locally on client devices like computers/laptops/phones as opposed to Cloud AI, which runs on servers.

This article details the various strengths of Qualcomm that put it in a position to potentially dominate Edge AI.
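
To make the "runs locally on client devices" part concrete, here is a rough sketch of edge inference with ONNX Runtime, requesting Qualcomm's QNN execution provider (the NPU path) and falling back to the CPU. The model path, input name, and shape are placeholders, and whether the QNN provider is actually available depends on the onnxruntime build shipped for the device.

Code:
# Hedged sketch of on-device (edge) inference; model path and input name are placeholders.
import numpy as np
import onnxruntime as ort

# Ask for the Qualcomm QNN execution provider first, CPU as fallback.
# "QNNExecutionProvider" is only present in onnxruntime builds that include it.
session = ort.InferenceSession(
    "model.onnx",
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],
)
print("Running on:", session.get_providers())

x = np.random.rand(1, 3, 224, 224).astype(np.float32)  # dummy image-sized input
outputs = session.run(None, {"input": x})               # "input" is a placeholder name
print(outputs[0].shape)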
 
Galaxy Book 4 Edge with Snapdragon X Elite has been spotted in Geekbench


That page also lists scores for the individual subtests, which you may want to have a look at if you're interested.
 
There is also a Lenovo device (83ED) with an X1E78100.


The single-core score is significantly lower (by more than the advertised 3.4 GHz vs 4.0 GHz frequency difference would explain).
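
A quick back-of-the-envelope way to check whether a gap like that is explained by clocks alone (the scores below are placeholders; plug in the actual numbers from the Geekbench pages):

Code:
# Does the single-core deficit exceed the clock deficit? Scores are PLACEHOLDERS.
low_clock, high_clock = 3.4, 4.0        # GHz, advertised single-core frequencies
low_score, high_score = 2200, 2800      # substitute the real Geekbench single-core scores

clock_ratio = low_clock / high_clock    # 0.85 -> naively expect ~85% of the score
score_ratio = low_score / high_score

print(f"expected from clocks: {clock_ratio:.0%}, observed: {score_ratio:.0%}")
if score_ratio < clock_ratio:
    print("deficit exceeds what frequency alone explains (early firmware, thermals, etc.)")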
 

Yeah that one is an outlier.

Also, what is "Insyde CRD"? LOL


I expect more devices to appear in this list leading up to the release of the Snapdragon X Elite in June.
 

Data published by Canalys shows that for Q4 2023, Samsung brought in 40 percent of Qualcomm’s Snapdragon chipset revenue. Xiaomi secured second place by accounting for 17 percent of the total revenue, but there is a major disparity between the two companies, and it needs to be addressed. In comparison, MediaTek has a more balanced outlook, with Samsung bringing in a quarter of its total smartphone chipset revenue, followed by Xiaomi at 17 percent.

This is quite frankly surprising, as Samsung has been gradually moving away from Snapdragon in their midrange and budget phones, yet they are responsible for 22% of Qualcomm's shipments and 40% of its revenue!?

Also the chart by Canalys is the real good stuff:

[Chart: Samsung accounts for 40 percent of Qualcomm's smartphone chipset revenue]
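
A quick sanity check on why those two numbers stand out: 22% of shipments against 40% of revenue implies the Snapdragon parts Samsung buys carry a much higher average selling price than Qualcomm's overall mix (flagship-heavy, presumably). A small sketch using the shares quoted above:

Code:
# Implied ASP premium from the Canalys Q4 2023 shares quoted above.
samsung_rev_share  = 0.40
samsung_unit_share = 0.22

# Samsung's average Snapdragon price vs Qualcomm's overall average ASP...
vs_overall = samsung_rev_share / samsung_unit_share
# ...and vs the average ASP across all other customers.
vs_others = vs_overall / ((1 - samsung_rev_share) / (1 - samsung_unit_share))

print(f"vs overall average ASP: {vs_overall:.1f}x")    # ~1.8x
print(f"vs other customers' ASP: {vs_others:.1f}x")    # ~2.4x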
 

Another Galaxy Book 4 Edge result has been spotted.

This one seems to be valid though, unlike the previous one, which was deemed 'invalid' due to some timer issue.
Feels somewhat sad that a low-power laptop chip is spanking my 5900X.
 

Do you guys think the Snapdragon X Plus is its own die, or derived from the X Elite's die?
 
Then what are those numbered sub-SKUs under X Plus and X Elite?

Different clock speeds? Different bin quality?
Different speeds and possibly lower/higher power versions. Bin quality plays a role there. I don't think it makes sense to make two chips; the second one would have to be much smaller, and at that point it would be a budget mobile chip (or a Chromebook chip at best).
 