
Discussion Qualcomm Snapdragon Thread

Page 132
I wish I could say I'm surprised, but I'm not.

I think that ARM have gone nuts on licensing costs for post v8-A cores.

The fact that all the lower-end SoC ODMs seem to be something like 4-6 generations behind state-of-the-art IP tells a nasty tale.

Amlogic's A76 SoC came even later than RK3588 and on a cheaper node to boot.

Despite how late RK3588 was, there's still not been a peep about any future successor.

You sure that's ARM's fault? How big are these lower-end SoCs, and what process are they made on? It used to be that smaller processes cut your cost per transistor almost in half, but that's been declining for the past decade or more. Now, with N3, the cost per transistor is hardly decreasing at all, between the inability to shrink SRAM cells, the declining ability to shrink logic, and the higher per-wafer cost due to more EUV layers.
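The cost-per-transistor plateau can be sketched with a back-of-the-envelope model. Every number below (wafer prices, die counts, yields, densities) is a made-up illustrative assumption, not a real foundry figure:

```python
# Hypothetical cost-per-transistor comparison. All inputs are assumptions
# for illustration; real wafer prices and yields are confidential.

def cost_per_million_transistors(wafer_cost_usd: float,
                                 dies_per_wafer: int,
                                 yield_frac: float,
                                 mtransistors_per_die: float) -> float:
    """Dollars per million transistors, counting only good dies."""
    good_dies = dies_per_wafer * yield_frac
    return wafer_cost_usd / (good_dies * mtransistors_per_die)

# Assumed older node: cheaper wafers, lower density.
older = cost_per_million_transistors(6_000, 600, 0.9, 5_000)
# Assumed leading-edge node: density no longer doubles, but wafer cost
# soars (more EUV layers, SRAM barely shrinking).
newer = cost_per_million_transistors(17_000, 600, 0.9, 9_000)

print(f"older node: ${older:.4f}/Mtransistor")
print(f"newer node: ${newer:.4f}/Mtransistor")
# With these assumed numbers the newer node is no cheaper per transistor.
```

The point of the toy numbers: unless density gains keep pace with wafer price increases, shrinking stops saving the low-end vendors money.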

If ARM prices themselves out of the market, these lower-end SoCs will start using RISC-V cores. ARM doesn't want that kind of thing to get started, because once it does it will only snowball as RISC-V support in Android improves and more RISC-V core designs become available cheap (or even free, if someone like TSMC or Synopsys commissions the core designs and offers them to their customers as part of the package).
 
"As little as $700" is not going to fly for a lot of people.
The big question is whether they want to target this low-end market. X Elite competes well against real laptops, so, at least for now, their Chromebook-worthy days are over.

I see it repeatedly suggested that they have to have a cheaper option. Do they? If the mobile market is any indicator, people who buy a $1000 device will gladly buy it at $1100 or $1200, provided it is the best of the best. On the other hand, people on a budget hardly tolerate any increases, preferring to downgrade instead. I don't see how targeting the latter is a good strategy. Shipping silicon is not getting any cheaper.

Qualcomm are greedy bugs and always have been.
Do you mean ten years ago Qualcomm or 2024 Qualcomm?

I mean, it always strikes me as odd how much hate they get despite having changed a lot in the past few years (although they're still terribly bad on many fronts), while people seem much kinder to companies like Intel and Apple: the former has a borderline abusive relationship with its customers, and the latter is known for its 8GB = $200 joke.
 
Purwa based 8-core X Plus SoC leaked:
[Attached image: leaked spec sheet via GSMArena]
I am surprised it still has the same external display capabilities as the X Elite (3 × 4K60). I was expecting the Purwa die to be reduced to 2 × 4K60.

A previous leak from Android Authority suggested that the Purwa die would also have four fewer PCIe lanes and lesser video encode/decode capabilities (4K30/4K60 on Purwa vs. 4K60/4K120 on Hamoa).

 
Currently released SKUs:

X1E-00-1DE
X1E-84-100
X1E-80-100
X1E-78-100

X1P-64-100

Rumoured SKUs:

X1E-76-100

X1P-62-100
X1P-56-100
X1P-46-100
X1P-44-100
X1P-42-100
X1P-40-100
X1P-39-100

X1-24-100
X1-00-001

WHY ARE THERE SO MANY
 
If ARM prices themselves out of the market, these lower-end SoCs will start using RISC-V cores. ARM doesn't want that kind of thing to get started, because once it does it will only snowball as RISC-V support in Android improves and more RISC-V core designs become available cheap (or even free, if someone like TSMC or Synopsys commissions the core designs and offers them to their customers as part of the package).
Even without Google, Huawei and Xiaomi would likely support RISC-V and with that people would start to move to RISC-V
 
That's what THEY have to figure out.
Do they want this lower margin business or not. They can try and see how they like it.
Qualcomm has been on the market for a few years, mostly in this budget space. They probably know it better than you and me.

Also, take a look at sub-$700 laptops at Best Buy. Almost all of them ship with Alder Lake-U, 2.5-year-old parts at this point. It seems natural to see some of those X Plus Gen 1 parts eventually creeping into budget laptops, but I don't understand why many imply they have to rush there right now, especially given the razor-thin margins. If anything, it looks like they want to move away from it.
 
ARM doesn't want that kind of thing to get started, because once it does it will only snowball as RISC-V support in Android improves
The thing is that RISC-V suffers the same potential problem on Android that x86 and MIPS did.

Sure, a lot of apps are built with the JIT/AOT-bound ART SDK, but many others, especially the popular and perf-dependent ones, ship ARM64-only code via the NDK, and their devs won't ever make any effort to amend that without Google forcing them to.

Short of RISC-V-interested parties banding together and building a fast, accurate ARM64-to-RISC-V emulation layer, I don't see the situation changing in their favor any time soon.
 
Reddit being successful does not preclude these forums surviving. Your speculation about the Zen 5 thread is also incorrect, as the Apple, Meteor Lake, and Alder Lake threads all have millions of views too.

My contention is that the forums get more than enough traffic to pay for the servers and bandwidth.
May I ask how many views this thread has? Just curious. 😀
 
Compared to the 3.8 TFLOPS and the 4.6 TFLOPS X1-85 GPU variants, this little iGPU does not just run at lower clock speeds but also has fewer unified shaders at its disposal, with 768 or 1024 being the most likely number.
It's most likely 768.
X1-85: 4.6 TFLOPS, 1536 ALUs, 1.5 GHz, ~30 W
X1-85: 3.8 TFLOPS, 1536 ALUs, 1.25 GHz, ~20 W
X1-45: 1.7 TFLOPS, 768 ALUs, 1.25 GHz, ~10 W
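For what it's worth, those TFLOPS figures mostly line up with the usual peak-throughput estimate of ALUs × 2 FLOPs per cycle (one fused multiply-add) × clock; a quick sanity check, where the 2-FLOPs-per-ALU-per-cycle figure is the standard FMA assumption, not something Qualcomm has confirmed:

```python
# Peak FP32 throughput estimate: ALUs x 2 FLOPs per cycle (one fused
# multiply-add) x clock in GHz, converted from GFLOPS to TFLOPS.

def peak_tflops(alus: int, clock_ghz: float) -> float:
    return alus * 2 * clock_ghz / 1000.0

print(peak_tflops(1536, 1.5))   # 4.608 -> matches the 4.6 TFLOPS X1-85
print(peak_tflops(1536, 1.25))  # 3.84  -> matches the 3.8 TFLOPS X1-85
print(peak_tflops(768, 1.25))   # 1.92  -> the quoted 1.7 TFLOPS would
                                #          imply a clock nearer ~1.1 GHz
```

So either the X1-45's listed 1.25 GHz or its 1.7 TFLOPS rating is slightly off, assuming the standard FMA accounting.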
The underlying architecture is reportedly (ChipsAndCheese) in many respects the same as what was used in the Adreno 730
It's based on Adreno 740
The much faster 3.8 TFLOPS X1-85 iGPU delivers barely playable 23 fps in both Cyberpunk 2077 2.1 Phantom Liberty (1080p, Low) and Baldur's Gate 3 (1080p, Low). Since the X1-45 is going to be about half as fast at best, it's safe to say its gaming performance will only be sufficient for older games at resolutions such as HD 720p.
Yeah, you aren't going to be doing much gaming with this 1.7 TFLOPS GPU.
The Snapdragon X Plus 8-core is predicted to consume no more than 30 W, including the on-chip RAM. If true, it is highly unlikely that its iGPU will ever get to eat more than 20 W.
I estimate the iGPU will eat only about 10W. Very low power consumption, but the performance is also very low, so nothing to write home about.
 