Discussion AMD's Future CPU-APU Gone ARM !!!


soresu

Diamond Member
Dec 19, 2014
3,299
2,565
136
The most efficient CPU in the world.
Efficiency is more than just single-core performance; it's also whole-CPU performance.

By that metric, AMD's Bergamo and the Neoverse V-based server/datacenter CPUs are likely the current front-runners.
 
  • Like
Reactions: FlameTail

soresu

Diamond Member
Dec 19, 2014
3,299
2,565
136
Iirc Exynos 2500 will use SF3, not SF3P.

Snapdragon 8 Gen 4 will bring a ~35% uplift over the Snapdragon 8 Gen 3's GPU performance in 3Dmark Wildlife Extreme.
Is this from actual benchmarks, or your perception of what it should be based on the semiconductor process shift?

Advertised node-to-node theoretical PPA gains are not absolutes that apply cleanly to complex real-world silicon designs.

They are at best an indication of gains in simplified logic and SRAM cells.
 

soresu

Diamond Member
Dec 19, 2014
3,299
2,565
136
It is. But the forum is dominated by long-time x86 DIY people. It's hard for them to admit that.
Plenty here are well versed on the rise of ARM in servers, especially the longtime lurkers and posters.

Also, IMHO, if there were more ARM DIY options, there would be plenty of those people around here too.

Alas, there is a dearth of PC/ATX-like options for ARM.

The closest thing is SBCs, which are either of limited capability or cost far more than they're worth.
 

FlameTail

Diamond Member
Dec 15, 2021
4,171
2,523
106
Is this from actual benchmarks, or your perception of what it should be based on the semiconductor process shift?

Advertised node-to-node theoretical PPA gains are not absolutes that apply cleanly to complex real-world silicon designs.

They are at best an indication of gains in simplified logic and SRAM cells.
Revegnus leaked a while back that the 8G4's Adreno 830 GPU will be a major architectural change, and that it will be 10% faster than the Apple M2's GPU.

8G2: 22 FPS
8G3: 32 FPS
M2: 40 FPS
Hence, 8G4: ~44 FPS

8G4 is very powerful. Is it thanks to the 3nm process? Adreno 830 holds a comprehensive 10% performance advantage compared to Apple M2's GPU. In 3DMark Wild Life Extreme, it achieved a score of 7200 points. CPU is also very powerful.
-Revegnus
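
A quick back-of-envelope check of how those numbers hang together (a sketch only, built from the rumored figures above; nothing here is measured data):

```python
# Sanity check of the leaked 3DMark Wild Life Extreme figures quoted above.
# All inputs are rumors from this thread, not measurements.
fps_8g3 = 32.0   # Snapdragon 8 Gen 3 (claimed)
fps_m2 = 40.0    # Apple M2 (claimed)

# Revegnus' claim: Adreno 830 (8 Gen 4) is ~10% ahead of the M2 GPU.
fps_8g4 = fps_m2 * 1.10                  # -> 44 FPS
uplift_vs_8g3 = fps_8g4 / fps_8g3 - 1.0  # -> ~0.375

print(f"Estimated 8 Gen 4: {fps_8g4:.0f} FPS, ~{uplift_vs_8g3:.0%} over 8 Gen 3")
```

That lands in the same ballpark as the ~35% uplift figure quoted earlier in the thread.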
 
Jul 27, 2020
20,586
14,301
146
It is. But the forum is dominated by long-time x86 DIY people. It's hard for them to admit that.
Give me something that runs all my x86 software at even Haswell speed, gives me a path forward with desirable native applications that run at blazing speed, and does all that at Intel Core 7 prices, and I'm ready to ditch x86. I'll be waiting years before such a product materializes, mostly coz every newcomer makes the mistake of not taking legacy application compatibility seriously. Until your brand spanking new "solution" to x86 woes can deliver 99.9% x86 compatibility, don't bother. Like, seriously. It will be forgotten the moment the next faster x86 CPU is released.

We are not illogical drones with a distortion field around us blinding us to facts. It's simply that all the competing products suck. Apple comes close, but with the prices they charge, they seem to think computers should remain restricted to people with deep pockets. And they don't care about x86 compatibility either. Qualcomm is making big claims. Let's see how long it takes before they decide they don't want to keep losing money trying to get people to switch to their platform while faster and faster x86 CPUs keep being released.
 
Last edited:

soresu

Diamond Member
Dec 19, 2014
3,299
2,565
136
As for the "ARM is great and will kill x86" line every 5 posts, it really gets tiring.
Certainly on the subject of Macs killing Windows... it just isn't gonna happen.

There's too much legacy use out there, from big and small businesses to consumers, entrenched in Windows for decades.

Plus the whole PC gaming scene - I don't see Mac ever eating into that to a significant degree any more than Linux.

For instance, I know people who work at Sellafield, a major part of the British nuclear energy apparatus.

Some of their software was locked to Windows XP until just before COVID, which meant that much of their infrastructure was also locked to WinXP.

They then moved to Windows 10 - I don't see them upgrading from that for at least a decade+, and I'm sure that there are many other businesses out there in similar circumstances.
 
  • Like
Reactions: igor_kavinski

NTMBK

Lifer
Nov 14, 2011
10,337
5,397
136
Plenty here are well versed on the rise of ARM in servers, especially the longtime lurkers and posters.

Also IMHO if the options for ARM DIY were greater there would be plenty of those around here too.

Alas there is a dearth of PC/ATX like options for ARM.

The closest thing is SBCs which are of limited capability or have cost far exceeding their worth.
If the rise of ARM finally kills ATX, I'll be all in favour of it! It's an awful form factor that makes zero sense for modern systems. Huge number of unused legacy IO slots, awful airflow to the GPU and CPU, badly positioned power supplies, cables running all over the place, and about 4X the volume it actually needs to be.

Bring on a new motherboard architecture, I say. Something DIY that's actually suitable for modern hardware.
 
  • Like
Reactions: Gideon

Glo.

Diamond Member
Apr 25, 2015
5,834
4,839
136
If the rise of ARM finally kills ATX, I'll be all in favour of it! It's an awful form factor that makes zero sense for modern systems. Huge number of unused legacy IO slots, awful airflow to the GPU and CPU, badly positioned power supplies, cables running all over the place, and about 4X the volume it actually needs to be.

Bring on a new motherboard architecture, I say. Something DIY that's actually suitable for modern hardware.
Bring forth a Mac Studio-like form factor.
I know it is hilarious, but yeah, this is what will have to happen.

SBCs should become more prevalent in the future, and much more powerful.
 

NTMBK

Lifer
Nov 14, 2011
10,337
5,397
136
I know it is hilarious, but yeah, this is what will have to happen.

SBCs should become more prevalent in the future, and much more powerful.
I don't want SBCs to replace DIY though; I just want DIY to be better. Sockets for big fat SoCs with wide memory buses, socketed memory, upgradeable storage.
 

soresu

Diamond Member
Dec 19, 2014
3,299
2,565
136
If the rise of ARM finally kills ATX, I'll be all in favour of it! It's an awful form factor that makes zero sense for modern systems. Huge number of unused legacy IO slots, awful airflow to the GPU and CPU, badly positioned power supplies, cables running all over the place, and about 4X the volume it actually needs to be.

Bring on a new motherboard architecture, I say. Something DIY that's actually suitable for modern hardware.
IMHO it won't be ARM that kills it, but rather the rise of photonic interchip IO and optical PCBs being introduced, starting with servers.
 
Jul 27, 2020
20,586
14,301
146
I just want DIY to be better. Sockets for big fat SoCs with wide memory buses, socketed memory, upgradeable storage.
I would prefer it if they created pluggable stuff that you can just push in, instead of having to worry about bending or even breaking a connector.

The mobo power connectors are just plain crappy. They are so hard to plug in properly. You need to expend too much force, and you're never sure they went all the way in coz there's no audible click. Then removing them is even more of a pain and is guaranteed to give you a thumb ache unless you happen to have a small and very strong thumb.

Heatsinks are crazy large.

The M.2 connector, while nice, NEEDS to be vertical instead of horizontal, so you can easily have more M.2 slots rather than three or four of them taking up the entire bottom portion of an ATX mobo.

Modern GPUs need their own special compartment. Run a PCIe riser cable to that compartment and keep the GPU isolated so it gets its own airflow system instead of its dumped heat causing issues for CPU/RAM/M.2 sticks etc.

It's just awful that we still have a decades old design for cases that need to accommodate components with much higher heat output. I'm sorry if anyone gets offended but engineers are just plain dumb!
 

SpudLobby

Senior member
May 18, 2022
991
684
106
You should at least check the past generations of Exynos, as listed below:

Exynos | Node | Prime CPU | Total cores | RAM | GPU | GPU arch | Clock | FP32
2200 | 4LPE | Cortex-X2 2.8 GHz | 1 + 3 + 4 = 8 | LPDDR5 | Xclipse 920 | RDNA2 6 CU | 1306 MHz | 1 TF
2400 | 4LPP+ | Cortex-X4 3.2 GHz | 1 + 2 + 3 + 4 = 10 | LPDDR5X | Xclipse 940 | RDNA3 12 CU | 1109 MHz | 3.4 TF
2500 | SF3 | Cortex-X5 3.2 GHz ? | 1 + 5 + 4 = 10 | LPDDR6 ? | Xclipse 950 | RDNA3+ ? | ? | ?
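
For what it's worth, the FP32 column is roughly consistent with the listed CU counts and clocks. A minimal back-of-envelope sketch, assuming 64 FP32 lanes per CU, 2 FLOPs per FMA, and counting RDNA3's dual-issue as a further 2x (these counting conventions are my assumptions, not official figures):

```python
# Rough check of the FP32 column above.
# Assumptions: 64 FP32 lanes per CU, 2 FLOPs per FMA, and an extra 2x
# for RDNA3 dual-issue (marketing-style counting) -- not official figures.
def fp32_tflops(cus: int, clock_mhz: float, dual_issue: bool = False) -> float:
    lanes = cus * 64
    flops_per_clock = lanes * 2 * (2 if dual_issue else 1)
    return flops_per_clock * clock_mhz * 1e6 / 1e12

print(f"Xclipse 920 (RDNA2, 6 CU @ 1306 MHz):  {fp32_tflops(6, 1306):.1f} TF")
print(f"Xclipse 940 (RDNA3, 12 CU @ 1109 MHz): {fp32_tflops(12, 1109, dual_issue=True):.1f} TF")
```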

Hmm, what surprises me is the number of CUs in the E2400. Could the E2400 also be gearing up for WoA laptops in the future???
Dude come on.

I’m fully aware they use RDNA.

This is for laptops. The 2500 specifically which is what I was talking about — I doubt it will use RDNA. AMD would protect their laptop market from Samsung and likely that was part of the IP agreement.


Come on guys.
 

Ghostsonplanets

Senior member
Mar 1, 2024
700
1,121
96
Dude come on.

I’m fully aware they use RDNA.

This is for laptops. The 2500 specifically which is what I was talking about — I doubt it will use RDNA. AMD would protect their laptop market from Samsung and likely that was part of the IP agreement.


Come on guys.
IIRC the original licensing agreement had a condition that Samsung couldn't use AMD's IP to compete directly in markets AMD operates in. So you're indeed correct.

The only choice then is to use Mali. If Samsung LSI pushing for WoA means that vendors push Arm to dramatically improve its software support and GPU IP to match the DX12.2 minimum feature level, so be it. That would be very advantageous to the Windows ecosystem in the long run.

The more customer choice, the better. And opening up the Windows ecosystem to being uArch-independent means others can enter it more easily, and Windows itself can adapt to other uArchs (RISC-V?) more easily and faster.
 

Tigerick

Senior member
Apr 1, 2022
701
628
106
Dude come on.

I’m fully aware they use RDNA.

This is for laptops. The 2500 specifically which is what I was talking about — I doubt it will use RDNA. AMD would protect their laptop market from Samsung and likely that was part of the IP agreement.


Come on guys.
Dude, I get what you meant now, and I totally don't understand the logic behind it. No one knows what's in the agreement, so the so-called protection of AMD's laptop market is just pure speculation from a discussion between me and @NTMBK here. Now he also wonders why AMD would make such a decision.

Please step back and think about when Samsung and AMD signed the agreement five years ago; I believe the main reason was that AMD and Samsung were going to release an SoC for WoA. Otherwise, why didn't Samsung use Mali from the start? You mean that after Samsung released two generations of SoCs with RDNA, Samsung will switch to Mali to support WoA and forgo the partnership with AMD? Does that make sense?

https://www.gsmarena.com/exynos_2500_specs_leak-news-61319.php. At least that article mentions the Xclipse 950; do you have a source that mentions anything about Mali?
 

SpudLobby

Senior member
May 18, 2022
991
684
106
Dude, I get what you meant now, and I totally don't understand the logic behind it. No one knows what's in the agreement, so the so-called protection of AMD's laptop market is just pure speculation from a discussion between me and @NTMBK here. Now he also wonders why AMD would make such a decision.

Please step back and think about when Samsung and AMD signed the agreement five years ago; I believe the main reason was that AMD and Samsung were going to release an SoC for WoA.
No it wasn’t, good lord.

Otherwise, why didn't Samsung use Mali from the start?
They did use Mali in their Exynos SoCs from the start.
You mean that after Samsung released two generations of SoCs with RDNA, Samsung will switch to Mali to support WoA and forgo the partnership with AMD? Does that make sense?
There are two Exynos parts:
The 2500 A for phones
The 2500 B for laptops and tablets.

If you read the latest rumors, you would understand that, and you wouldn't be flummoxed into claiming they're switching over to Mali entirely (which would indeed be confusing), as opposed to there being a tablet/laptop Exynos 2500 and a smartphone 2500.



A variant: Octa-core
B variant: Deca-core


https://www.gsmarena.com/exynos_2500_specs_leak-news-61319.php. At least that article mentions the Xclipse 950; do you have a source that mentions anything about Mali?


See above: if they're doing two different variants, that makes complete sense. I'm not claiming they're going to stop using RDNA for phones; I am claiming their tablet and laptop SoC variant will use Mali.

AMD has nothing to lose by licensing RDNA into the smartphone market. But licensing it to Samsung or any other Arm vendor for PCs, who could pair reference Arm IP (IP that has either caught up to or pulled ahead of Zen on perf/GHz, and is definitely ahead for fanless and low-power parts) with RDNA: how is that worthwhile to AMD?

It’s not. Not right now.

So anyway, there are two variants, which not only makes sense for prioritizing laptop and tablet power profiles but also gives them the ability to swap Xclipse for Mali, which I think they'll need to do. Most of the AMD guys here are (rightly) pretty skeptical that the contract permits Samsung to use RDNA for PCs. I don't think they're infallible, but I think they're right on this.
 
  • Like
Reactions: Tlh97

Tigerick

Senior member
Apr 1, 2022
701
628
106
No it wasn’t, good lord.


They did use Mali in their Exynos SoCs from the start.

There are two Exynos parts:
The 2500 A for phones
The 2500 B for laptops and tablets.

If you read the latest rumors, you would understand that, and you wouldn't be flummoxed into claiming they're switching over to Mali entirely (which would indeed be confusing), as opposed to there being a tablet/laptop Exynos 2500 and a smartphone 2500.



A variant: Octa-core
B variant: Deca-core





See above: if they're doing two different variants, that makes complete sense. I'm not claiming they're going to stop using RDNA for phones; I am claiming their tablet and laptop SoC variant will use Mali.

AMD has nothing to lose by licensing RDNA into the smartphone market. But licensing it to Samsung or any other Arm vendor for PCs, who could pair reference Arm IP (IP that has either caught up to or pulled ahead of Zen on perf/GHz, and is definitely ahead for fanless and low-power parts) with RDNA: how is that worthwhile to AMD?

It’s not. Not right now.

So anyway, there are two variants, which not only makes sense for prioritizing laptop and tablet power profiles but also gives them the ability to swap Xclipse for Mali, which I think they'll need to do. Most of the AMD guys here are (rightly) pretty skeptical that the contract permits Samsung to use RDNA for PCs. I don't think they're infallible, but I think they're right on this.
OK, let's put your theory in terms of dies. Based on your assumptions, we have three different dies:
  1. Exynos with RDNA for Android phone
  2. Exynos with Mali for WoA
  3. AMD ARM with RDNA for WoA
Samsung: Maintains two different SoC designs. Has to handle Mali graphics drivers.

AMD: Designs its own ARM SoC. Time to market will be late. Provides drivers for Android AND WoA.

Conclusion: Samsung would be selling Galaxy Books with Mali graphics. A lose-lose situation.



Based on my speculated situation, AMD and Samsung are going to form a deeper partnership, like Qualcomm/Samsung:
  1. Exynos with RDNA for Android phone
  2. Exynos with RDNA for WoA (most likely same die as I explained)
  3. AMD ARM with RDNA for WoA (modification of E2500, different dies)
Samsung: Maintains a single die design and lets AMD handle driver development, which is AMD's specialty.

AMD: Fast to market. AMD just modifies the CPU cores and fabs at SF. Expands RDNA IP into Android and other WoA devices. Potential integration of a 5G modem.

Conclusion: Samsung would be selling Galaxy Books with RDNA graphics. A win-win situation.

So which scenario makes more business sense, huh?