Question x86 and ARM architectures comparison thread.


jdubs03

Golden Member
Oct 1, 2013
1,440
1,013
136
From that ComputerBase article on the M5, I would say this is the biggest issue that I’ve seen:

Translated:
“Under single-core load in Cinebench, the M5 this time averages 7.6 watts of consumption for 199 points, compared to a previous average of 5.5 watts for 172 points for the M4. Normalized to the same score, that works out to roughly 20 percent higher consumption.” If performance scaled linearly with power, that would indeed hold, though I’d imagine that’s overly simplistic.
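For what it’s worth, the arithmetic behind that ~20 percent figure works out like this (a quick sketch using only the wattages and scores quoted above; it assumes those averages are directly comparable):

```python
# Cinebench 1T averages quoted by ComputerBase
m5_watts, m5_score = 7.6, 199
m4_watts, m4_score = 5.5, 172

# Power cost per point for each chip
m5_w_per_point = m5_watts / m5_score   # ~0.0382 W/point
m4_w_per_point = m4_watts / m4_score   # ~0.0320 W/point

# M5 draws roughly 19-20% more power per point than the M4
extra = m5_w_per_point / m4_w_per_point - 1
print(f"Extra power per point: {extra:.1%}")   # ~19.4%
```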

Which is interesting, because the A19 Pro doesn’t exhibit that same relationship compared to its predecessor. Scaled linearly, it would be roughly the same score at the same power draw. Different benchmark, obviously, but Geekbench and Cinebench are pretty well correlated for single-core performance. The linearity caveat still applies.
[attached benchmark chart]
But even with this downgrade in efficiency, it’s in a completely different ballpark.
 

mikegg

Platinum Member
Jan 30, 2010
2,091
633
136
Yeah, but Qualcomm and even Surface wouldn't exist without Apple doing what they are doing. If not for Apple, the industry would still look like it did in 2010. Apple is what is creating the pressure for that shift to happen. It's Microsoft breaking from the old industry model, which AMD/Intel are central to, and doing what Apple is doing. Yes, directly the threat is Qualcomm, but Apple is why that relationship even exists, and so long as Apple keeps doing what they're doing, Microsoft is almost certainly going to invest more in that relationship than they will in x86.

Remember, the equivalent to AMD's win with x86-64 was the A7, which was only 12 years ago. Apple went from the first 64-bit ARM processor (in a phone, no less) to dumping Intel in 8 years, which nobody thought was even realistic. We're now about to hit year 6 of the Apple Silicon era. I would hazard to say that Apple is the one in the driver's seat for all mobile/desktop silicon right now, with players like AMD/Intel/Qualcomm/Samsung chasing.
Not disagreeing that Apple was the one who started this.

Qualcomm is the biggest threat to AMD and Intel in the laptop world. Apple is second. Perhaps Nvidia/Mediatek will join and become a big threat as well.

Either way, if AMD and Intel don't get drastically more efficient at an insane rate, they're toast long term.

AMD might just switch to Arm on the client side. Rumors are that they're making an Arm SoC.
 

poke01

Diamond Member
Mar 8, 2022
4,749
6,082
106
Your thoughts on the M4 Air vs your experience with Intel/AMD laptops?

There's no other competent fanless laptop. It's good, and the battery lasts a couple of days. The display controller is very good; it's something you notice instantly as soon as you plug into an external monitor. No flickering, it displays instantly.

macOS is meh though
 

Doug S

Diamond Member
Feb 8, 2020
3,791
6,717
136
Will note that not competing effectively against Apple is why Microsoft created the Surface line, and why they've now got Qualcomm making chips. I keep noting that Microsoft had a choice: they've contracted with AMD to make Xbox silicon, but for their PC silicon they not only went to Qualcomm, they've committed massive resources to ARM Windows parity. Why? Why on earth would they do that if AMD could just tweak the formula and deliver an equivalent x86 product? Either AMD can't do that, or AMD won't do that on the terms that Microsoft wants (cost, etc.). But what matters is that Microsoft doesn't have that deal with AMD, and does have one with Qualcomm. And AMD may not care about MBPs, but I'm pretty sure they do care about X2 Elite and X2 Elite Extreme laptops carrying the Microsoft logo, and that product only exists because the MBP does, because Intel/AMD/Dell/HP couldn't get their sh!t together to make a competing product. That Microsoft logo is something that AMD takes seriously, because they can't compete against that.

Microsoft created Surface because they'd tried and failed to make tablets a thing multiple times since the early 90s (Windows for Pen Computing) and were determined to make Windows tablets a success no matter how many times they had to fall on their face. They were so all-in on Surface that they turned Windows 8 into a touch-first abomination of an OS, only to eventually realize their mistake and return to sanity with Windows 10. I don't think it was about competing with Apple; it was about Microsoft's leadership being pissed/jealous that Jobs figured out tablets on his first try when Microsoft had been failing for two decades. It says a lot that they effectively conceded failure in the tablet market once again and turned Surface into a laptop you can use as a tablet if you really want to (but one that was almost universally bought to be used as a laptop).

Qualcomm making chips is about Qualcomm wanting to expand its market since the smartphone market was saturated by the late 2010s, so they needed to look elsewhere for growth. Qualcomm's only real competition is Intel and AMD. People aren't saying "I want an ARM laptop, I need to decide between Apple and Qualcomm". If Qualcomm's market share goes from 1% to 2% they aren't taking that 1% from Apple, they're taking it from Intel/AMD.

It is in Microsoft's interest to support something other than x86. More competition drives down the price of hardware, which increases Microsoft's bottom line. It is as simple as that.

Apple may be out there showing Qualcomm, Microsoft, and the PC OEMs what's possible and giving them something to aim at. But it isn't really anyone's competition; it is all a zero-sum game within the Windows/PC ecosystem, with Intel, AMD, Microsoft, and Qualcomm all making moves they believe best position them to take a bigger chunk of that PC ecosystem revenue. Apple is simply a bystander, mostly competing against itself. Sure, they'd like to pull in some new blood from the PC world, but that's pretty hard to do, and these days they mostly rely on Microsoft's self-owns (like Windows 10 expiring and Windows 11 not being compatible with lots of hardware still happily running Windows 10) to piss off Windows users enough to get them to consider a Mac.
 

NTMBK

Lifer
Nov 14, 2011
10,519
6,028
136
No, lol.
AMD is taping together halo APUs because they have no chance in dGFX laptops against NV.
dGFX laptops are inherently an inefficient design. You've got duplicated GPUs (iGPU and dGPU), duplicated and partly wasted memory pools, and the overhead of high-speed I/O between the CPU and GPU. A single big, fat iGPU that can efficiently clock down when not under heavy load is always going to use less board space and produce less overall heat, both ideal for a laptop. It's amazing to me that it's taken AMD this long to take it seriously; it's clearly the right product for the problem space.
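To make the accounting concrete, here's a rough sketch of where the fixed overheads land. The numbers below are purely hypothetical placeholders for illustration, not measurements of any real laptop; the point is the structure of the overhead, not the values:

```python
# Purely hypothetical placeholder values (watts) -- illustrating the accounting
# only, not measurements of any real system.
dgpu_laptop_fixed_overhead = {
    "iGPU still present for display/switching": 0.3,
    "idle dGPU silicon + GDDR refresh": 2.0,
    "PCIe link between CPU and dGPU": 1.0,
}
big_igpu_laptop_fixed_overhead = {
    "single large iGPU clocked down": 0.5,
}

print(sum(dgpu_laptop_fixed_overhead.values()), "W fixed overhead (hypothetical dGPU design)")
print(sum(big_igpu_laptop_fixed_overhead.values()), "W fixed overhead (hypothetical big-iGPU design)")
```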
 

adroc_thurston

Diamond Member
Jul 2, 2023
8,217
10,950
106
dGFX laptops are inherently an inefficient design
They're alright.
A single big fat iGPU that can efficiently clock down when not under heavy load is always going to use less board space and produce less overall heat, both ideal for a laptop
Oh no, nothing about the big fat iGP is free or inherently efficient.
It's amazing to me that it's taken AMD this long to take it seriously, it's clearly the right product for the problem space.
APUs are expensive and inflexible.
I think they always knew. AMD is now in a position (i.e. financially stable) to do what they always wanted to do and deliver it to the PC market.
No, they just turned their lower end dGFX into APU tiles because NV evicted them from mobile dGFX forever.
 

Meteor Late

Senior member
Dec 15, 2023
347
382
106
Correct me if I'm wrong, but doesn't the use of a dGPU in a laptop result in a huge idle or near-idle power consumption issue, at least when the dGPU is NOT disabled? What I mean is: if I want to use the dGPU lightly for some reason (codec support, etc.), isn't the dGPU going to consume a lot more power than the iGPU doing the same thing?
 

NTMBK

Lifer
Nov 14, 2011
10,519
6,028
136
They're alright.

Oh no, nothing about the big fat iGP is free or inherently efficient.

APUs are expensive and inflexible.

No, they just turned their lower end dGFX into APU tiles because NV evicted them from mobile dGFX forever.
I never said it was free, just that it avoids the specific overheads I mentioned in my post (IO power consumption, duplicated memory power consumption and cost, board area).
 

adroc_thurston

Diamond Member
Jul 2, 2023
8,217
10,950
106
I never said it was free, just that it avoids the specific overheads I mentioned in my post (IO power consumption, duplicated memory power consumption and cost, board area).
Those cost/power additions are incremental to the point of irrelevancy.
dGPUs also give you config flexibility.
 

poke01

Diamond Member
Mar 8, 2022
4,749
6,082
106
Does anyone here have a Strix Point laptop? What's the package power when running Cinebench 1T?
 

Jan Olšan

Senior member
Jan 12, 2017
619
1,254
136
I think they always knew. AMD is now in a position (i.e. financially stable) to do what they always wanted to do and deliver it to the PC market.
They tried to push strong iGPUs as far back as the first decade of this century.
But laptop makers kept ignoring them. They kept putting Nvidia MX trash together with both AMD and Intel chips in the highest iGPU tiers, whose entire point was to enable dGPU-less notebooks. Dunno if it was Nvidia meddling or just the plain old rule of "shit happens".

Fast forward 15-20 years and people shout at AMD: why don't you make fast iGPUs!? Isn't it obvious to everybody that they are a good idea? (Meanwhile, notebook vendors still ignore them.)
 

Jan Olšan

Senior member
Jan 12, 2017
619
1,254
136
I’m not inventing anything.
[four attached screenshots]
And fair enough... you want to use that 34W value with that HX 370 entry. It scores 116, so the M5's ST score is roughly 72% higher, and the HX 370 takes over 2x the power draw to get there.

Let’s take a better case, different source:
[attached screenshot]
OK, that's better: 20W sustained. It scores 112 in ST.

Now, maybe notebookcheck is measuring peak power draw; I think that could be the case. Though the M5 peaks, I think, at 20W.

But any way you slice it, what he's saying is not realistic. And to say that I'm making something up is an unfair statement. AMD is far behind; I think you guys just have trouble accepting it.

Wasn't the ST power of Strix Point 22W?
This summer I had a high-end Arrow Lake laptop in hand that was equipped with a GeForce RTX 5090 mobile. Even though GPU switching was supposedly going on, that GPU kept eating something like 15W permanently when the laptop was idling, doing light work, or just showing the desktop (you can imagine the temps and battery life), instead of the 0W you would expect based on the marketing. I hope that was due to a broken configuration and is not the usual case. But a 45W ST load power for Strix Point feels like a similar case; I would double-check that there isn't a GPU or another component sucking power there. Or extra cores beyond the one being active...
 

poke01

Diamond Member
Mar 8, 2022
4,749
6,082
106
Wasn't the ST power of Strix Point 22W?
This summer I had a high-end Arrow Lake laptop in hand that was equipped with a GeForce RTX 5090 mobile. Even though GPU switching was supposedly going on, that GPU kept eating something like 15W permanently when the laptop was idling, doing light work, or just showing the desktop (you can imagine the temps and battery life), instead of the 0W you would expect based on the marketing. I hope that was due to a broken configuration and is not the usual case. But a 45W ST load power for Strix Point feels like a similar case; I would double-check that there isn't a GPU or another component sucking power there. Or extra cores beyond the one being active...
Keep in mind that notebookcheck measures wall power.
 

poke01

Diamond Member
Mar 8, 2022
4,749
6,082
106
[attached benchmark chart]

Not bad here, since this was tested on a fanless M4. AMD did well too, considering it was an HX 350.

Kinda expected better from the 285H, but it was likely running at 50 watts.
 

DavidC1

Platinum Member
Dec 29, 2023
2,096
3,219
106
Honestly, Apple Silicon feels even better than the benchmarks would suggest, I think mainly due to the degree of integration/optimization Apple has been able to do with their own silicon. I jumped from a top-of-the-line i9 MBP to an M1 Max and yeah, the Max benched faster, but the machine felt 5x faster. I attribute a LOT of that to Apple being able to dump virtually all of the system tasks on the E cores, freeing up the P cores completely. So even though the i9 wasn't that much slower than the M1 in benchmarks, it had this tendency to throttle back to a single core on which everything was trying to run. There is no scenario where an AS Mac isn't running the E cores at full speed, so worst case the system is still responsive even if my frontmost app is lagging.
Half of the problem is Microsoft, which is why Valve is putting all their might into moving away from them.

The battery life difference between SteamOS and Windows is astounding, because OS differences shouldn't be that large. This is why I say Microsoft is the perfect partner for Intel: they are the software side of sucking to match Intel's hardware side of sucking. Actually, compared to ARM vendors, AMD is not much better than Intel either. And with telemetry and spyware bundled in things like the Management Engine and AMD PSP, they are going from sucking to the realm of evil.
Blowing x86 apart is incredibly risky for whichever party does it, which is why no party does it - that's the lesson of Itanium - someone is going to make a big bet and lose. Apple doesn't have to worry about someone undercutting their decision - they only have to worry about their own ability to execute.
Why do we have to worry about what Intel/AMD thinks? That is WHY we have this problem in the first place. It's the fault of the lawyers and the bribery they call lobbying in America. They always sided with Intel. They should have ignored Intel and just opened up x86. And since you are having a hard time comprehending, I will make it clear: "they" meaning the courts.

If Transmeta did what they did while running a translation layer, I wonder how much better they would have done without one? At RWT there was a comparison of 90nm Transmeta with 45nm Atom, and the perf/W was similar. Yes, the in-order Atom was a particularly sucky chip, but two full generations of process technology yielding similar results is a huge thing, especially since the Transmeta chip was years earlier. We saw from just having a third vendor, through Cyrix, how nimble a team could be: a third of the development time with 50 core engineers. x86 isn't bloated so much as the companies running it are. ARM has true competition, because anyone can come in and make a chip. x86 is almost as bad as a monopoly. They are a duopoly.
Wasn't the ST power of Strix Point 22W?
This summer I had a high-end Arrow Lake laptop in hand that was equipped with a GeForce RTX 5090 mobile. Even though GPU switching was supposedly going on, that GPU kept eating something like 15W permanently when the laptop was idling, doing light work, or just showing the desktop (you can imagine the temps and battery life), instead of the 0W you would expect based on the marketing. I hope that was due to a broken configuration and is not the usual case. But a 45W ST load power for Strix Point feels like a similar case; I would double-check that there isn't a GPU or another component sucking power there. Or extra cores beyond the one being active...
That system is indeed using 45W, but it's not the chip using 45W; it's the system, i.e. the entire laptop. That particular system is inefficient. Other Strix systems use a little over 30W, which is still phenomenally higher than the MacBook at 15W, meaning the SoC power difference is likely 3x or more.
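As a rough way to see how far apart the packages themselves might be once you strip the rest of the laptop out of the wall reading, here's a back-of-the-envelope sketch. The idle baseline and PSU efficiency below are assumptions I'm plugging in for illustration, not numbers from the thread:

```python
# Back-of-the-envelope: subtract an assumed idle wall draw and a PSU/charger
# efficiency loss from the loaded wall reading to approximate package power.
# wall_idle_w and psu_efficiency are assumed placeholder values.
def approx_package_power(wall_load_w, wall_idle_w=7.0, psu_efficiency=0.9):
    return (wall_load_w - wall_idle_w) * psu_efficiency

for wall_w in (45, 30, 15):   # the wall readings discussed above
    print(f"{wall_w} W at the wall -> ~{approx_package_power(wall_w):.0f} W at the package")
```

Under those assumptions, the 30-45W Strix readings land at roughly 3-5x the MacBook's package power, which is consistent with the "3x or more" estimate above.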
 