[Q] ARM vs x86 in consumer space in 10 years


johnsonwax

Senior member
Jun 27, 2024
392
589
96
Hmm...I wonder how they could improve that number?

*hint* Windows on ARM and first party Linux driver support *hint*
Apple's been improving that number by undermining desktop sales. Last I saw desktop PCs had fallen below 30% of all PC sales to ~75 million units globally - compare that to Apple's 52 million iPads, and I'd bet anything that Apple's profits off of those 52 million iPads exceeds the aggregate profits off of those ~75 million desktops by quite a wide margin. So to start, why on earth would Apple want to grab a growing share of a declining market? Never try and catch a falling knife. The upside to the desktop segment is gaming PC sales seem to be holding, and they're high ASP. Enterprise PC sales seem to be shrinking slowly as their lifecycle expands and casual consumer PC sales are collapsing to mobile, helped in part by the enshittification of the web.

You guys keep acting like this is a healthy market, and a market worth saving. I know you like it, but that's what it's going to collapse to - PC master race gamers and whatever enterprise is still willing to spend money on. Hobbyists are going to have to find their own way - as they do in every such market, car enthusiasts, etc. I started as a hobbyist and we were the whole damn market back in the 70s. That's how this goes - market grows, hobbyist share shrinks and they get squeezed out. You can wish for first party linux support all you want, but these are hundred billion dollar revenue markets and the first party linux driver support market is some fraction of 1% of this. Capitalism calls that a rounding error and ignores it. I know it's hard to hear that the market doesn't care about you, but the market doesn't care about you. I'd start throwing money at Framework and asking them for ARM - they're literally the only vendor that cares about hobbyists right now.

So I'll repeat a little story. Back in the mid-aughts, tech analysts were pessimistic about Apple's story, because they could look at the size of total consumer tech spending on consumer electronics and couldn't square that with the projections of iPod sales. They had more faith in the consumer electronic spending number than the iPod number. And then every year Apple not only hit but exceeded their iPod numbers and blew up that consumer electronic spending number. The analysts couldn't understand it until the apparel market analysts chimed in - their market was collapsing. Teens weren't asking for clothes, which had been the primary way that young people expressed identity to their peers, and now were asking for iPods. We had segmented these markets around the nature of the good, rather than around the job to be done. To teens, they did the same job - what jeans you wore used to say something about you, but what earbuds you wore now did. Apple siphoned revenue out of the apparel market to such a degree that retailers went under. That was in the blind spot of the tech analysts and the enthusiasts like us.

What I'm getting at is that you see Apple's job to be done as increasing desktop marketshare. Apple doesn't see that at all. They see the job to be done as shifting money out of the consumer medical device market, or the entertainment services market. The desktop PC market is minuscule and low margin. Apple's desktop PC efforts are there to preserve legacy customers, not to capture you or Ford's enterprise contract. Apple needs to hold onto developers and content creators, and their offerings reflect that.

You seem to think you are representative of some important market for Apple to chase. Go look at the front page of this now defunct website. That market is not important. That has already been determined.
 

Doug S

Diamond Member
Feb 8, 2020
3,585
6,335
136
Hmm...I wonder how they could improve that number?

*hint* Windows on ARM and first party Linux driver support *hint*

Or they could keep selling the same number of desktop Macs while the global PC desktop market continues to shrink, which is probably their best case outcome. If that shrinks by 33% their 2% is now 3%.

Why should Apple care about dying markets? Might as well suggest adding cablecard support to Apple TV to let them compete for that dying market.
 

NTMBK

Lifer
Nov 14, 2011
10,444
5,813
136
Or they could keep selling the same number of desktop Macs while the global PC desktop market continues to shrink, which is probably their best case outcome. If that shrinks by 33% their 2% is now 3%.

Why should Apple care about dying markets? Might as well suggest adding cablecard support to Apple TV to let them compete for that dying market.
We need a Mac Mini with an ATi All In Wonder, they'll sell tens of units!
 
Jul 27, 2020
28,110
19,175
146
So I'll repeat a little story. Back in the mid-aughts, tech analysts were pessimistic about Apple's story, because they could look at the size of total consumer tech spending on consumer electronics and couldn't square that with the projections of iPod sales. They had more faith in the consumer electronic spending number than the iPod number. And then every year Apple not only hit but exceeded their iPod numbers and blew up that consumer electronic spending number. The analysts couldn't understand it until the apparel market analysts chimed in - their market was collapsing. Teens weren't asking for clothes, which had been the primary way that young people expressed identity to their peers, and now were asking for iPods. We had segmented these markets around the nature of the good, rather than around the job to be done. To teens, they did the same job - what jeans you wore used to say something about you, but what earbuds you wore now did. Apple siphoned revenue out of the apparel market to such a degree that retailers went under. That was in the blind spot of the tech analysts and the enthusiasts like us.
Thanks! So it's the dang teens shaping markets. Need to raise better teens, folks! Buy them old used spare parts to assemble as kids and make this market grow!
 

Jan Olšan

Senior member
Jan 12, 2017
575
1,135
136
ARM is clearly dominant in mobile and that's where the consumer space is headed. x86 still totally dominates server and datacenter markets, but with Qualcomm launching 24 core server SoCs with similar i/o capabilities I think we will see a shift to ARM in that space too.

If intel sorts out its 14nm issues and somehow makes a comeback at 10nm (which is looking less and less likely every year) there is a possibility that they will retain there dominant position in the server space. The problem is most news is pointing to TSMC launching 10nm in 2H2016/1H 2017 while intel will still be launching Kabylake and giving the same core counts and same performance as they did before, plus 2.5-5% maybe. With the huge jumps in performance we see in A9X and Exynos 7420/8890 there is reason to believe that Qualcomm or another ARM vendor could continue to improve performance at that cadence, which mathematically makes it impossible for intel to keep up. Intel isn't a company that can pivot quickly and start making something different so they will likely just continue paying off vendors and using x86 lock in to suck revenue out of their server and datacenter customer. The only people who still buy x86 for consumer crap today are people who either need extreme performance to play games or people who are just too cheap to buy Apple. That doesn't look good for the future.
I like this comment. Back in those years we thought 14nm was the problematic Intel node. Heh.

Oh they (Intel) can. Specially if companies like Apple joins their foundry wagon. Then ARM(besides Apple) get stuck on much higher nodes. because TSMC/Samsung cant afford lower.

Its all a cash and revenue game.
Italics is my note. Hilarious how backwards this opinion ended up being, in hindsight.
 

Cogman

Lifer
Sep 19, 2000
10,286
145
106
It's interesting that nobody called out the biggest ARM problem for consumer laptops and desktops (that I saw), and that's the craptastic long-term driver support from ARM manufacturers. I can still run a GPU/x86 CPU from that era and basically all the accompanying hardware with pretty decent support in Linux.

Yet all the ARM stuff is basically e-waste at this point. You'd be stuck on a specially patched 3.x version of the kernel.

It wouldn't even take much to fix this problem, just upstreaming drivers. The one exception from the period is the Raspberry Pi.
 

poke01

Diamond Member
Mar 8, 2022
4,245
5,590
106
It's interesting that nobody called out the biggest ARM problem for consumer laptops and desktops (that I saw), and that's the craptastic long-term driver support from ARM manufacturers. I can still run a GPU/x86 CPU from that era and basically all the accompanying hardware with pretty decent support in Linux.
LTS also sucks on Windows with ARM CPUs, and don't even get me started on Qualcomm's drivers on Windows. ARM on non-Apple platforms is useless.
 

Jan Olšan

Senior member
Jan 12, 2017
575
1,135
136
It's interesting that nobody called out the biggest ARM problem for consumer laptops and desktops (that I saw), and that's the craptastic long-term driver support from ARM manufacturers. I can still run a GPU/x86 CPU from that era and basically all the accompanying hardware with pretty decent support in Linux.

Yet all the ARM stuff is basically e-waste at this point. You'd be stuck on a specially patched 3.x version of the kernel.

It wouldn't even take much to fix this problem, just upstreaming drivers. The one exception from the period is the Raspberry Pi.
I think that pitfall of ARM was actually quoted pretty commonly all the time, it just didn't appear in this thread?

Upstreaming doesn't protect you from lack of maintenance leading to eventual breakage of your old drivers, though. Drivers and device support get tossed out of the kernel constantly for this reason. The ceaseless version churn of the Linux ecosystem (with the related but also orthogonal fragmentation issue) adds to the problem too.

IMHO the Windows model, where drivers have a stable API/ABI and can be installed independently of kernel version and build, ends up being superior. Not just somewhat better, but the one I would want to see everywhere. This compatibility model helps with application software too, obviously (native Linux game ports, anyone?).
 

johnsonwax

Senior member
Jun 27, 2024
392
589
96
Thanks! So it's the dang teens shaping markets. Need to raise better teens, folks! Buy them old used spare parts to assemble as kids and make this market grow!
I think it's telling that I relay a parable about how Apple looks outside of existing markets for growth and you take it as a reason to complain about teenagers.
 

Cogman

Lifer
Sep 19, 2000
10,286
145
106
I think that pitfall of ARM was actually quoted pretty commonly all the time, it just didn't appear in this thread?

Upstreaming doesn't protect you from lack of maintenance leading to eventual breakage of your old drivers, though. Drivers and device support get tossed out of the kernel constantly for this reason. The ceaseless version churn of the Linux ecosystem (with the related but also orthogonal fragmentation issue) adds to the problem too.

IMHO the Windows model, where drivers have a stable API/ABI and can be installed independently of kernel version and build, ends up being superior. Not just somewhat better, but the one I would want to see everywhere. This compatibility model helps with application software too, obviously (native Linux game ports, anyone?).
I agree in terms of the Windows ABI being a lot easier to deal with.

But I'd say, so long as a driver isn't doing something completely ridiculous, it's very likely to remain in the kernel. In fact, the upstream process helps to make sure your drivers can be easily maintained. It's why today's kernel still has support for things like the PS/2 port or serial ports.

The reason these companies don't upstream, even when they open source their drivers, is because instead of developing a high quality driver they throw together crap just functional enough to work.
 
Jul 27, 2020
28,110
19,175
146
you take it as a reason to complain about teenagers.
It's true. The new generation thinks the old one is dumb but they haven't done anything particularly amazing so far. Non-upgradable Apple hardware destined for landfills is nothing to be proud of and they voted with their money to make Apple think they did the right thing.
 

johnsonwax

Senior member
Jun 27, 2024
392
589
96
It's true. The new generation thinks the old one is dumb but they haven't done anything particularly amazing so far. Non-upgradable Apple hardware destined for landfills is nothing to be proud of and they voted with their money to make Apple think they did the right thing.
Every generation thinks the old one is dumb, and every generation thinks the next one is lazy and entitled. My grandfather assembled his own TV and thinks you're a fool for buying one preassembled. I had a coworker who bitched the day he could no longer buy a printer he could debug with a serial cable. I'm an old enough coder to be in the 'if you want it done right, write it in assembler' cohort. There's nothing new here.

And you're still running grievances rather than taking the lesson. The hardware companies are going to expand the market because that's where the real money is, and they're going to leave you behind. That's what they always do. It's not the customers' fault they do that - that's how markets work.
 

Thunder 57

Diamond Member
Aug 19, 2007
4,035
6,750
136
Every generation thinks the old one is dumb, and every generation thinks the next one is lazy and entitled. My grandfather assembled his own TV and thinks you're a fool for buying one preassembled. I had a coworker who bitched the day he could no longer buy a printer he could debug with a serial cable. I'm an old enough coder to be in the 'if you want it done right, write it in assembler' cohort. There's nothing new here.

And you're still running grievances rather than taking the lesson. The hardware companies are going to expand the market because that's where the real money is, and they're going to leave you behind. That's what they always do. It's not the customers' fault they do that - that's how markets work.

IIRC parts of Wolfenstein 3D and Doom were written in assembly because the hardware just wasn't powerful enough even if it was written in C. It certainly helped that John Carmack is a wizard.
 
  • Like
Reactions: Tlh97 and 511

Doug S

Diamond Member
Feb 8, 2020
3,585
6,335
136
IIRC parts of Wolfenstein 3D and Doom were written in assembly because the hardware just wasn't powerful enough even if it was written in C. It certainly helped that John Carmack is a wizard.

Compilers were far less capable then. If he had access to a compiler as good as today's the spots where he could gain benefit from assembly would be few and far between, and I think he'd be the first to admit that.

The only good case for hand coding today basically comes down to stuff like SIMD since even modern compilers have problems extracting and maximizing the benefit of instruction level parallelism. Almost no one is hand coding sequences of ordinary instructions because it is almost impossible to consistently beat the compiler on any sequence longer than fits in a 24x80 window. It isn't worth tweaking even in a tight inner loop to save one or two cycles - saving a cycle in an inner loop run a billion times and you've saved less than a quarter second at modern CPU clock rates lol

Instead of wasting time writing stuff in assembly (other than the aforementioned SIMD) you'll get far more bang for the buck in algorithm redesign. Not necessarily a complete refactor/rewrite, but hints that are derived from a human's high level understanding of what is going on. For example, using extensions like __builtin_prefetch if you know the data access patterns can result in an order of magnitude improvement in some cases without needing to touch or even understand assembly code. Smaller stuff such as marking branches as likely or unlikely, that sort of thing.
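To make that concrete, here's a rough illustrative sketch in C of the kind of hints being described. __builtin_prefetch and __builtin_expect are real GCC/Clang extensions, but the linked-list walk, the struct, and the function name are made up for the example:

#include <stddef.h>

struct node {
    struct node *next;
    double value;
};

/* Sum a linked list, prefetching the next node while the current one is
   being processed, and marking the negative-value branch as unlikely. */
double sum_list(const struct node *n)
{
    double total = 0.0;
    while (n != NULL) {
        /* Hint to pull the next node toward the cache before we need it.
           Second arg 0 = read access, third arg 1 = low temporal locality. */
        if (n->next != NULL)
            __builtin_prefetch(n->next, 0, 1);

        if (__builtin_expect(n->value < 0.0, 0)) {
            /* Marked unlikely, so the compiler keeps it off the hot path. */
            total -= n->value;
        } else {
            total += n->value;
        }
        n = n->next;
    }
    return total;
}

Whether hints like these actually win anything depends entirely on the data layout and the target CPU, which is the point - the human knows the access pattern, the compiler doesn't.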
 

Thunder 57

Diamond Member
Aug 19, 2007
4,035
6,750
136
Compilers were far less capable then. If he had access to a compiler as good as today's the spots where he could gain benefit from assembly would be few and far between, and I think he'd be the first to admit that.

The only good case for hand coding today basically comes down to stuff like SIMD since even modern compilers have problems extracting and maximizing the benefit of instruction level parallelism. Almost no one is hand coding sequences of ordinary instructions because it is almost impossible to consistently beat the compiler on any sequence longer than fits in a 24x80 window. It isn't worth tweaking even in a tight inner loop to save one or two cycles - saving a cycle in an inner loop run a billion times and you've saved less than a quarter second at modern CPU clock rates lol

Instead of wasting time writing stuff in assembly (other than the aforementioned SIMD) you'll get far more bang for the buck in algorithm redesign. Not necessarily a complete refactor/rewrite, but hints that are derived from a human's high level understanding of what is going on. For example, using extensions like __builtin_prefetch if you know the data access patterns can result in an order of magnitude improvement in some cases without needing to touch or even understand assembly code. Smaller stuff such as marking branches as likely or unlikely, that sort of thing.

No doubt compilers today do a better job than even a good human the vast majority of the time. I was speaking to the tools and limitations that existed in 1992-1994. What was funny is that after I posted this I did some searching and found people asking why Doom wasn't written in C++ "like most games are".
 

johnsonwax

Senior member
Jun 27, 2024
392
589
96
We still teach asm programming for embedded development and silicon design. There are lots of places where you have pretty rudimentary compilers, and of course if you're designing your own silicon, bootstrapping your own compiler, or working in an environment with very limited RAM or really important hardware security, knowing exactly what the code is doing becomes important. Writing something in C can usually get you there, but it still has a lot of memory overhead that hand-coded asm won't have, and if you only have 1K of RAM to work with, that can be important. Asm also tends to get used a lot in real-time stuff because you can literally count cycles in the code - you don't have to guess what the compiler will do. Also, if you are reverse-engineering something, you need to know asm.

It's not like we don't still have a lot of environments with limitations similar to the 90s. (I did my last proper asm project (68K) in that time frame.) Last I checked assembly is still about as commonly used as Rust and PHP, just not on PCs. I think RTKit is mostly programmed in asm and C (the realtime OS inside iPhones that handles the radios, etc.) Generally you don't find it taught much in computer science programs, but it's pretty key in computer engineering and some EE programs. Generally we start with a high level language like python to teach CS concepts, and then go to asm and C to get down to what's happening down at the metal layer, and then teach digital logic and VHDL so you can start designing the metal. There'll be some hardware/software codesign course or two in there. There'll be a realtime course, a parallel and distributed course getting into HPC concepts, maybe a GPU/NPU design course now, etc. Assembly will show up throughout all of that, because that's how you see how the hardware works.
 

Thunder 57

Diamond Member
Aug 19, 2007
4,035
6,750
136
We still teach asm programming for embedded development and silicon design. There are lots of places where you have pretty rudimentary compilers, and of course if you're designing your own silicon, bootstrapping your own compiler, or working in an environment with very limited RAM or really important hardware security, knowing exactly what the code is doing becomes important. Writing something in C can usually get you there, but it still has a lot of memory overhead that hand-coded asm won't have, and if you only have 1K of RAM to work with, that can be important. Asm also tends to get used a lot in real-time stuff because you can literally count cycles in the code - you don't have to guess what the compiler will do. Also, if you are reverse-engineering something, you need to know asm.

It's not like we don't still have a lot of environments with limitations similar to the 90s. (I did my last proper asm project (68K) in that time frame.) Last I checked assembly is still about as commonly used as Rust and PHP, just not on PCs. I think RTKit is mostly programmed in asm and C (the realtime OS inside iPhones that handles the radios, etc.) Generally you don't find it taught much in computer science programs, but it's pretty key in computer engineering and some EE programs. Generally we start with a high level language like python to teach CS concepts, and then go to asm and C to get down to what's happening down at the metal layer, and then teach digital logic and VHDL so you can start designing the metal. There'll be some hardware/software codesign course or two in there. There'll be a realtime course, a parallel and distributed course getting into HPC concepts, maybe a GPU/NPU design course now, etc. Assembly will show up throughout all of that, because that's how you see how the hardware works.

I don't think assembly is taught much outside of specific cases. Hell, looking at some CS courses at some state universities, they don't even bother with C/C++. They start with Java. The higher-tier state schools still taught C/C++, and one of my courses included assembly. Had I gone the engineering route I'm sure I would've gotten more in depth with it.
 
  • Like
Reactions: 511

johnsonwax

Senior member
Jun 27, 2024
392
589
96
I don't think assembly is taught much outside of specific cases. Hell, looking at some CS courses at some state universities, they don't even bother with C/C++. They start with Java. The higher-tier state schools still taught C/C++, and one of my courses included assembly. Had I gone the engineering route I'm sure I would've gotten more in depth with it.
They start with Java because ACM provides a Java model curriculum and the CS AP course is Java based. Back in the mid aughts everyone was convinced Java was going to take over the world and a LOT of intro curricula shifted to Java and only Java. It's a terrible language to use for an intro course, though. But if you're taking transfer students, odds are the students learned Java in the community college so it's held on in the same way that imperial units have.

I helped design a CS curriculum and one of the hills we were willing to die on was not starting with Java. We started with Python because everyone should learn Python (seriously) and the language gets out of your way, plus we could teach procedural, OO, and functional programming with it, and they walked out with a pretty useful scripting/rapid-development language. From there we went to asm/C and did embedded system programming. Then more C/C++. So by the end of their first year they'd seen 4 languages, and by the 2nd year everything was anchored on subject matter and would either lean on those languages or expect students to pick up a new one as needed. We found it pretty hard to teach compiler design without students knowing assembly.

CS programs kind of come in 3 flavors: the old-school algorithm/theory-anchored ones, where none of the courses centered on learning a language and you'd start out learning Lisp or something (they're dying out, but they'll get a resurgence as AI-centric programs with a focus on the math involved); the PC-centric ones that assumed everyone would go out and work for Oracle or Microsoft (these always started with Java); and the more engineering-heavy ones that figured you could learn the PC stuff on your own but not the HPC stuff, compiler design, embedded, etc. What department the program originated from usually tells you which kind of program it is. If it hangs off of engineering (I was in engineering when we built that program) you'll get the last one. If it hangs off the math department, which a lot of early CS programs did, you get the first one. If it comes more broadly out of LA&S, you'll get the middle one. From the engineering side, we figured the PC-centric jobs were the most fragile and fungible, and also the easiest to self-learn. The jobs you didn't see were the ones you needed to train students for - the realtime failsafe systems that keep airliners from falling out of the sky - and we did send more students to Boeing and Airbus than to Microsoft and Oracle. The engineering ones tended to include a decent amount of software engineering as well. The PC-centric ones might, perhaps, encourage you to use version control.
 
  • Like
Reactions: igor_kavinski

Thunder 57

Diamond Member
Aug 19, 2007
4,035
6,750
136
They start with Java because ACM provides a Java model curriculum and the CS AP course is Java based. Back in the mid aughts everyone was convinced Java was going to take over the world and a LOT of intro curricula shifted to Java and only Java. It's a terrible language to use for an intro course, though. But if you're taking transfer students, odds are the students learned Java in the community college so it's held on in the same way that imperial units have.

I helped design a CS curriculum and one of the hills we were willing to die on was not starting with Java. We started with Python because everyone should learn Python (seriously) and the language gets out of your way, plus we could teach procedural, OO, and functional programming with it, and they walked out with a pretty useful scripting/rapid-development language. From there we went to asm/C and did embedded system programming. Then more C/C++. So by the end of their first year they'd seen 4 languages, and by the 2nd year everything was anchored on subject matter and would either lean on those languages or expect students to pick up a new one as needed. We found it pretty hard to teach compiler design without students knowing assembly.

CS programs kind of come in 3 flavors: the old-school algorithm/theory-anchored ones, where none of the courses centered on learning a language and you'd start out learning Lisp or something (they're dying out, but they'll get a resurgence as AI-centric programs with a focus on the math involved); the PC-centric ones that assumed everyone would go out and work for Oracle or Microsoft (these always started with Java); and the more engineering-heavy ones that figured you could learn the PC stuff on your own but not the HPC stuff, compiler design, embedded, etc. What department the program originated from usually tells you which kind of program it is. If it hangs off of engineering (I was in engineering when we built that program) you'll get the last one. If it hangs off the math department, which a lot of early CS programs did, you get the first one. If it comes more broadly out of LA&S, you'll get the middle one. From the engineering side, we figured the PC-centric jobs were the most fragile and fungible, and also the easiest to self-learn. The jobs you didn't see were the ones you needed to train students for - the realtime failsafe systems that keep airliners from falling out of the sky - and we did send more students to Boeing and Airbus than to Microsoft and Oracle. The engineering ones tended to include a decent amount of software engineering as well. The PC-centric ones might, perhaps, encourage you to use version control.

I started CS exactly in the middle of the aughts. The first class was to weed out those who wouldn't make it. Probably a 50% drop rate. Next was OOP, where many more made it. It got a lot more difficult after that, but those who were meant for it made it. There was even a hot girl who asked to study together. How could I resist?
 
  • Haha
Reactions: igor_kavinski

mikegg

Golden Member
Jan 30, 2010
1,976
577
136
However, if you are looking at strictly desktops, desktop Macs represent only around 15% of total Mac sales, meaning global Mac desktop sales are probably around 2% of global PC sales.
I'm surprised it's 15%. I would have guessed 5%.
 

Aeonsim

Junior Member
May 10, 2020
15
42
91
The biggest flaw for ARM at the moment is the whole boot/BIOS/UEFI/DeviceTree situation. For the most part it appears that ARM devices do not have a generic, standardized method of describing the hardware and how to start it up, add devices, and handle other components.

It's not the case that every possible component has a stable driver and you can mix and match various components to build a system. If you look at an x86 laptop/desktop/server, it doesn't really matter who built it or what components they chose when making the device: you can slap Windows, Linux, BSD, or any number of hobby OS projects on it and it will mostly work. The system will boot, and if it's got suitable drivers you'll be able to use most of the features and components. This is not the case with current ARM laptops, Pis, and the various ARM SBC devices (there being a dearth of bigger ARM desktops to test). There is no guarantee that any ARM64 Linux distro or Windows on ARM build will work on your device unless you've specifically double-checked that the OS build has support for your entire device. As a result, many of the ARM SBCs and other systems are stuck on archaic versions of the Linux kernel and are not supported by any Windows on ARM build.

Until they solve that, and drastically improve the driver and cross-compatibility situation, I can't see full-blown ARM desktops ever becoming anything more than prebuilt devices where your OS choice is 100% tied to the device manufacturer's willingness to support a specific OS and their willingness to continue updating the software for that device. Even for ARM servers, it's mostly hyperscalers with their own in-house ARM SoC or CPU running their own ARM Linux distro. I can install a modern Linux distro on a 15+ year old x86 desktop without issues (assuming it has enough RAM), and I expect that much the same will hold in 10 years or so. Anyone willing to bet they could do the same with an ARM workstation?
 

Thunder 57

Diamond Member
Aug 19, 2007
4,035
6,750
136
By request, the hot girl:
As I was replying, my computer died. I suspect it's the PSU, but even under low load this happens at times. Weird. Glad it is Seasonic and under warranty.

I was asked about the hot girl. She is/was from South America, I think Brazil. It was over 10 years ago. I remember her name though, and it appears she lives and works in Tennessee.
 
  • Like
Reactions: NTMBK