Question Is "the writing on the wall" - the end for enthusiasts? [LTT]

VirtualLarry

No Lifer
Aug 25, 2001
56,326
10,034
126

Discussing whether Apple's M1 silicon is the future of computing, and whether the days of the discrete CPU and GPU are numbered.

Maybe this should go in the Apple M series silicon thread, sorry.
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
It really depends on whether there's a market for it in the future. If people will pay for a case, PSU, motherboard, CPU, RAM, and dGPU, then it'll still happen. The M1 is too niche for a lot of people, though. I can't buy one and replace my 2019 PC with it, no matter how good the hardware is. Other companies would have to start making massive consumer APUs like Apple's, capable of running the software I want to run, for all this to happen.
 

coercitiv

Diamond Member
Jan 24, 2014
6,187
11,855
136
To me it seems the video raises a few discussion points that are certainly worth exploring on their own, but the conclusion doesn't really follow. The PC at its core is about modularity. The CPU ISA is just a matter of tradition, and APUs have been proven again and again to have both advantages and disadvantages versus the CPU / dGPU split: they scale differently, and they require different thermal and power profiles. The PC market gave birth to everything from handheld units to 1000W+ workstations. The DIY market allows you to build anything: SFF units, fully passive desktop units, disco-ball RGB-edition skeleton rigs. Storage, cooling, I/O, compute - everything can be scaled. It's an entire ecosystem with so many species that couldn't care less about the CPU ISA or GPU architecture inside of them.

Even the insane power numbers we're seeing from next-gen PCs should be seen in their context - strong competition. The PC may be on a weird path for now, but it's evolving at a staggering pace. I wouldn't underestimate its power to adapt, maybe even overcome.
 

JasonLD

Senior member
Aug 22, 2017
485
445
136
Intel and AMD are both planning big APU designs for hyperscalers starting next year (Nvidia will probably follow suit), and some of those designs will definitely trickle down to consumer products. I like the idea of a single 600W APU with huge unified LPDDR5/6-based memory instead of a CPU+dGPU solution. Not saying it will ever replace the CPU+dGPU combo, but it will be an interesting alternative.
 

Tech Junky

Diamond Member
Jan 27, 2022
3,407
1,142
106
it's evolving at a staggering pace.

I didn't watch the video, but from the comments it seems we're talking about tiled CPU dies. Whether this stems from Apple or is just natural evolution, it's coming from Intel in 2024 with Arrow Lake in the form of chiplets.

Looking at the M1/M2 in terms of size compared to current Intel/AMD parts, it's a monster. A refined chiplet design, though, won't be much bigger than current Intel or the future AMD parts coming this fall. Being able to shuffle components on the die brings more flexibility without having to resize the die footprint with each generation, which might bring some stability to the mobo side.

Being able to add features to the mobo as they come along and just transplant the CPU to the new board might bring interesting innovations to the market. On the flip side, as CPUs advance, being able to just swap them would be convenient too.

There are some obvious upgrades coming in the next couple of years, from TB5 to PCIe 6.0 and WiFi 7. Two of the three are just card options to be added or embedded into new board designs; PCIe, though, would require a board replacement to make use of.

Things aren't like they were 20-30 years ago, when simple upgrades to pieces of the puzzle got you by. In the span of ~10 years TB went from being a niche product with 10Gbps of bandwidth to 40Gbps with USB support. Not that long ago, the 6th-gen TB controller option only did TB and was somewhat slow in comparison. TB4 8000-series peripherals still haven't shown up, but combining both protocols into a single controller is kind of a big deal compared to USB alone.

Doubling bandwidth on the PCIe side twice in the past five years or so is huge for potential uses beyond just the GPU. On the AMD side, with bifurcation enabled, you can slot a card into the board and run four individual x4 drives, or set up a RAID, without the need for a PLX switch.
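
Back-of-the-envelope on what that buys you - a rough sketch, where the per-lane figures are approximate PCIe spec rates (one direction, after encoding overhead) and the rest is just multiplication:

```python
# Rough sketch of the bifurcation math: one x16 slot split into 4 x4
# links, one per NVMe drive, no PLX switch needed.
PCIE_GBPS_PER_LANE = {3.0: 0.985, 4.0: 1.969, 5.0: 3.938}  # GB/s per lane

def bifurcate(gen: float, total_lanes: int = 16, lanes_per_drive: int = 4):
    """Return (drive count, GB/s per drive) for an evenly split slot."""
    drives = total_lanes // lanes_per_drive
    per_drive = lanes_per_drive * PCIE_GBPS_PER_LANE[gen]
    return drives, per_drive

drives, bw = bifurcate(4.0)
print(f"{drives} drives at ~{bw:.1f} GB/s each")  # 4 drives at ~7.9 GB/s each
```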

Being able to beef up the CPU die with, say, an Arc or RTX-class iGPU could be enticing as well. It's going to get... err, already is... interesting to see how things are progressing right now and in the near future, with different approaches becoming options. Refresh cycles for major changes seem to be about 2-3 years at this point (for me at least), and with the pace of things ever increasing, those cycles are collapsing inward considerably.

Now, with the scale of everything today and going forward, increasing the die size to make room for DDR6 could make for some blistering performance - or a molten puddle of metal if done incorrectly. Cooling designs will need rethinking to make sure there's ample heat transfer to keep things in line. The Gen5 M.2 drives being offered and talked about, shipping on boards with active cooling, are an indicator of where things are going with the temperatures these speedy devices produce. Heatsinks on M.2 drives are something we never thought about or saw much of with spinners or SATA SSDs, which says something as well.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
It's an entire ecosystem with so many species that couldn't care less about the CPU ISA or GPU architecture inside of them.

Yup. Who says the PC has to stay x86 forever? I'm old enough to remember when there were plenty of competing architectures around.

If someone is willing to make an ARM PC to the same ATX specification as a normal x86 PC, I don't see why it wouldn't be able to compete. Just look at how much you can use a Pi for. Even MS has an ARM version of Windows (almost) ready. Apple has shown x86-to-ARM translation isn't a problem at all.

Now APUs have undoubted advantages in mobile applications, and that's where the market is currently. So that is where the development money goes.
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,013
136
Yup. Who says the PC has to stay x86 forever? I'm old enough to remember when there were plenty of competing architectures around.

If someone is willing to make an ARM PC to the same ATX specification as a normal x86 PC, I don't see why it wouldn't be able to compete. Just look at how much you can use a Pi for. Even MS has an ARM version of Windows (almost) ready. Apple has shown x86-to-ARM translation isn't a problem at all.

Now APUs have undoubted advantages in mobile applications, and that's where the market is currently. So that is where the development money goes.

I mean, just look at servers. https://www.servethehome.com/hpe-proliant-rl300-gen11-ampere-altra-max-arm-servers-launched/ An HPE server with an ARM CPU looks just like an x86 one. I wouldn't be at all surprised if Qualcomm made desktop chips from the tech they're building for laptops.
 

Glo.

Diamond Member
Apr 25, 2015
5,705
4,549
136
We may be living in a bubble of PC enthusiasts, but the reality for this segment is grim.

DIY is dying. GPU shipments into the DIY channel are trending systematically down over the long run, and OEM computer sales are going up.

From a technological point of view, we are getting into an era in which accelerators bring much higher performance uplifts than scaling more cores on CPUs/GPUs. Products will naturally move in the direction of more accelerators on SoCs.

Secondly, mobile has been outselling desktop for quite some time, and OEM PCs are also outselling the DIY niche.

Is it really hard to imagine SoCs soldered to mobos landing on the desktop market at some point? Powerful SoCs/APUs, soldered down, sold by OEMs as their own products, like the Mac Studio, Mac Mini, etc.

Threadripper APUs, Ryzen APUs. I think the writing is indeed on the wall as to where we are heading. We should embrace this change instead of resisting it.

Especially when we take into perspective the software groundwork being laid by, for example, Microsoft with Windows 11. AI integration, AR/VR integration into our lives - all of this will have to be powered by the most efficient hardware possible. SoCs with unified memory and accelerators will be far more efficient than separate CPUs, GPUs, etc.


And don't think it's impossible to make such APUs scalable.

Is it hard to imagine AMD combining, for example, two monolithic Rembrandt APUs to achieve 16 CPU cores, 1536 ALUs, and quad-channel memory?
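
Back-of-the-envelope, using Rembrandt's published per-die specs (8 Zen 3+ cores, 12 RDNA2 CUs of 64 ALUs each, dual-channel LPDDR5) and naively assuming every resource doubles when you combine two dies:

```python
# Naive scaling sketch: glue two monolithic Rembrandt dies together and
# assume every resource scales linearly with die count.
REMBRANDT = {
    "cpu_cores": 8,        # Zen 3+
    "gpu_alus": 12 * 64,   # 12 RDNA2 CUs x 64 ALUs = 768
    "mem_channels": 2,     # 128-bit / dual-channel LPDDR5
}

def combine(die: dict, count: int = 2) -> dict:
    """Multiply every per-die resource by the number of dies."""
    return {resource: amount * count for resource, amount in die.items()}

print(combine(REMBRANDT))
# {'cpu_cores': 16, 'gpu_alus': 1536, 'mem_channels': 4}
```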
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Secondly, mobile has been outselling desktop for quite some time, and OEM PCs are also outselling the DIY niche.

A lot of OEM towers use plain DIY standards under the hood. That is the purpose of standards: to ensure things work together. And when there are already standards in place, those are what you use for cost reasons, since it'll cost more to come up with your own. If something is lacking, it's a lot cheaper to customise something that already exists.

We should embrace this change instead of resisting it.

I can only answer for myself, but the reason I got into the whole DIY/custom business was being able to get something customised to fit my needs and/or use case. OEMs are very good at providing generic Just Works™ stuff. But if you need something for a specific task, you may find their options limited in utility.

It was never about either cost or convenience. I don't think the customisation angle of being an enthusiast is going anywhere anytime soon, even if the hardware becomes ever more integrated. Remember when you needed a dedicated card just to communicate with an HDD? Or adding a whole controller card for a CD-ROM drive? Those controllers were integrated into the chipset a very long time ago, but they used to be separate. A more recent example would be memory controllers. Those used to be on the chipset, and had -very- different performance characteristics depending on which chipset you used. (I'm looking at you, VIA and SiS...)

Integration and miniaturisation are core trends in the industry. You can see it today with the various accelerators/modems/etc. being integrated into SoCs. They would have been separate chips just a few years ago.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,686
1,221
136
The future of DIY is probably going to shift towards slot-based rather than socket-based.
 

Ajay

Lifer
Jan 8, 2001
15,429
7,849
136
Yup. Who says the PC has to stay x86 forever? I'm old enough to remember when there were plenty of competing architectures around.

If someone is willing to make an ARM PC to the same ATX specification as a normal x86 PC, I don't see why it wouldn't be able to compete. Just look at how much you can use a Pi for. Even MS has an ARM version of Windows (almost) ready. Apple has shown x86-to-ARM translation isn't a problem at all.

Now APUs have undoubted advantages in mobile applications, and that's where the market is currently. So that is where the development money goes.
Well, Apple is already making Arm PCs that are competitive with x86. It's just that they don't sell their CPUs on the open market, because they are critical to the whole Apple ecosystem. Of course, it'll be a while before top games make it to the Mac market, as Apple chose their own Metal API over Vulkan. Anyway, the point being that it is possible for Arm to compete in the PC market - it'll just take a major ARM vendor (only Qualcomm for now) to deliver significantly more powerful SoCs, plus a PCIe interface for GPUs for now. Later on they, or someone else, will build a large SoC with midrange graphics power that will satisfy all but the most hardcore gamers. It's all about having some ARM SoC designer deciding to provide the internal funding, like Apple did, to create a successful high-performance CPU for desktop use. MS is doing what it can to prepare for that possible eventuality.
 

Tech Junky

Diamond Member
Jan 27, 2022
3,407
1,142
106
The future of DIY is probably going to shift towards slot-based rather than socket-based.
I would be down for this, but Intel needs to unlock bifurcation for it to be a true option if all of the slots are x16 for versatility. Having a slot wired x1 or x4 but still consuming 16 lanes is dumb. AMD does bifurcation, but there's a price gap in system building.

IIRC, in standard ATX you can have 7 slots, and if you roll some of the remedial options into the CPU tiles, that gives you plenty to play with for card options. The downside for some would be the loss of slots to double-slot GPUs.

I think some more creative ways of using PCIe are in the works, though, since some of the newer controllers/chips are using PCIe tunneling. An easier solution for Intel would be to just boost the bandwidth of the DMI link and route all PCIe communications through it, dynamically allocating bandwidth as needed.
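
Something like this toy allocator is the idea - purely illustrative, not how DMI actually arbitrates today: one shared uplink, and when requests oversubscribe it, everyone gets scaled back proportionally:

```python
# Toy model of dynamic bandwidth sharing over one uplink. Illustrative
# only: real DMI/PCIe arbitration happens in hardware, not like this.
def allocate(link_gbs: float, requests: dict[str, float]) -> dict[str, float]:
    """Scale requests down proportionally when the link is oversubscribed."""
    total = sum(requests.values())
    if total <= link_gbs:
        return dict(requests)            # everything fits as asked
    scale = link_gbs / total
    return {dev: bw * scale for dev, bw in requests.items()}

# A ~16 GB/s DMI-class uplink shared by an NVMe drive, USB4 and a NIC:
print(allocate(16.0, {"nvme": 14.0, "usb4": 5.0, "nic": 1.0}))
# roughly 11.2, 4.0 and 0.8 GB/s respectively
```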
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Later on they, or someone else, will build a large SoC with midrange graphics power that will satisfy all but the most hardcore gamers.

While not SoCs as such, Intel has been doing that since the GMA900 in 2004. Nobody but "gamers" -needs- more performance than what a basic IGP offers. Putting an image on the monitor is plenty enough for 95% of users.

I'm specifically excluding the previous "Extreme"* Graphics (both 1 and 2) since those were... ahm... challenged in that department. They didn't even get basic GDI+ right either...

*The only thing extreme was how poor those things were... :rolleyes:
 

Ranulf

Platinum Member
Jul 18, 2001
2,348
1,165
136
Watch the video then. I'm addressing the video PoV, not your opinion on how the tech will evolve.

Eh, that's asking a lot when it's Anthony (or anyone) at LTT these days. Anyway, my off-the-cuff take just from the title is: oh no, another sky-is-falling hot take on "PC" computing. Similar to the endless "PC gaming is dead" claims of the past 20 years now. Call me skeptical.
 

pakotlar

Senior member
Aug 22, 2003
731
187
116
The future of DIY is probably going to shift towards slot-based rather than socket-based.


With the Ghost Canyon-esque going absurdo.

Just make it bigger (fan), more powerful (12VHPWR), etc.

I cannot wait for double triple wide double tall deluxe canyon with a 600W+ CPU SoC and 600W+ GPU SoC with a nuclear reactor attached.

I realize you're being sarcastic, but whatever allows us to get large enough coolers to support 600W+, I'm for, as long as it doesn't come at the cost of efficiency per watt. Transistor performance scaling isn't good enough to avoid increased power unless we're willing to go back to 5-10% annual performance increases, which I am not. CPUs following the GPU power-consumption trajectory seems natural.
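
Rough compounding math on that trade-off, with made-up but plausible growth rates: if the performance target compounds faster than perf/W improves, the difference lands on the power budget:

```python
# Toy compounding model: performance target grows faster than perf/W,
# so power absorbs the gap. Growth rates here are illustrative
# assumptions, not measured data.
def watts_needed(years: int, perf_growth: float, eff_growth: float,
                 start_watts: float = 125.0) -> float:
    perf = (1 + perf_growth) ** years   # cumulative performance target
    eff = (1 + eff_growth) ** years     # cumulative perf/W improvement
    return start_watts * perf / eff

# 30%/yr performance target vs. 15%/yr perf/W gains, from a 125W chip:
for years in (2, 5, 8):
    print(years, "years ->", round(watts_needed(years, 0.30, 0.15)), "W")
# prints roughly 160 W, 231 W and 333 W
```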
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
And for those of us who are deaf, even captioning is not always there or easy to read or understand.

Poor captioning is a problem all around. It's just inexcusable.

Don't even get me started on subtitles. Particularly English-to-Danish. No, that's not what that line means or even what was said. Some "translators" should just return straight to a classroom.
 

Tech Junky

Diamond Member
Jan 27, 2022
3,407
1,142
106
While not SoCs as such, Intel has been doing that since the GMA900 in 2004. Nobody but "gamers" -needs- more performance than what a basic IGP offers. Putting an image on the monitor is plenty enough for 95% of users.
I'm finding good performance with Iris, on the laptop version at least. It's much better at handling mundane tasks than the UHD series was on pre-ADL options. UHD forced me to add apps to the graphics settings and specify the dGPU just to avoid stuttering even when scrolling webpages or playing web-based games.

Iris should really solve the 95% crowd's issues compared to prior offerings. It will get interesting as Intel rolls Arc into the CPU/laptop options, boosting output quality significantly. Being realistic, though, it's not going to be as good as a base 3050, but as an in-between step before a dGPU it's better than the plain iGPU/Iris alone.
 

Doug S

Platinum Member
Feb 8, 2020
2,252
3,483
136
Apple may point the way mainstream PC offerings will eventually go, but you only need a small range of "traditional" offerings to cover the enthusiast market. If Intel stopped making socketed i3s and i5s, would anyone here care? The question is: when does the market of people who build (or buy and later upgrade) their PCs become too small for Intel/AMD to bother with?

Back in the day, building your own PC saved you a ton of money over buying a prebuilt system. There were no white-label Chinese PCs to force Dell et al. to reduce their profit margins. CPU performance increased so fast that you could upgrade the CPU a year or two later and get a 50% performance bump. Memory was so tight that adding more later (when the price got more reasonable) could be a huge performance boost on its own. Hard drives were a lot smaller, so they filled up a lot more quickly, especially as file sizes grew once you started collecting first MP3s, then videos.

When that DIY price advantage went away, it really cut the number of people who wanted to build their own, but DIY was still important for those who wanted something prebuilt PCs didn't address - e.g. you could select a motherboard with the features you wanted, then choose the appropriate CPU, cooler, power supply, and case fans if you wanted to build a near-silent system, or were going to overclock as high as possible.

So long as the market of people who want to DIY remains large enough, it will be addressed. The problem is most of the "enthusiast" market is gamers, and mobile gaming revenue is larger than PC and console gaming combined these days. How many gamers who used to build their own system are buying a prebuilt PC they never upgrade? I'll bet a lot of them. How many former PC gamers are now console and/or mobile gamers? That's a much more reasonable option than it used to be, since so many games are limited by what a console can do, and being faster gives only marginal improvement. There needs to be a sufficient number of Gen Z coming up who want to build their own systems, or that option will become so niche it isn't worth it for the market to continue serving it.

Now I'm sure someone is reading this and saying, "I build my own system and I never game; I do it because I want the best possible performance for x, so your claim about gamers is bunk." Sure, that's fine. How many others like you are there? Not "I see people like me on forums like AnandTech all the time", but how many people like you do you know in the real world - and I mean people you met for a reason totally unrelated to PCs or your 'x' hobby that keeps you building your own PCs?