Discussion Quo vadis Apple Macs - Intel, AMD and/or ARM CPUs? ARM it is!

Doug S

Member
Feb 8, 2020
143
172
76
You complain about "rapidly aging" with NVidia and then talk about customized console designs in the same sentence? You realize that console design likely won't change from the launch of the PS5 to its last sale, other than following TSMC's process shrinks to reduce power use and cost, right?

The one time effort to do a customized console design for the PS5, which may sell 50 or 100 million units over its lifetime of 6-9 years, is very different from a yearly or every other year effort to do a customized GPU for Apple for the Mac Pro / iMac Pro which may sell a million or so units a year. At least Apple's price point for the Mac Pro is a lot higher than the PS5 so they can afford to pay AMD more per unit, but still...
 

blckgrffn

Diamond Member
May 1, 2003
6,829
126
106
www.teamjuchems.com
You complain about "rapidly aging" with NVidia and then talk about customized console designs in the same sentence? You realize that console design likely won't change from the launch of the PS5 to its last sale, other than following TSMC's process shrinks to reduce power use and cost, right?

The one time effort to do a customized console design for the PS5, which may sell 50 or 100 million units over its lifetime of 6-9 years, is very different from a yearly or every other year effort to do a customized GPU for Apple for the Mac Pro / iMac Pro which may sell a million or so units a year. At least Apple's price point for the Mac Pro is a lot higher than the PS5 so they can afford to pay AMD more per unit, but still...
I think it probably has more to do with Nintendo looking to do a certain form factor, make money on each unit sold from the outset, etc. One could argue they used an aged design from the outset in the Switch. Then they had the gall to run it at low CPU clock speeds. But that has been their way for the last several generations. I've often wondered what would have been if Nintendo had used sane branding for the WiiU (like Wii 2, for example) and used a Bobcat-based APU that had at least been rumored at the time. It would have been a much more capable mid-generation unit, able to handle Xbox One and PS4 ports far more competently. ¯\_(ツ)_/¯

Sony & MS at least believe they have to compete in a specifications war to drive their sales. It would be interesting if Sony were to take a more conservative hardware approach and simply rely on their exclusives as much as Nintendo has. Would that be enough to cause a mass defection to the other console? I guess they must at least believe it would.

I think it's really likely Sony and MS push to shorter console generation lifetimes (to keep knives at each other's throats), maybe as short as two or three years between refreshes - but maintain backward compatibility for a couple of years, much as MS has promised to keep the OG Xbox One and One S relevant for two more years. PS4 to PS4 Pro/Slim was 3 years.

I think your point is really legit though. I mean, the current AMD cards in Mac Pros seem to be all about giant amounts of high bandwidth memory - the actual GPUs being powerful but not cutting edge. If they stay with that, they could really space out refreshes or launch the next Mac Pro with the same designs they are using and have validated now. I guess then it comes down to how long AMD is willing to make them.

Not sure this adds anything to the discussion, but I spent too long writing this...
 

jpiniero

Diamond Member
Oct 1, 2010
7,789
1,114
126
AMD have provided highly customised designs for their console partners - as Apple, why bother to make such a massive investment to create a whole new high end GPU division when they have a partner doing all that custom work for them?
You could say that about the CPUs.

Apple uses dGPUs mostly for the compute power and not actual gaming performance. I certainly wouldn't underestimate Apple's ability to get a reasonable chiplet GPU strategy working, and that might be good enough for their needs. They can always supplement that with accelerators on the Mac Pro, from AMD or whoever.
 

soresu

Golden Member
Dec 19, 2014
1,179
443
136
I certainly wouldn't underestimate Apple's ability to get a reasonable chiplet GPU strategy working, and that might be good enough for their needs.
To underestimate something you first need a measurement to base an initial estimate on, and right now there is nothing at all to go on.

This is simply pessimism - going with chiplets and the high end at the same time for a GPU is a huge jump from their current mobile semi-custom offerings.

The fact that they actually re-licensed with IMG Tec implies to me that they will continue to do PowerVR semi-custom for mobile/tablet SoCs, and likewise continue a semi-custom relationship with AMD for the high end.

The fact of the matter is that the legal side of the GPU design world is less about ISA licenses and more about specific feature patents from what I have gathered over the years - patents which have largely been filed by AMD, nVidia, (Intel?) or one of the current mobile GPU designers.

The reality of this situation is what truly confuses me about Apple's strange devaluing behaviour towards IMG Tec - why not buy it to achieve patent protection? Especially after they did such a terrific job of devaluing that poor company that it was easy prey to a Chinese-backed business interest.
 
  • Like
Reactions: moinmoin and Tlh97

name99

Member
Sep 11, 2010
159
144
116
AMD have provided highly customised designs for their console partners - as Apple, why bother to make such a massive investment to create a whole new high end GPU division when they have a partner doing all that custom work for them?

Personally I would not be inclined to go off on a tangent with a new design, patents or not - AMD are just getting started with RDNA2 IMHO, given their design strategies are aligning to Zen and passing learning from one project back to the other.
Which part of "Apple Silicon" did you not understand?

Apple isn't going to spend the entire WWDC constantly pushing the idea that this is APPLE SILICON (and in every talk, mentioning specific details like how to optimize for tile rendering, or the advantages of a single pool of RAM) only to then say "just kidding, what we meant was Apple+AMD silicon"...
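(To make the "single pool of RAM" point concrete - here's a minimal Swift sketch using nothing beyond the standard Metal API, not anything specific from those talks: on a unified-memory SoC a shared buffer is written by the CPU and read by the GPU with no staging copy or PCIe transfer in between.)

```swift
import Metal

// Minimal sketch: allocate a buffer that CPU and GPU address directly.
// With unified memory, .storageModeShared means no blit/staging copy.
let device = MTLCreateSystemDefaultDevice()!
let count = 1_000_000
let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// CPU writes land in the same memory a later compute/render pass will read.
let values = buffer.contents().bindMemory(to: Float.self, capacity: count)
for i in 0..<count { values[i] = Float(i) }

print("unified memory: \(device.hasUnifiedMemory)") // true on Apple SoCs
```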

Honestly, this is nothing but a replay of the CPU argument. Remember how that went
"Apple is on track to exceed Intel performance".
"No they aren't"
"Why not? Give me technical reasons."
"Well Intel has experience. And fabs. And x86 magic pixie dust."

Now we are hearing the same arguments based on anything but actual technology nous regarding the GPU, and soon we'll hear yet a third version of it based on how Apple might have a good core but they will not be able to scale it up to a Xeon number of cores because reasons.


(a) Apple does things on a long timetable. Apple is planning for ten years from now. If your worldview extends to three days from now, yes you won't understand what they are doing.

(b) Apple operates on the assumption (largely validated...) that they can do many things better than anyone else.
Part of it is that they have total control of the hardware, the OS, the compilers, the APIs, so they are willing to add functionality to their CPU/GPU/NPU/... that others wouldn't add because "no existing apps will use it, and no-one will pay for it". Look at them adding the U1 to iPhones. Been sitting there mostly unused for a year; but it's part of that long term plan.
Another part of this is that Apple is willing to drop any parts of the past that hold them back or cause problems. And, sure, that creates a little hassle every year. But the payoff is immense, for Apple, developers, and users. Other companies will not accept that short term pain for the long term gain.

So, by Apple having their own GPU, they can drop various AMD functionality that's only there to support some aspect of Direct3D or OpenGL from ten years ago, or that's there so AMD can sell the GPU as an NPU; while replacing it with eg functionality that accelerates AR (ray tracing?, physics engines?)

This is not about "Apple can save money by not having to pay AMD (or Intel)". It's that Apple wants a rocket to the moon, while AMD and Intel think a daring step forward is designing a boat to cross the English Channel.
 

Doug S

Member
Feb 8, 2020
143
172
76
The reality of this situation is what truly confuses me about Apple's strange devaluing behaviour towards IMG Tec - why not buy it to achieve patent protection? Especially after they did such a terrific job of devaluing that poor company that it was easy prey to a Chinese-backed business interest.
The fact that Apple could have bought them but ended up taking a license (for what, we don't really know) means that the price they are paying for that license (added up over however many years they intend to keep it) is a lot less than the purchase price would have been.

They only "devalued that poor company" because the majority of Imagination's business was dependent on Apple. If they had been more successful they would have had more customers and wouldn't have been as badly affected by Apple leaving. Look at Intel: Apple is a single-digit percentage of their overall business, so they will barely notice Apple's absence. It didn't affect Samsung much at all when Apple switched to TSMC's foundry.

Little companies that have all or most of their eggs in one customer's basket are always hanging by a thread and at the mercy of that customer; that's the nature of the relationship when you are a small fry selling to a huge multinational. If you own a big steak restaurant that buys meat from a local butcher and then decide to take your business elsewhere, that little butcher shop will lose a lot of revenue and might go under. Do you think you should have an obligation to keep going to him forever to help him survive? Apple didn't deliberately devalue Imagination any more than you, as the owner of that big steak restaurant, would deliberately make the butcher shop go under.

There are thousands of small companies who would see their yearly sales double or even go up 10x if Apple became their customer. They would be equally at risk if Apple later stopped being their customer, but most of them would still take the deal. It is their choice whether they enter into a deal with Apple, and they know that Apple can switch suppliers or insource what they are selling without warning (beyond contractual guarantees of notice).
 
  • Like
Reactions: beginner99

uzzi38

Senior member
Oct 16, 2019
810
954
96
Which part of "Apple Silicon" did you not understand?

Apple isn't going to spend the entire WWDC constantly pushing the idea that this is APPLE SILICON (and in every talk, mentioning specific details like how to optimize for tile rendering, or the advantages of a single pool of RAM) only to then say "just kidding, what we meant was Apple+AMD silicon"...

Honestly, this is nothing but a replay of the CPU argument. Remember how that went
"Apple is on track to exceed Intel performance".
"No they aren't"
"Why not? Give me technical reasons."
"Well Intel has experience. And fabs. And x86 magic pixie dust."

Now we are hearing the same arguments based on anything but actual technology nous regarding the GPU, and soon we'll hear yet a third version of it based on how Apple might have a good core but they will not be able to scale it up to a Xeon number of cores because reasons.


(a) Apple does things on a long timetable. Apple is planning for ten years from now. If your worldview extends to three days from now, yes you won't understand what they are doing.

(b) Apple operates on the assumption (largely validated...) that they can do many things better than anyone else.
Part of it is that they have total control of the hardware, the OS, the compilers, the APIs, so they are willing to add functionality to their CPU/GPU/NPU/... that others wouldn't add because "no existing apps will use it, and no-one will pay for it". Look at them adding the U1 to iPhones. Been sitting there mostly unused for a year; but it's part of that long term plan.
Another part of this is that Apple is willing to drop any parts of the past that hold them back or cause problems. And, sure, that creates a little hassle every year. But the payoff is immense, for Apple, developers, and users. Other companies will not accept that short term pain for the long term gain.

So, by Apple having their own GPU, they can drop various AMD functionality that's only there to support some aspect of Direct3D or OpenGL from ten years ago, or that's there so AMD can sell the GPU as an NPU; while replacing it with eg functionality that accelerates AR (ray tracing?, physics engines?)

This is not about "Apple can save money by not having to pay AMD (or Intel)". It's that Apple wants a rocket to the moon, while AMD and Intel think a daring step forward is designing a boat to cross the English Channel.
Apple are using their 'own' GPU IP (I mean, it's heavily tweaked Imagination IP, but you get the idea) in the SoCs they build, but for the immediate future they aren't replacing Radeon dGPUs. Yet.

That could change in 3-4 years, but before then no.
 
  • Like
Reactions: Tlh97 and soresu

soresu

Golden Member
Dec 19, 2014
1,179
443
136
So, by Apple having their own GPU, they can drop various AMD functionality that's only there to support some aspect of Direct3D or OpenGL from ten years ago, or that's there so AMD can sell the GPU as an NPU; while replacing it with eg functionality that accelerates AR (ray tracing?, physics engines?)
You must have missed the point of the CDNA announcement earlier this year.

That architecture divergence essentially creates a non-graphics-focused compute accelerator.

As for "they can drop various AMD functionality that's only there to support some aspect of Direct3D or OpenGL from ten years ago" - the fact that Vulkan can be translated to Metal (MoltenVK) and DX9/10/11/12 to Vulkan (DXVK and VKD3D) implies that Apple favors supporting exactly the same functionality.

The only difference is that they are doing it in their own particular (and annoying to non-Apple-exclusive devs) proprietary and controllable platform, so that they can deprecate all OpenGL support from their platform going forward and lose the headaches associated with that support infrastructure.

It should be noted that despite a short bust-up early on, MoltenVK has not been blocked by Apple - they are happy to let devs use Vulkan to code for their OSes so long as that code is executed through the Metal API that they control.

Metal is consistent across iOS, iPadOS and macOS - they are not magically moving away from mainstream GPU feature set support here, just from the formerly mainstream, open API platforms that offer that feature set.

Deprecating those features completely would probably mean certain software ceasing to work, Rosetta or not - including the big apps from DCC giants Adobe and Autodesk.

Likewise Vulkan is already evolving to support ray tracing - no doubt the next increment of Metal will support it also in a similar fashion, likely running on a derivative of the PowerVR Wizard (nee Caustic Labs) traversal/intersection hardware, at least on mobile.

As far as Apple having the R&D capabilities to create a capable GPU, I wouldn't at all say it is impossible - I only said it was unlikely to tackle high end and chiplets at the same time.

I also stated that patents are a major stumbling block to Apple Silicon™-branded GPUs, unless they buy out a major GPU vendor, that is.

The fact of the matter is that AMD is doing well enough now that beating them would be no simple matter - therefore any homegrown competitor would have to have a good 25 to 50% advantage for a given node to be worth the investment.

Plus AMD already provides such a strong custom silicon design infrastructure that it is as good as vertically integrated if the customer wants it to be - this fact is borne out by the strength of the Sony/MS console silicon relationships with AMD over the last decade.
 
Last edited:
  • Like
Reactions: Tlh97

soresu

Golden Member
Dec 19, 2014
1,179
443
136
This is not about "Apple can save money by not having to pay AMD (or Intel)". It's that Apple wants a rocket to the moon, while AMD and Intel think a daring step forward is designing a boat to cross the English Channel.
Also, wuh?

Again - patents, patents, patents.

It was always going to come back to this; it doesn't matter what ambitions they have - patents will always be a giant stumbling block to having their own GPU IP free from another company's fingerprints.

It's why Intel licenses from nVidia, and why Apple have gone back to IMG Tec.

While we're on the subject of IMG Tec, let's just linger there a little on your chosen pedestal of heavenly IT glory, and their particular actions concerning this relatively tiny company.

Here is Apple, a truly giant, trillion-dollar corporation, sweating pennies in royalties paid to a tiny company that they could buy and sell a hundred times over considering their yearly profits - sweating so badly that they purposely devalued it to the point that it had to be bailed out by a Chinese-backed venture capital fund, and only then renewed their IP licensing, when the company's leadership was likely desperate to make a deal, hoping to get IMG Tec's RT acceleration IP for dirt cheap during the new RT gold rush.

Oh man, and here I was thinking that nVidia and Intel had the pole position on crappy business practices in the IT world - Apple sure took the cake here, with extra rot infused frosting on the side.

I can see why you are so hung up on it, from my perspective the sole truly innovative thing that Apple have done in the last 10 years was that ARM core, so I suppose it is bound to stick out.

Everything else was just following in the wake of others with a flashier, more expensive, and usually less impressive (ie cheaper BOM) offering, and often after brushing off the attraction of said features for months to years, before adopting them for their own products and declaring it to be the best thing since sliced bread.

So yeah, real rocket to the moon stuff right there, hold me while I shake in awe at their Einsteinian leaps in ingenuity. /s
 

jpiniero

Diamond Member
Oct 1, 2010
7,789
1,114
126
I also stated that patents are a major stumbling block to Apple Silicon™-branded GPUs, unless they buy out a major GPU vendor, that is.
They don't need dGPUs if they make the IGP powerful enough to fit their needs. Note that their needs don't mean the absolute highest frame rates.
 

soresu

Golden Member
Dec 19, 2014
1,179
443
136
They don't need dGPUs if they make the IGP powerful enough to fit their needs. Note that their needs don't mean the absolute highest frame rates.
The pursuit of an exclusive dual-GPU SKU for the recent Mac Pro shows frame rates are not, and have never really been, on their minds - except perhaps for video editing at 4K, 6K and 8K+ resolutions.

Supposedly Otoy are working on a Metal version of Octane Render that will take full advantage of those insane 2x2-GPU Mac Pros, though it has yet to materialise, much like the previously promised but cancelled Vulkan version and the very short-lived OpenCL version (if you can't read my fatalism here, you aren't looking very hard).
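(For what it's worth, the multi-GPU part is trivially visible from the public API - a minimal Swift sketch, macOS only, that just enumerates whatever GPUs the box exposes; how a renderer like Octane would actually split work across them is the hard part and isn't shown here.)

```swift
import Metal

// Minimal sketch (macOS): list every GPU the system exposes, e.g. the four
// dies in a Mac Pro fitted with two dual-GPU MPX modules.
for gpu in MTLCopyAllDevices() {
    print("\(gpu.name) - removable: \(gpu.isRemovable), low power: \(gpu.isLowPower)")
}
```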

In short, if they pursue high-end dGPUs it's for pro/workstation work, though that was always the point of the Mac Pro anyway.
 
Last edited:

dmens

Golden Member
Mar 18, 2005
1,933
208
106
and why Apple have gone back to IMG Tec.

While we're on the subject of IMG Tec, let's just linger there a little on your chosen pedestal of heavenly IT glory, and their particular actions concerning this relatively tiny company.

Here is Apple, a truly giant, trillion-dollar corporation, sweating pennies in royalties paid to a tiny company that they could buy and sell a hundred times over considering their yearly profits - sweating so badly that they purposely devalued it to the point that it had to be bailed out by a Chinese-backed venture capital fund, and only then renewed their IP licensing, when the company's leadership was likely desperate to make a deal, hoping to get IMG Tec's RT acceleration IP for dirt cheap during the new RT gold rush.
LOL. You have no idea what you're talking about.
 

Tup3x

Senior member
Dec 31, 2016
311
128
86
I wonder if Apple is planning to make docking solutions for their phones and tablets in the future... They'd need to combine iOS and MacOS but the hardware isn't a problem anymore. Phone would have more than enough power to handle basic office tasks and web surfing.
 

jpiniero

Diamond Member
Oct 1, 2010
7,789
1,114
126
I wonder if Apple is planning to make docking solutions for their phones and tablets in the future... They'd need to combine iOS and MacOS but the hardware isn't a problem anymore. Phone would have more than enough power to handle basic office tasks and web surfing.
Apple of course wants people to buy iPhones AND an iPad AND an MBP. That alone makes it in their best interest not to do docking. But they will do just about everything else, including encouraging developers to make apps work on both interfaces.
 

soresu

Golden Member
Dec 19, 2014
1,179
443
136
While this patent means that such a product is possible, it is not necessarily likely to be brought.

Even major Android vendors haven't taken this angle seriously because it is a clunky solution at best - and Apple are nothing if not style first, this just feels too clunky for something that Apple would go for in practice.

Last edited:
  • Like
Reactions: lightmanek

Eug

Lifer
Mar 11, 2000
22,698
301
126
While this patent means that such a product is possible, it is not necessarily likely to be brought.

Even major Android vendors haven't taken this angle seriously because it is a clunky solution at best - and Apple are nothing if not style first, this just feels too clunky for something that Apple would go for in practice.
I don’t think Apple will do it with the iPhone either but they have already effectively done it with the iPad. It’s not macOS but a mature iPadOS with Magic Trackpad that is basically a docking system with keyboard and trackpad. The iPad just clicks in by magnets and becomes a screen, or you can just lift it off for it to become a bare tablet again. Even the entry level iPad 7 has a full-size keyboard solution now, that you can click on and off at will.

I have an iPad Pro 10.5” (2017) with Apple Smart Keyboard (no trackpad) and a 2017 MacBook 12” and I use the iPad Pro 95% of the time these days. However, that’s partially because I have an iMac at home so I have no use for a MacBook at home.
 

soresu

Golden Member
Dec 19, 2014
1,179
443
136
I don’t think Apple will do it with the iPhone either but they have already effectively done it with the iPad. It’s not macOS but a mature iPadOS with Magic Trackpad that is basically a docking system with keyboard and trackpad. The iPad just clicks in by magnets and becomes a screen, or you can just lift it off for it to become a bare tablet again. Even the entry level iPad 7 has a full-size keyboard solution now, that you can click on and off at will.
Ah that's different, the keyboard/trackpad being just a bare accessory (with an admittedly nice dock method) rather than a full blown dock as in the patent, or those lapdock things for Android phones.
 

Tup3x

Senior member
Dec 31, 2016
311
128
86
Ah that's different, the keyboard/trackpad being just a bare accessory (with an admittedly nice dock method) rather than a full blown dock as in the patent, or those lapdock things for Android phones.
The real reason is the software. Android software just isn't meant for mouse use (my P30 has a desktop mode, but there's really not much point in using it). If Apple provided a full macOS experience when connected to a separate display with keyboard and mouse (or some kind of laptop dock), it would actually be useful.

I must admit that it's my dream that some day my phone would have enough power so that I could just keep a dock at work and at home. I don't like commuting with a laptop in my backpack...

Maybe some day... :pensive:
 

soresu

Golden Member
Dec 19, 2014
1,179
443
136
I must admit that it's my dream that some day my phone would have enough power so that I could just keep a dock at work and at home. I don't like commuting with a laptop in my backpack...
You can do that already for a lot of work, even with something relatively old like the A72.

The actual CPU cores in flagship phones are more than capable of doing much of the work people do today, given an unrestrained power supply through docking - though thermally it's maybe not the greatest idea to have them running full tilt for long stretches without a larger heatsink.

This is why I am patiently waiting for an RK3588 or Amlogic S908X board, though COVID seems to have slowed their release somewhat.
 

Doug S

Member
Feb 8, 2020
143
172
76
I wonder if Apple is planning to make docking solutions for their phones and tablets in the future... They'd need to combine iOS and MacOS but the hardware isn't a problem anymore. Phone would have more than enough power to handle basic office tasks and web surfing.
I think it's possible, but only they can run the numbers internally to see if they work out. It would obviously cost some Mac sales, but it would likely greatly increase the macOS userbase even if fewer actual Macs were sold. I've suggested Apple might do something like this for years, even before Continuum and whatever Samsung calls their version of it came out. It is a pretty obvious idea once phone SoCs are fast enough, which they have been for a while now, but it really needs macOS to run on ARM64 first, so it hasn't been feasible before.

They wouldn't need to combine iOS and macOS; they would just need a macOS "app" that runs the full macOS GUI and has all the libraries and APIs available to it, so every ARM Mac application can run on it. Have a little breakout Lightning cable that includes an HDMI port and a few USB ports for a keyboard/mouse and a USB stick for when you need old-fashioned 'sneakernet' data sharing. The cable would come with a personal/educational license to download and run the macOS app on your iPhone, with an extra charge for a commercial license to run it in a business setting.

This would be great for a lot of light PC users. Use the TV you already have as a monitor, and a keyboard/mouse would be all you need to turn your phone into a full-fledged desktop computer able to run software that doesn't translate well to a small screen, like word processing, spreadsheets, tax prep, etc. Most people already have a keyboard/mouse lying around from an old PC, so other than the cost of the dongle it would basically be free for personal use.
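(The display half of this already exists in iOS today, for what it's worth. A rough Swift sketch of driving a TV over one of those HDMI adapters by hanging a second UIWindow on the external screen - ExternalDisplayObserver is just an illustrative name, and the hypothetical full-macOS "app" is obviously the part that doesn't exist.)

```swift
import UIKit

// Rough sketch: when a display appears over an HDMI adapter, attach a second
// window to it, and tear it down when the display goes away.
// (On iOS 13+ you'd normally do this with window scenes instead.)
final class ExternalDisplayObserver {
    private var externalWindow: UIWindow?
    private var tokens: [NSObjectProtocol] = []

    init() {
        tokens.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didConnectNotification, object: nil, queue: .main,
            using: { [weak self] note in
                guard let screen = note.object as? UIScreen else { return }
                let window = UIWindow(frame: screen.bounds)
                window.screen = screen                      // route this window to the TV
                window.rootViewController = UIViewController()
                window.isHidden = false
                self?.externalWindow = window
            }))
        tokens.append(NotificationCenter.default.addObserver(
            forName: UIScreen.didDisconnectNotification, object: nil, queue: .main,
            using: { [weak self] _ in
                self?.externalWindow = nil                  // drop it when unplugged
            }))
    }

    deinit {
        tokens.forEach { NotificationCenter.default.removeObserver($0) }
    }
}
```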

Think of all the college students who have an iPhone but don't have a Mac who would become Mac users and potential customers in the future. Imagine you're taking a business trip, and instead of bringing a laptop bag you only need to bring your Lightning dongle because you can rely on being able to check out a keyboard/mouse from the hotel's business center and hook it up to your room's TV. When you visit an office the next day they have hot desks with monitor/keyboard/mouse you can connect to.

Microsoft could have really benefitted from this if they hadn't bungled Windows Phone and Intel hadn't bungled mobile x86. Microsoft doesn't have any PC business to cut into like Apple does, so it would be a no-brainer slam dunk for them. I guess that's probably why they introduced Continuum, they'd probably been working on it for years assuming Windows Phone running on x86 SoCs would be around for it when it was complete, and didn't want to waste all that effort.

The big question is whether the numbers work for Apple since unlike Microsoft they would be cutting their own throat to some extent. Obviously it reduces Mac sales among current macOS users as some will find this capability good enough for their needs. Will the growth in the macOS platform from the infusion of all the new users result in enough future sales to make it worth it? Or does it make the Mac even more of a niche product, and go from 5% to 2% of the PC market long term?
 
