Discussion: Quo vadis Apple Macs - Intel, AMD and/or ARM CPUs? ARM it is!

moinmoin

Diamond Member
Jun 1, 2017
5,064
8,032
136
Due to popular demand I thought somebody should start a proper thread on this pervasive topic. So why not do it myself? ;)

For nearly a decade now Apple has treated their line of Mac laptops, AIOs and Pro workstations as something of a stepchild. Their iOS line of products has surpassed it in market size and profit. Their dedicated Mac hardware group was dissolved. Hardware and software updates have been lackluster.

But for Intel, Apple clearly is still a major customer, one that still gets custom chips not to be had outside of Apple products. Intel is clearly eager to keep Apple as a major showcase customer at all costs.

On the high end of performance, Apple's few efforts to create technologically impressive products using Intel parts increasingly fall flat. The 3rd gen of Mac Pros going up to 28 cores could have wowed the audience in earlier years, but when launched in 2019 it already faced 32-core Threadripper/Epyc parts, with 64-core updates of those already on the horizon. A similar fate appears to be coming for the laptops as well, with Ryzen Mobile 4000 besting comparable Intel solutions across the board and run-of-the-mill OEMs bound to surpass Apple products in battery life. A switch to AMD shouldn't even be a big step considering Apple already has a close working relationship with them, sourcing custom GPUs from them like it does CPUs from Intel.

On the low end, Apple is pushing iPadOS into becoming a workable multitasking system, with decent keyboard and, most recently, mouse support. Considering the much bigger audience familiar with the iOS mobile interface and App Store, it may make sense to eventually offer a laptop form factor using the already tweaked iPadOS.

By the look of things, Apple's Mac products are due to continue stagnating. But just as for Intel, the status quo for Mac products feels increasingly untenable.
 
  • Like
Reactions: Vattila

Eug

Lifer
Mar 11, 2000
23,807
1,385
126
P.S. Star Trek: Discovery is edited on a 2013 Mac Pro trash can. That's a seven-year-old machine. What's even more surprising, though, is that the "desk" the computer sits on during Covid is a plastic foldable portable table. :p


I'm not sure which CPU he has in the Mac Pro, but the iPad Pro A14X SoC coming within the year will likely be almost as fast in CPU performance as a mid-tier 8-core Mac Pro of that generation. Only the 12-core model would still be significantly faster.

Obviously I'm not recommending that a broadcast TV video editor buying a new computer for work today should buy such a machine, but it does illustrate the performance improvements Apple has made in its Arm processors over this period, and how well prepared Apple is for the transition.

Apple simply does not need boutique chips for most MacBook Pros and iMacs. Their existing chip categories are already good enough for most pro users. It's really only at the very high end that Apple needs faster chips, but I think it's pretty safe to say Apple can make those too if it decides it wants to. They will be comparatively more costly for Apple to make, but that's OK, because those are more expensive machines anyway. Instead of paying Intel for high-end i7, i9, and Xeon chips, it can keep everything in-house (aside from paying TSMC for fab services).
 

jpiniero

Lifer
Oct 1, 2010
15,168
5,698
136
FWIW, it does sound like Apple is also ditching Radeon dGPUs, based on what they are telling developers. This is not definitive, but it does make sense with them moving to their own CPUs.
 
  • Like
Reactions: Etain05

soresu

Diamond Member
Dec 19, 2014
3,208
2,480
136
FWIW, it does sound like Apple is also ditching Radeon dGPUs, based on what they are telling developers. This is not definitive, but it does make sense with them moving to their own CPUs.
It's a huge jump from scaling up a pre-existing and proven CPU core to introducing a dGPU at the level they scale up to on the current Mac Pro (2x Vega II Duos, or 4x Vega II GPUs).

The existing GPU core in the Axx SoCs is basically a stripped-down and souped-up PowerVR design that hasn't been used for any truly significant compute work, as no such application was available on iOS or its derivatives.

I have no doubt that they will try for a custom GPU, but this is a fraught minefield for them with all the GPU patents floating about - the fact that they signed back up to PowerVR is apparent proof that they could not simply make a GPU that fit outside of the box that existing patents and compute needs constrain it to.

OTOH they have tremendous leeway with AMD semi-custom designs, and currently they have really not pushed that beyond some basic high-efficiency HBM SKUs and the Vega II Duos - they may well just go for a more extreme custom design; after all, licensing from Imagination does not prevent them from having AMD co-designs too, à la the Sony/MS console GPUs.

Especially if AMD does indeed have chiplet GPUs coming and has disclosed those plans to semi-custom partners.
 
  • Like
Reactions: Tlh97

Doug S

Platinum Member
Feb 8, 2020
2,742
4,667
136
If Apple really is going to use their own GPU design for the Mac Pro they must already have prototyped some high end GPUs and determined the performance is competitive. There's no way they'd do it if they weren't confident of that.

It still isn't clear what the new license with Imagination was about, but I have to think it is simply patent protection. After making a big deal about designing their own GPUs they wouldn't go back to using someone else's designs unless they hit a pretty big roadblock and were truly desperate. Imagination doesn't have experience with designs at the high end so they aren't likely to be of much help with the Mac Pro GPU.

Perhaps it is possible they might license the Mac Pro GPU from AMD, at least for the first gen, if they aren't able to compete at that level yet. But I question whether AMD would be very amenable to that. Today AMD has all of Apple's discrete GPU business, if they help Apple they would be helping Apple become their former customer. If they know Apple has a missing piece of "what to do about the Mac Pro" and they don't help them, maybe it takes Apple another year or two before they are ready to switch. On the other hand, helping Apple leave x86 hurts Intel so maybe it is an "enemy of my enemy" situation lol
 
  • Like
Reactions: Tlh97

mikegg

Golden Member
Jan 30, 2010
1,835
459
136
If Apple really is going to use their own GPU design for the Mac Pro they must already have prototyped some high end GPUs and determined the performance is competitive. There's no way they'd do it if they weren't confident of that.

It still isn't clear what the new license with Imagination was about, but I have to think it is simply patent protection. After making a big deal about designing their own GPUs they wouldn't go back to using someone else's designs unless they hit a pretty big roadblock and were truly desperate. Imagination doesn't have experience with designs at the high end so they aren't likely to be of much help with the Mac Pro GPU.

Perhaps it is possible they might license the Mac Pro GPU from AMD, at least for the first gen, if they aren't able to compete at that level yet. But I question whether AMD would be very amenable to that. Today AMD has all of Apple's discrete GPU business, if they help Apple they would be helping Apple become their former customer. If they know Apple has a missing piece of "what to do about the Mac Pro" and they don't help them, maybe it takes Apple another year or two before they are ready to switch. On the other hand, helping Apple leave x86 hurts Intel so maybe it is an "enemy of my enemy" situation lol
I think Apple will use its own GPUs for Mac laptops but will continue to offer AMD as a choice for iMacs and Mac Pros.

There's no way Apple can whip out a dedicated GPU that can compete with the best of Nvidia and AMD in the next 5 years.
 

Richie Rich

Senior member
Jul 28, 2019
470
229
76
I think Apple will use its own GPUs for Mac laptops but will continue to offer AMD as a choice for iMacs and Mac Pros.

There's no way Apple can whip out a dedicated GPU that can compete with the best of Nvidia and AMD in the next 5 years.
Apple surely can make a dedicated desktop GPU if they want. Their SoC GPUs are brutally efficient: Apple's GPU delivers 4x the FPS per watt of NVIDIA and Radeon parts, just look at the measurements.

The GPU in the A12X is literally beating older desktop-class Radeon 6000 and 7000 series cards.

There is nothing holding them back from adding more CUs. GPUs scale pretty well once you have a good architecture, and Apple has the best mobile architecture on the market. The same big bang that happened with their ARM CPUs can easily happen with GPUs too. The tough mobile environment forced them to develop a much more efficient and better architecture (valid for both CPUs and GPUs). Lazy desktop chipmakers like Nvidia and AMD will eat the sour fruits of their own laziness. It's that simple.
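To put that efficiency claim in perspective, here is a back-of-envelope sketch in Swift with made-up numbers (nothing below is a measurement): at a fixed power budget, a perf-per-watt advantage turns directly into a performance multiplier, assuming performance scales by adding compute units rather than chasing clocks.

```swift
// Back-of-envelope arithmetic only; all numbers are hypothetical placeholders.
let powerBudgetWatts = 200.0          // assumed desktop card power budget
let desktopFpsPerWatt = 1.0           // assumed baseline efficiency
let mobileDerivedFpsPerWatt = 4.0     // the "4x" claim, taken at face value

// At iso-power, relative performance is just efficiency x power budget.
let desktopFps = desktopFpsPerWatt * powerBudgetWatts
let scaledUpMobileFps = mobileDerivedFpsPerWatt * powerBudgetWatts

print("baseline: \(desktopFps) relative FPS, scaled-up mobile design: \(scaledUpMobileFps) relative FPS")
// In practice memory bandwidth, clock scaling and driver maturity erode some
// of this, so treat it as an upper bound, not a prediction.
```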

ML and AI? Apple again has the best-in-class NPU for that. NVIDIA was so bad in ML/AI that Tesla dumped Nvidia and created their own silicon instead (with help from Jim Keller). AMD is much worse than NV in ML. The question is whether a GPU can be as good as a dedicated NPU; I really doubt it.

An ARM MacBook is a game changer: always-on functionality thanks to little cores, like a smartphone; face recognition to unlock the screen and speech-to-text thanks to the NPU. Current desktops are last-century technology on steroids - somehow still powerful, but really outdated, like dinosaurs.
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
22,027
11,607
136
Apple: We'll transition the Mac to ARM.
Forum: Apple is making server chips and discrete GPUs.

I am genuinely amazed we don't have a dedicated Apple console thread by now.

I agree with where you're going with this. There's no real indicator that Apple is trying to replace every class of Intel chip they've used in Mac products in the past (much less dGPUs). They may be abandoning certain market segments, or at least changing how they serve them.
 

mikegg

Golden Member
Jan 30, 2010
1,835
459
136
ML and AI? Apple again has the best-in-class NPU for that. NVIDIA was so bad in ML/AI that Tesla dumped Nvidia and created their own silicon instead (with help from Jim Keller). AMD is much worse than NV in ML. The question is whether a GPU can be as good as a dedicated NPU; I really doubt it.
Relax. Nvidia is the standard in A.I. acceleration. There is a huge difference between tiny ML/AI inference acceleration on a phone and Nvidia-class A.I. hardware acceleration with industry-standard APIs.

Running inference with a face-recognition model on an iPhone is different from training a neural network on petabytes of data.
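To make that gap concrete, here is a toy sketch in plain Swift (my own illustration, not anything Apple or Nvidia ships): inference is a single cheap forward pass, while training loops forward pass, gradient and weight update over the whole dataset, which is where datacenter-class accelerators earn their keep.

```swift
// Toy example: fit y = w*x + b by gradient descent, then run one "inference".
// Purely illustrative; real workloads differ by many orders of magnitude.
let samples: [(x: Double, y: Double)] = (0..<1000).map { i in
    let x = Double(i) / 1000.0
    return (x: x, y: 3.0 * x + 0.5)        // hypothetical dataset: y = 3x + 0.5
}

var w = 0.0
var b = 0.0

// Training: repeated passes over all the data (the expensive part).
for _ in 0..<200 {
    var dw = 0.0
    var db = 0.0
    for (x, y) in samples {
        let err = (w * x + b) - y
        dw += 2.0 * err * x / Double(samples.count)
        db += 2.0 * err / Double(samples.count)
    }
    w -= 0.5 * dw
    b -= 0.5 * db
}

// Inference: one cheap forward pass, the kind of work a phone NPU handles.
let prediction = w * 0.25 + b
print("w ≈ \(w), b ≈ \(b), prediction(0.25) ≈ \(prediction)")
```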

I guarantee you Apple employees are using Nvidia cards to do internal A.I. work.

You can be bullish on Apple's SoC designing capabilities, but you don't have to be foolish.
 
Last edited:
  • Like
Reactions: Glo. and beginner99

mikegg

Golden Member
Jan 30, 2010
1,835
459
136
I agree with where you're going with this. There's no real indicator that Apple is trying to replace every class of Intel chip they've used in Mac products in the past (much less dGPUs). They may be abandoning certain market segments, or at least changing how they serve them.
The most important thing to Apple is a unified SoC architecture so iOS, iPadOS, and macOS can share applications.

The second is performance.

Apple won't be able to match everything that AMD, Nvidia, and Intel can provide right from the start. But they're OK with that because their main goal is to unify their OSes.
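For what that sharing looks like at the source level, here is a minimal Swift sketch (my own illustration, not Apple sample code): the same code builds for iOS, iPadOS and macOS, with small per-OS and per-architecture forks handled at compile time.

```swift
// One codebase, several OSes and architectures, resolved at compile time.
import Foundation

func platformDescription() -> String {
    #if os(macOS)
    let osName = "macOS"
    #elseif os(iOS)
    let osName = "iOS/iPadOS"
    #else
    let osName = "other"
    #endif

    #if arch(arm64)
    let cpuArch = "arm64 (Apple silicon / A-series)"
    #elseif arch(x86_64)
    let cpuArch = "x86_64 (Intel)"
    #else
    let cpuArch = "unknown"
    #endif

    return "\(osName) on \(cpuArch)"
}

print("Running on", platformDescription())
```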
 

blckgrffn

Diamond Member
May 1, 2003
9,293
3,435
136
www.teamjuchems.com
Apple: We'll transition the Mac to ARM.
Forum: Apple is making server chips and discrete GPUs.

I am genuinely amazed we don't have a dedicated Apple console thread by now.

I think the last couple of ATV launches must have flown under your radar. Did you miss how those are game consoles?

And how they will leverage the App Store to bring in devs and immediately have a huge impact on the market?

I don't know about here because I was on a hiatus, but a good part of the Internet seemed really enthused about this possibility.

Just think of it: next-gen consoles come out and Apple is there and ready too, with two years of free Apple TV and an iPad SoC in a box (Apple TV 8K!) that costs just as much as the PS5 and the Xbox SX... how could we say no?!? :p
 

soresu

Diamond Member
Dec 19, 2014
3,208
2,480
136
Perhaps it is possible they might license the Mac Pro GPU from AMD, at least for the first gen, if they aren't able to compete at that level yet. But I question whether AMD would be very amenable to that. Today AMD has all of Apple's discrete GPU business, if they help Apple they would be helping Apple become their former customer.
Licensing is not necessary.

The relationships AMD have with Sony and MS show that a semi-custom arrangement can be far more than just sticking some HBM on a custom SKU of their off-the-shelf uArchs.

Unlike nVidia, who are essentially providing a pre-designed (and rapidly aging) chip with the TX1 for the Switch, AMD have provided highly customised designs for their console partners - as Apple, why bother to make such a massive investment to create a whole new high end GPU division when they have a partner doing all that custom work for them?

Personally I would not be inclined to go off on a tangent with a new design, patents or not - AMD are just getting started with RDNA2 IMHO, given their design strategies are aligning to Zen and passing learning from one project back to the other.

nVidia are probably thinking of Zen2 and the possible future of RDNA when designing for Hopper - while AMD are already evolving the physical chiplet architecture going from Zen3 to Zen4 and beyond, I don't expect the RDNA chiplet evolution to look like Zen2 at all.
 

Doug S

Platinum Member
Feb 8, 2020
2,742
4,667
136
You complain about "rapidly aging" with NVidia and then talk about customized console designs in the same sentence? You realize that console design likely won't change from the launch of the PS5 to its last sale, other than following TSMC's process shrinks to reduce power use and cost, right?

The one time effort to do a customized console design for the PS5, which may sell 50 or 100 million units over its lifetime of 6-9 years, is very different from a yearly or every other year effort to do a customized GPU for Apple for the Mac Pro / iMac Pro which may sell a million or so units a year. At least Apple's price point for the Mac Pro is a lot higher than the PS5 so they can afford to pay AMD more per unit, but still...
 

blckgrffn

Diamond Member
May 1, 2003
9,293
3,435
136
www.teamjuchems.com
You complain about "rapidly aging" with NVidia and then talk about customized console designs in the same sentence? You realize that console design likely won't change from the launch of the PS5 to its last sale, other than following TSMC's process shrinks to reduce power use and cost, right?

The one time effort to do a customized console design for the PS5, which may sell 50 or 100 million units over its lifetime of 6-9 years, is very different from a yearly or every other year effort to do a customized GPU for Apple for the Mac Pro / iMac Pro which may sell a million or so units a year. At least Apple's price point for the Mac Pro is a lot higher than the PS5 so they can afford to pay AMD more per unit, but still...

I think it probably has more to do with Nintendo looking to do a certain form factor, make money on each unit sold from the outset, etc. One could argue they used an aged design from the outset in the Switch. Then they had the gall to run it at low CPU clock speeds. But that has been their way for the last several generations. I've often wondered what might have been if Nintendo had used sane branding for the Wii U (like Wii 2, for example) and used a Bobcat-based APU that had at least been rumored at the time. It would have been a much more capable mid-generational unit and been able to handle One & PS4 ports much more capably. ¯\_(ツ)_/¯

Sony & MS at least believe they have to compete in a specifications war to gather their sales. It would be interesting if Sony were to take a more conservative hardware approach and simply rely on their exclusives as much as Nintendo has. Would that be enough for mass defection to the other console? I guess they must at least believe that this is true.

I think it's really likely Sony and MS push toward shorter console generation lifetimes (to keep knives at each other's throats), maybe as short as two or three years between refreshes - but maintain backward compatibility for two years or whatever, as MS has promised to keep the OG One and One S relevant for two more years. PS4 to PS4 Pro/Slim was 3 years.

I think your point is really legit though. I mean, the current AMD cards in Mac Pros seem to be all about giant amounts of high bandwidth memory - the actual GPUs being powerful but not cutting edge. If they stay with that, they could really space out refreshes or launch the next Mac Pro with the same designs they are using and have validated now. I guess then it comes down to how long AMD is willing to make them.

Not sure this adds anything to the discussion, but I spent too long writing this...
 

jpiniero

Lifer
Oct 1, 2010
15,168
5,698
136
AMD have provided highly customised designs for their console partners - as Apple, why bother to make such a massive investment to create a whole new high end GPU division when they have a partner doing all that custom work for them?

You could say that about the CPUs.

Apple uses dGPUs mostly for the compute power and not actual gaming performance. I certainly wouldn't underestimate Apple's ability to get a reasonable chiplet GPU strategy working, and that might be good enough for their needs. They can always supplement that with accelerators on the Mac Pro, from AMD or whoever.
 

soresu

Diamond Member
Dec 19, 2014
3,208
2,480
136
I certainly wouldn't underestimate Apple's ability to get a reasonable chiplet GPU strategy working, and that might be good enough for their needs.
To underestimate something you first need a measurement on which to base an estimate, and for that there is nothing at all currently.

This is simply pessimism on my part - going with chiplets and the high end at the same time for a GPU is a huge jump from their current mobile semi-custom offerings.

The fact that they actually re-licensed with IMG Tec implies to me that they will continue to do PowerVR semi-custom for mobile/tablet SoCs, and likewise continue a semi-custom relationship with AMD for the high end.

The fact of the matter is that the legal side of the GPU design world is less about ISA licenses and more about specific feature patents from what I have gathered over the years - patents which have largely been filed by AMD, nVidia, (Intel?) or one of the current mobile GPU designers.

The reality of this situation is what truly confuses me about Apple's strange devaluing behaviour towards IMG Tec - why not buy it to achieve patent protection? Especially after they did such a terrific job of devaluing that poor company that it was easy prey for a Chinese-backed business interest.
 
  • Like
Reactions: moinmoin and Tlh97

name99

Senior member
Sep 11, 2010
496
382
136
AMD have provided highly customised designs for their console partners - as Apple, why bother to make such a massive investment to create a whole new high end GPU division when they have a partner doing all that custom work for them?

Personally I would not be inclined to go off on a tangent with a new design, patents or not - AMD are just getting started with RDNA2 IMHO, given their design strategies are aligning to Zen and passing learning from one project back to the other.

Which part of "Apple Silicon" did you not understand?

Apple isn't going to spend the entire WWDC constantly pushing the idea that this is APPLE SILICON (and in every talk, mentioning specific details like how to optimize for tile rendering, or the advantages of a single pool of RAM) only to then say "just kidding, what we meant was Apple+AMD silicon"...
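(As a concrete illustration of the single-pool-of-RAM point, here is a minimal Metal sketch in Swift of my own devising, not Apple sample code: on a unified-memory SoC, one buffer allocated with .storageModeShared is visible to both CPU and GPU, with no copy out to discrete VRAM.)

```swift
import Metal

// Minimal sketch: allocate one buffer the CPU and GPU can both see directly
// on unified-memory hardware. Illustrative only.
if let device = MTLCreateSystemDefaultDevice() {
    let input: [Float] = (0..<1024).map { Float($0) }
    let buffer = device.makeBuffer(bytes: input,
                                   length: input.count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)
    // A compute kernel could read/write `buffer` in place; the CPU then sees
    // the results through buffer?.contents() without a blit back from VRAM.
    print(buffer?.length ?? 0, "bytes shared between CPU and GPU")
}
```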

Honestly, this is nothing but a replay of the CPU argument. Remember how that went?
"Apple is on track to exceed Intel performance".
"No they aren't"
"Why not? Give me technical reasons."
"Well Intel has experience. And fabs. And x86 magic pixie dust."

Now we are hearing the same arguments based on anything but actual technology nous regarding the GPU, and soon we'll hear yet a third version of it based on how Apple might have a good core but they will not be able to scale it up to a Xeon number of cores because reasons.


(a) Apple does things on a long timetable. Apple is planning for ten years from now. If your worldview extends to three days from now, yes you won't understand what they are doing.

(b) Apple operates on the assumption (largely validated...) that they can do many things better than anyone else.
Part of it is that they have total control of the hardware, the OS, the compilers, the APIs, so they are willing to add functionality to their CPU/GPU/NPU/... that others wouldn't add because "no existing apps will use it, and no-one will pay for it". Look at them adding the U1 to iPhones. It's been sitting there mostly unused for a year, but it's part of that long-term plan.
Another part of this is that Apple is willing to drop any parts of the past that hold them back or cause problems. And, sure, that creates a little hassle every year. But the payoff is immense, for Apple, developers, and users. Other companies will not accept that short term pain for the long term gain.

So, by Apple having their own GPU, they can drop various AMD functionality that's only there to support some aspect of Direct3D or OpenGL from ten years ago, or that's there so AMD can sell the GPU as an NPU; while replacing it with eg functionality that accelerates AR (ray tracing?, physics engines?)

This is not about "Apple can save money by not having to pay AMD (or Intel)". It's about Apple wants a rocket to the moon, while AMD and Intel think a daring step forward is design a boat to cross the English Channel.
 

Doug S

Platinum Member
Feb 8, 2020
2,742
4,667
136
The reality of this situation is what truly confuses me about Apple's strange devaluing behaviour towards IMG Tec - why not buy it to achieve patent protection? Especially after they did such a terrific job of devaluing that poor company that it was easy prey for a Chinese-backed business interest.

The fact that Apple could have bought them but ended up taking a license (for what, we don't really know) means that the price they are paying for that license (added up over however many years they intend to pay it) is a lot less than the purchase price would have been.

They only "devalued that poor company" because the majority of Imagination's business was dependent on Apple. If they had been more successful they would have had more customers and wouldn't have been affected by Apple leaving. Look at Intel: Apple is a single-digit percentage of their overall business, so they will barely notice Apple's absence. It didn't affect Samsung much at all when Apple switched to TSMC's foundry.

Little companies that have all or most of their eggs in the basket of one customer are always hanging by a thread and at the mercy of that customer; that's the nature of the relationship when you are a small fry selling to a huge multinational. If you have a big steak restaurant that buys meat from a local butcher and then decide to take your business elsewhere, that little butcher shop will lose a lot of revenue and might go under. Do you think you should have an obligation to keep going to him forever to help him survive? Apple didn't deliberately devalue Imagination any more than you as the owner of that big steak restaurant would deliberately make the butcher shop go under.

There are thousands of small companies who would see their yearly sales double or even go up by 10x if Apple became their customer. They would equally be at risk if Apple later stopped being their customer, but most of them would still take the deal. It is their choice whether they enter into a deal with Apple, and they know that Apple can switch suppliers or insource what they are selling without warning (beyond contractual guarantees of notice).
 
  • Like
Reactions: beginner99

uzzi38

Platinum Member
Oct 16, 2019
2,705
6,427
146
Which part of "Apple Silicon" did you not understand?

Apple isn't going to spend the entire WWDC constantly pushing the idea that this is APPLE SILICON (and in every talk, mentioning specific details like how to optimize for tile rendering, or the advantages of a single pool of RAM) only to then say "just kidding, what we meant was Apple+AMD silicon"...

Honestly, this is nothing but a replay of the CPU argument. Remember how that went?
"Apple is on track to exceed Intel performance".
"No they aren't"
"Why not? Give me technical reasons."
"Well Intel has experience. And fabs. And x86 magic pixie dust."

Now we are hearing the same arguments based on anything but actual technology nous regarding the GPU, and soon we'll hear yet a third version of it based on how Apple might have a good core but they will not be able to scale it up to a Xeon number of cores because reasons.


(a) Apple does things on a long timetable. Apple is planning for ten years from now. If your worldview extends to three days from now, yes you won't understand what they are doing.

(b) Apple operates on the assumption (largely validated...) that they can do many things better than anyone else.
Part of it is that they have total control of the hardware, the OS, the compilers, the APIs, so they are willing to add functionality to their CPU/GPU/NPU/... that others wouldn't add because "no existing apps will use it, and no-one will pay for it". Look at them adding the U1 to iPhones. It's been sitting there mostly unused for a year, but it's part of that long-term plan.
Another part of this is that Apple is willing to drop any parts of the past that hold them back or cause problems. And, sure, that creates a little hassle every year. But the payoff is immense, for Apple, developers, and users. Other companies will not accept that short term pain for the long term gain.

So, by Apple having their own GPU, they can drop various AMD functionality that's only there to support some aspect of Direct3D or OpenGL from ten years ago, or that's there so AMD can sell the GPU as an NPU; while replacing it with eg functionality that accelerates AR (ray tracing?, physics engines?)

This is not about "Apple can save money by not having to pay AMD (or Intel)". It's about Apple wants a rocket to the moon, while AMD and Intel think a daring step forward is design a boat to cross the English Channel.
Apple are using their 'own' GPU IP (I mean, it's heavily tweaked Imagination IP, but you get the idea) alongside the CPUs they make, but for the immediate future they aren't replacing Radeon for dGPUs. Yet.

That could change in 3-4 years, but before then no.
 
  • Like
Reactions: Tlh97 and soresu

soresu

Diamond Member
Dec 19, 2014
3,208
2,480
136
So, by Apple having their own GPU, they can drop various AMD functionality that's only there to support some aspect of Direct3D or OpenGL from ten years ago, or that's there so AMD can sell the GPU as an NPU; while replacing it with eg functionality that accelerates AR (ray tracing?, physics engines?)
You must have missed the point of the CDNA announcement earlier this year.

That architecture divergence essentially creates a non-graphics-focused compute accelerator.

As for "they can drop various AMD functionality that's only there to support some aspect of Direct3D or OpenGL from ten years ago" - the fact that Vulkan can be translated to Metal (MoltenVK) and DX9/10/11/12 to Vulkan (DXVK and VKD3D) implies that Apple favors supporting exactly the same functionality.

The only difference is that they are doing it on their own particular (and, to non-Apple-exclusive devs, annoying) proprietary and controllable platform, so that they can deprecate all OpenGL support going forward and lose the headaches associated with that support infrastructure.

It should be noted that despite a short bust-up early on, MoltenVK has not been blocked by Apple - they are happy to let devs use Vulkan to code for their OSes so long as that code is executed through the Metal API that they control.

Metal is consistent across iOS, iPadOS and macOS - they are not magically moving away from mainstream GPU feature-set support here, just from the formerly mainstream, open API platforms that offer that feature set.

Deprecating those features completely would probably mean certain software ceasing to work, Rosetta or not - including the big apps from DCC giants Adobe and Autodesk.

Likewise, Vulkan is already evolving to support ray tracing - no doubt the next increment of Metal will support it too in a similar fashion, likely running on a derivative of the PowerVR Wizard (née Caustic Labs) traversal/intersection hardware, at least on mobile.

As far as Apple having the R&D capability to create a capable GPU, I wouldn't at all say it is impossible - I only said it was unlikely they would tackle the high end and chiplets at the same time.

I also stated that patents are a major stumbling block to Apple Silicon™-branded GPUs, unless they buy out a major GPU vendor, that is.

The fact of the matter is that AMD is doing well enough now that beating them would be no simple matter - therefore any homegrown competitor would have to have a good 25 to 50% advantage for a given node to be worth the investment.

Plus AMD already provides such a strong custom silicon design infrastructure that it is as good as vertically integrated if the customer wants it to be - a fact borne out by the strength of the Sony/MS console silicon relationships with AMD over the last decade.
 
Last edited:
  • Like
Reactions: Tlh97

soresu

Diamond Member
Dec 19, 2014
3,208
2,480
136
This is not about "Apple can save money by not having to pay AMD (or Intel)". It's about Apple wants a rocket to the moon, while AMD and Intel think a daring step forward is design a boat to cross the English Channel.
Also, wuh?

Again - patents, patents, patents.

It was always going to come back to this; it doesn't matter what ambitions they have - patents will always be a giant stumbling block to having their own GPU IP free from another company's fingerprints.

It's why Intel licenses from nVidia, and why Apple have gone back to IMG Tec.

While we're on the subject of IMG Tec, let's just linger there a little on your chosen pedestal of heavenly IT glory, and their particular actions concerning this relatively tiny company.

Here is Apple, a truly giant, trillion-dollar corporation, sweating pennies in royalties paid to a tiny company that they could buy and sell a hundred times over considering their yearly profits - sweating so badly that they purposely devalued it to the point that it had to be bailed out by a Chinese-backed venture capital fund, and only then renewed their IP licensing when the company leadership was likely desperate to make a deal, hoping to get IMG Tec's RT acceleration IP for dirt cheap during the new RT gold rush.

Oh man, and here I was thinking that nVidia and Intel had the pole position on crappy business practices in the IT world - Apple sure took the cake here, with extra rot infused frosting on the side.

I can see why you are so hung up on it; from my perspective the sole truly innovative thing Apple have done in the last 10 years was that ARM core, so I suppose it is bound to stick out.

Everything else was just following in the wake of others with a flashier, more expensive, and usually less impressive (ie cheaper BOM) offering, and often after brushing off the attraction of said features for months to years, before adopting them for their own products and declaring it to be the best thing since sliced bread.

So yeah, real rocket to the moon stuff right there, hold me while I shake in awe at their Einsteinian leaps in ingenuity. /s
 

jpiniero

Lifer
Oct 1, 2010
15,168
5,698
136
I also stated that patents are a major stumbling block to Apple Silicon™-branded GPUs, unless they buy out a major GPU vendor, that is.

They don't need dGPUs if they make the IGP powerful enough to fit their needs. Note that their needs don't mean the absolute highest frame rates.
 

soresu

Diamond Member
Dec 19, 2014
3,208
2,480
136
They don't need dGPUs if they make the IGP powerful enough to fit their needs. Note that their needs don't mean the absolute highest frame rates.
The pursuit of an exclusive dual-GPU SKU for the recent Mac Pro shows frame rates are not, and have never really been, on their minds, excepting perhaps video editing at 4K, 6K and 8K+ resolutions.

Supposedly Otoy are working on a Metal version of Octane Render that will take full advantage of those insane 2x2 GPU Mac Pros, although it has yet to materialise, much like the previously promised but cancelled Vulkan version and the very short-lived OpenCL version (if you can't read my fatalism here you aren't looking very hard).

In short, if they pursue a high-end dGPU it's for pro/workstation work, though that was always the point of the Mac Pro anyway.
 
Last edited: