
Solved! ARM Apple High-End CPU - Intel replacement

Page 61 - AnandTech Forums

CluelessOne

Member
Jun 19, 2015
64
40
91
Apple can also license designs for USB, WiFi, and BT and paste them into their chip. They only need to write drivers for them. Why bother making commodity things?
 

Eug

Lifer
Mar 11, 2000
22,826
368
126
Some new Apple battery models have shown up on Asian certification platforms.


Despite some of the new iPhones being rumoured to have larger screens, the leak states they are getting smaller batteries.

For example the 6.5” iPhone 11 Pro Max has a battery capacity of 3969 mAh, but it appears to be 3687 mAh for the rumoured 6.7” iPhone 12 Pro Max.

If true, it would suggest, not surprisingly, that 5 nm along with the new design has significantly improved the efficiency of the low-power cores. I'd expect Apple to chase such improvements in the Macs too, both for aesthetics and for fan-noise reduction.

OTOH, it could be related to other factors such as the new 7 nm Qualcomm modem. Probably multiple factors.
 

name99

Senior member
Sep 11, 2010
240
220
116
Apple can also license designs for USB, WiFi, and BT and paste them into their chip. They only need to write drivers for them. Why bother making commodity things?
The fact that something is standardized doesn't mean that generic implementations are particularly good. Among the issues are
- the spec only specifies the interoperability parts. So, eg, it will specify the form of ECC that is used, but will not specify the algorithm used to correct a bitstream with errors in it. These algorithms vary widely (all that stuff about Viterbi, soft vs hard decoding, log likelihoods, ...)
Same for extracting info from the MIMO stream.

- the generic implementations may flat out have bugs. Anyone who uses cheap IoT hardware knows that it seems to randomly lose its connection to the host system a whole lot more than does your phone (at least if you have a quality phone)

- the generic implementation probably does a lousy job of saving power, and of security

- with a non-generic implementation Apple can make additions that may not be part of the spec, or may be imaginative uses of the spec that no-one else bothers to support. Aspects of Airpods show this (faster connection times, easy transfer between devices).
A full spec like 802.11ax has a huge amount of stuff, much of which is considered optional. That's why you then get followup semi-standards like WiFi6 which are not exactly the same thing as 802.11ax, more like a subset that the biggest players all agree on as "this is the minimum we will all support for interop". But Apple may look at something in the spec beyond that minimum and feel it wants to use it.

WiFi and BT I think are very much in the category where Apple believes they can add value with their own implementations; it's just been a question of priorities in getting there.
USB I know less about at the lowest level. Maybe that's handled well enough (including performance, power, security) generically that Apple can avoid having any reason for their own?
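The ECC point is worth a concrete illustration: the spec fixes the code so receivers interoperate, but the decoding algorithm is the implementer's secret sauce. Here's a toy sketch (a 3x repetition code, purely illustrative, not from any actual 802.11 spec) where a hard-decision and a soft-decision decoder disagree on the same received samples:

```python
# Toy illustration: the "spec" fixes the code (3x repetition here),
# but NOT how a receiver decodes noisy samples. Two compliant
# implementations can differ in how often they recover the bit.

def encode(bit):
    # The interoperable part: fixed by the spec.
    return [bit] * 3

def hard_decode(samples):
    # Hard decision: threshold each sample to 0/1 first, then majority-vote.
    bits = [1 if s > 0.5 else 0 for s in samples]
    return 1 if sum(bits) >= 2 else 0

def soft_decode(samples):
    # Soft decision: pool the raw analog evidence before deciding.
    return 1 if sum(samples) > 1.5 else 0

# Bit 1 was sent; noise pushed two samples just under the 0/1 threshold.
received = [0.45, 0.45, 0.9]
print(hard_decode(received))  # 0 -- the hard decoder gets it wrong
print(soft_decode(received))  # 1 -- the soft decoder recovers the bit
```

Real WiFi uses convolutional/LDPC codes with Viterbi or belief-propagation decoders rather than this toy code, but the asymmetry is the same: the better the decoder, the better the throughput at a given signal quality, and none of that is mandated by the standard.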
 

Eug

Lifer
Mar 11, 2000
22,826
368
126
WiFi and BT I think are very much in the category where Apple believes they can add value with their own implementations; it's just been a question of priorities in getting there.
USB I know less about at the lowest level. Maybe that's handled well enough (including performance, power, security) generically that Apple can avoid having any reason for their own?
My Apple devices do seamless handoff with my 5 unit AirPort Extreme WiFi network. It's awesome. Seamless handoff doesn't work for my non-Apple devices, so it's not a true mesh setup, but that's not an issue in my house since my non-Apple devices (media players, TVs, etc.) are stationary anyway*.


Unfortunately, Apple has given up on selling WiFi routers completely, which makes me wonder if some of that apathy about WiFi routers carries over into their other products with WiFi. Maybe they feel that because Broadcom does an excellent job and doesn't have the outrageous contracts that Qualcomm supposedly is famous for, it isn't worth pursuing all the IP required to make their own better modern WiFi silicon.

*There is an exception to the rule. I have an old 2008 MacBook that is too old to run modern versions of macOS, which makes it relatively useless as a Mac. So I installed Ubuntu on it, and later installed Chrome OS. I guess this shouldn't come as a surprise, but interestingly, seamless handoff works fine in OS X on it, yet it doesn't work under Ubuntu (or other Linux distributions) or Chrome OS.
 

Doug S

Senior member
Feb 8, 2020
321
445
96
OTOH, it could be related to other factors such as the new 7 nm Qualcomm modem. Probably multiple factors.
The display takes up a lot of power, if that has improved efficiency it would help too.

Alternatively, maybe Apple doesn't care whether they are among the top phones in battery life like the 11 was, and will be willing to reduce the battery size/life a bit to make the 12 thinner/lighter - iPhones have been very slowly getting thicker and heavier year by year since the iPhone 6 generation reached the minimum thickness. Not by a lot, we're talking like 1/20th of an inch, but still.
 

awesomedeluxe

Member
Feb 12, 2020
53
16
41
Die size is the least of my concerns. The issue is the NRE money, not the per-square-mm money.

But your analysis is too simplistic.
What's your story for DRAM? Anything above a MacBook will want at least two memory channels, not the iPad's single (128-bit-wide) channel.
What's your story for IO (i.e. all those USB and Thunderbolt ports, HDMI, ethernet, ...)? A14X die? Separate chiplet? Separate chip?
What's your story for GPU? You can grow the logic parts of the GPU "fairly easily" (hah), but at some point you have to deal with the fact that you are in two very different regimes from an iPad: thermal, and memory bandwidth.

The issue is not that these are complicated challenges that Apple has no knowledge of how to solve; that's a stupid uninteresting claim. The issue is that these require different sorts of technologies from what's appropriate for an iPhone/iPad, and that's what the discussion is about for the engineers in this forum.

Geekbench 5 is not ideal as a GPU metric, but as a starting point its Metal benchmark has, at the high end, the AMD Radeon Pro Vega II Duo at 97,000 and the A12X at 9,105. This is not a smear on the A12X; it's a recognition that if Apple wants to match and exceed that AMD number, it will probably need some aspects of the technology AMD uses to get there. That includes a vastly larger power budget -- maybe not AMD's almost 500 W, but even if it's, say, 120 W, that implies something very different from iPad packaging. Likewise it will need some sort of comparable high-memory-bandwidth technology, something like GDDR or HBM2.
Point is you can't really get there just by tying 4 iPad SoCs together.

How far CAN you get? Well, the best iMac right now has the AMD Radeon Pro 580X. If we say the target for this year is to get to the iMac, then we need to match its 42,000. MAYBE you could get there with two iPad SoCs, each running a new A14X GPU a little over twice as fast as the A12X. Lots of hopes there that are not completely impossible, but very much on the side of unlikely. And that's the 2019 iMac. There's apparently going to be a new one this year before the ARM iMac; and Apple will want the ARM iMac to substantially beat the Intel iMac, not to be "kinda sorta the same, better in some benchmarks, worse in others"....
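For what it's worth, the arithmetic above checks out (scores as quoted in this post; perfectly linear scaling is assumed, which is generous):

```python
# Geekbench 5 Metal scores quoted above (linear scaling assumed).
vega_ii_duo = 97_000  # AMD Radeon Pro Vega II Duo
a12x = 9_105          # Apple A12X
pro_580x = 42_000     # AMD Radeon Pro 580X (2019 iMac)

# Speedup needed over the A12X GPU to match the top-end AMD part:
print(round(vega_ii_duo / a12x, 1))   # ~10.7x

# With two SoCs splitting the iMac target, each GPU needs:
print(round(pro_580x / 2 / a12x, 2))  # ~2.31x the A12X
```

That is, "a little over twice as fast" per SoC suffices for the iMac target, but the Mac Pro class part needs more than 10x.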

So THAT's what this is about.
Once you accept some baseline realities:
- two memory channels for the midrange, at least 4 as you move up
- lotsa IO
- beefy GPUs
then just gluing together iPad SoCs becomes sub-optimal. Gluing together enough to hit GPU goals means you have a lot of extra ISPs and media decoders and enclaves and suchlike just wasted on your motherboard.

So what's the alternative? Well that's why you get into questions of perhaps a dedicated Mac SoC? Perhaps chiplets? Perhaps daughterboards?
Certainly gluing together lots of iPad SoCs COULD be made to work, probably well enough to still beat Intel, and still cheaply enough not to be a problem. But it wouldn't be engineering-optimal. So that's the question -- is something closer to engineering-optimal cheap enough to be feasible? That's what my numbers were all about.
You put a lot of effort into your analysis but just wrote off what I said as "gluing a bunch of iPad SoCs together and shoving them in an iMac." I never suggested gluing iPad SoCs together in any context. The only thing I said about the iMac is that it might use a different SoC than the iPad. I'm happy to discuss ideas with you, especially because you seem pretty knowledgeable about computer engineering, but please pay more attention to what I am actually sharing.

I'll explain what I meant in more detail, and address memory and I/O on the way. I'd appreciate your thoughts.

Take the "A14Z" APU and give it an optional pin-out. Obviously, the iPad is not going to use that. I'd probably have the iPad's memory layered on, and just not do that for machines using the pin-out. Now you can put that same APU in, say, the MacBook 16. Here's what that looks like.

In the MBP, the APU connects to one side of an I/O die. The die has 32GB of HBM2E. On the other side, the die connects to an Apple GPU.

The Apple GPU would have something like 32 cores. They lean into efficiency as much as possible; I don't see them clocked higher than they are in the iPad right now. The gap between the A12X and the 5600M is small enough that multiplying the number of cores by 4.5 might cover it.

The big idea is that all three of these dies go in the same package. So TSMC manufactures a jillion A14Zs with slightly different configurations, and some of them go to the iPad and some go onto a different package. What I can't tell you is what kind of high-speed interposer connects these dies; I'm just kind of assuming Apple can figure that out.

The only thing I suggested for the iMac and Mac Pro is that they would use a totally different die, which could be an octocore Firestorm part without the extra baggage, and Apple would just surround an I/O die with however many of those it deems appropriate. I'm not really that attached to that idea, but that's what I meant: the iMac and Mac Pro could use a Threadripper-style part.
 

soresu

Golden Member
Dec 19, 2014
1,411
584
136
A full spec like 802.11ax has a huge amount of stuff, much of which is considered optional. That's why you then get followup semi-standards like WiFi6 which are not exactly the same thing as 802.11ax, more like a subset that the biggest players all agree on as "this is the minimum we will all support for interop". But Apple may look at something in the spec beyond that minimum and feel it wants to use it.
At this point 11ax is already pretty capable of supporting a full landline broadband connection with just a single stream (1100-1200 Mbps) for all but the most hardcore fiber customers.

Anything beyond that (i.e. 2x2 MIMO) is just going to drain the battery, even when using it in the same room as the router/access point.

By the time true FTTH fiber broadband becomes commonplace we will have the 11be/WiFi 7 standard, which will probably give 3-4x the bandwidth per stream of 11ax.

Not that there are really many uses for such insanely high download speeds mind you, excepting perhaps something like wireless VR connections using something like VESA DSC to conserve bandwidth.

I'm still waiting on the promise from years ago of 'passive WiFi' using backscatter of ambient signals for ultra low power transmission.

With that, even an ancient 11b modem could field full lossless high-res audio (18 Mbps) to wireless headphones at very low power (microwatts), rather than the lossy compressed audio codecs used for Bluetooth and BLE audio standards up till now.

Even aside from uses in wireless headphones it would be a huge power consumption win for mobile devices in WiFi environments.
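For context on the 18 Mbps figure: it's about the bitrate of uncompressed stereo PCM at an extreme sample rate (my assumption about which format was meant; the calculation itself is just arithmetic):

```python
def pcm_mbps(bit_depth, sample_rate_khz, channels):
    # Uncompressed PCM bitrate, ignoring container/protocol overhead.
    return bit_depth * sample_rate_khz * 1000 * channels / 1e6

print(round(pcm_mbps(24, 384, 2), 1))  # ~18.4 Mbps: 24-bit/384 kHz stereo
print(round(pcm_mbps(24, 96, 2), 1))   # ~4.6 Mbps: more typical "hi-res"
```

Even the lower figure is well beyond what Bluetooth audio codecs typically carry, which is the point about needing a higher-bandwidth, low-power link.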
 

Eug

Lifer
Mar 11, 2000
22,826
368
126
Not that there are really many uses for such insanely high download speeds mind you, excepting perhaps something like wireless VR connections using something like VESA DSC to conserve bandwidth.
It's nice having cloud storage speeds that are as fast as your internal Gigabit network at home, although truthfully these days my big concern is upload. I already have an insane download speed, but I don't have such an insane upload speed.

I'm now considering switching back from my current cable connection - 1000 Mbps down / 30 Mbps up - to FTTH, since they are now offering near-symmetric upload and download speeds. When I left FTTH it was 940 Mbps down and 100 Mbps up. Now they are offering 1500 down / 940 up, although I'd probably get something like 1000/750 or else 500/500 to save a few bucks, esp. since beyond 500 Mbps there are a lot of other bottlenecks, including WiFi speeds and the fact that my routers are speed-limited in PPPoE mode (which is what my FTTH provider uses). Furthermore, some of my older machines are not fast enough to achieve Gigabit download speeds, esp. if I have anti-virus active.

But the main point is that I have 2 TB of cloud storage, and I tell ya, doing large backups to the cloud at 30 Mbps is really, really painful. Internet service at 500/500 would be great and would actually be quite useful in this context.
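To put numbers on "really, really painful" (idealized: decimal units, no protocol overhead):

```python
def transfer_hours(size_tb, mbps):
    # 1 TB = 1e12 bytes; ignores TCP/encryption overhead, so real
    # uploads are somewhat slower than this.
    bits = size_tb * 1e12 * 8
    return bits / (mbps * 1e6) / 3600

print(round(transfer_hours(2, 30), 1))   # ~148.1 hours (about 6 days)
print(round(transfer_hours(2, 500), 1))  # ~8.9 hours
```

A full 2 TB backup goes from roughly a week down to an overnight job.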
 

soresu

Golden Member
Dec 19, 2014
1,411
584
136
I already have an insane download speed, but I don't have such an insane upload speed.
Future DOCSIS cable iterations will have symmetric multi gigabit speeds.

FTTH is obviously the ideal and all, but it is unlikely to be deployed nearly so quickly if it is not already piped into something in your area.

I agree though, upload speeds have lagged far too much for landline connections.
 

name99

Senior member
Sep 11, 2010
240
220
116
You put a lot of effort into your analysis but just wrote off what I said as "gluing a bunch of iPad SoCs together and shoving them in an iMac." I never suggested gluing iPad SoCs together in any context. The only thing I said about the iMac is that it might use a different SoC than the iPad. I'm happy to discuss ideas with you, especially because you seem pretty knowledgeable about computer engineering, but please pay more attention to what I am actually sharing.
Fair enough. There are lots of people in this thread and one sometimes is replying (at least conceptually) to more than one of them at once.
In particular there have been people saying everything from Apple will use iPhone SoCs in MacBooks up to people who can't see a problem with just slapping 8 A14X's on a PCB to power a Mac Pro.

I'll explain what I meant in more detail, and address memory and I/O on the way. I'd appreciate your thoughts.

Take the "A14Z" APU and give it an optional pin out. Obviously, the iPad is not going to use that. I'd probably have the iPad's memory layered on, and just not do that for machines using the pin out. Now you can put that same APU in say, the Macbook 16. Here's what that looks like.

In the MBP, the APU connects to one side of an I/O die. The die has 32GB of HBM2E. On the other side, the die connects to an Apple GPU.

The Apple GPU would have something like 32 cores. They lean into efficiency as much as possible; I don't see them clocked higher than they are in the iPad right now. The gap between the A12X and the 5600M is small enough that multiplying the number of cores by 4.5 might cover it.

The big idea is that all three of these dies go in the same package. So TSMC manufactures a jillion A14Zs with slightly different configurations, and some of them go to the iPad and some go onto a different package. What I can't tell you is what kind of high-speed interposer connects these dies; I'm just kind of assuming Apple can figure that out.

The only thing I suggested for the iMac and Mac Pro is that they would use a totally different die, which could be an octocore Firestorm part without the extra baggage, and Apple would just surround an I/O die with however many of those it deems appropriate. I'm not really that attached to that idea, but that's what I meant: the iMac and Mac Pro could use a Threadripper-style part.
I'd classify that as a chiplet option, and I'd agree it's the highest-likelihood item on my list of "medium-term" solutions. But will we see it this year? That's where we might differ, because my expectation is that this year we get laptops through the mini up to the iMac, but not iMac Pro and Mac Pro levels. This is kind of a strange learning year where the priority is getting it right rather than getting it as cheap as possible. In that context, does it make sense to throw in the additional complication of chiplets, rather than sticking with known technology (albeit perhaps expanded to a rather larger die, perhaps even up to 250 mm^2 or so) for this round of learning?

Perhaps where we differ most strenuously (or perhaps where we each misunderstood each other) is that I don't see iMac Pro and Mac Pro happening this year, so for them I think we have even less on which to base our predictions; we can recalibrate for next year once we see what ships this year.
 

Eug

Lifer
Mar 11, 2000
22,826
368
126
Future DOCSIS cable iterations will have symmetric multi gigabit speeds.

FTTH is obviously the ideal and all, but is unlikely to be deployed nearly so quick if it is not already piped into something in your area.

I agree though, upload speeds have lagged far too much for landline connections.
FTTH is all over our city (Toronto). Most of the new neighbourhoods get it, as do the newer condos. Most of the older neighbourhoods also get it because they're old enough to have telephone poles for electricity, so they run fibre along them too. That's why my entire area has FTTH access, despite being built 70 years ago. The ones left out are those with the not too recent but not too old homes, like those built in the 1980s, since they all had buried lines and no telephone poles. Those places that don't get it have FTTN.

Anyhow, I've been pleasantly surprised at just how quick these cloud solutions have become commonplace. Apple makes a fair amount off their various services now, and I've been suckered into getting a 2 TB iCloud plan, so my 45000 pictures and videos are all synced across my various devices - iPhone, iPad Pro, iMac, MacBook, as are my important documents, without using much local storage (except on my iMac, where I keep a local copy).

I used to pine for better iPad Pro external USB file support, but it's becoming less critical these days. I still want it though, since sneaker net is still important for some people.

Also, to bring things back on topic, I think probably for most of the entry level users, integration of features such as this can be much more enticing than a faster A14 series CPU. For example, I recently sold a 2019 MacBook Air (that we won in a raffle - yay!) with just 128 GB storage. The buyer was initially a little concerned about the storage, but iCloud solved that completely. CPU speed wasn't a consideration at all, since even that old entry level MacBook Air these days handles all mainstream business type apps just fine, and even does light 4K video editing OK.

That 2019 machine had a 1.6 GHz Core i5-8210Y. Its Geekbench 5 score is about 840/1750, which is only about 10% faster than my 2017 Core m3-7Y32, and is actually slower (if you can compare cross-platform) than my 2017 A10X iPad Pro and 2017 A11 iPhone 8. And I actually bought an A10 iPad in 2019 for my wife, and she loves it. I suspect A10 performance will not be an issue for her for several years.
 

name99

Senior member
Sep 11, 2010
240
220
116
At this point 11ax is already pretty capable of supporting a full landline broadband connection with just a single stream (1100-1200 mbps) for all but the most hardcore fiber customers.

Anything beyond that (ie MIMO 2x2) is just going to drain the battery even when using it in the same room as the router/access point.

By the time true FTTH fiber broadband becomes commonplace we will have the 11be/WiFi 7 standard, which will probably give 3-4x the bandwidth per stream of 11ax.

Not that there are really many uses for such insanely high download speeds mind you, excepting perhaps something like wireless VR connections using something like VESA DSC to conserve bandwidth.

I'm still waiting on the promise from years ago of 'passive WiFi' using backscatter of ambient signals for ultra low power transmission.

With that even an ancient 11b modem could field full lossless high res audio (18 mbps) to wireless headphones at very low power (microwatts) rather than lossy compressed audio codecs used for Bluetooth and BLE audio standards up till now.

Even aside from uses in wireless headphones it would be a huge power consumption win for mobile devices in WiFi environments.
People use connectivity for more than just internet...
Apple was shipping GigE across the line way before you could get gig Internet. They're now shipping 10G ethernet across some of the line.

I have told you this repeatedly, but you insist on ignoring it. Apple is NOT just another PC company. Every time you analyze Apple, you end up describing what PC companies do and insisting Apple will do the same, and that's not a good methodology for understanding them.
You imagine that Apple is motivated by money rather than by making better hardware. (Sure they want to make money, but they believe that the way to do that is to produce better hardware, not cheaper lousier just good-enough hardware.)
You imagine that Apple will survey the compute landscape and conclude that cores, or connections, are fast enough, so no need to put any effort into that. Wrong!

Meanwhile I look at Apple and imagine that they are probably asking "Where's the damn 802.11ay silicon? Is this ANOTHER damn spec we will have to implement ourselves because the rest of the industry doesn't have the imagination to see why this could be useful?"
(cf comments about Apple Lidar or Apple UWB)
 

name99

Senior member
Sep 11, 2010
240
220
116
My Apple devices do seamless handoff with my 5 unit AirPort Extreme WiFi network. It's awesome. Seamless handoff doesn't work for my non-Apple devices, so it's not a true mesh setup, but that's not an issue in my house since my non-Apple devices (media players, TVs, etc.) are stationary anyway*.


Unfortunately, Apple has given up on selling WiFi routers completely, which makes me wonder if some of that apathy about WiFi routers carries over into their other products with WiFi. Maybe they feel that because Broadcom does an excellent job and doesn't have the outrageous contracts that Qualcomm supposedly is famous for, it isn't worth pursuing all the IP required to make their own better modern WiFi silicon.

*There is an exception to the rule. I have an old 2008 MacBook that is too old to run modern versions of macOS, thereby making it relatively useless as a Mac. So I installed Ubuntu on it, and then later installed Chrome OS on it. I guess this shouldn't come as a surprise, but interestingly, seamless handoff works fine in OS X on it, but it doesn't work on either Ubuntu (or other Linux versions) or Chrome.
I get what you are saying, but there's a difference between a WiFi chip (and the capabilities it gives you) and a WiFi base station. A WiFi base station is more of a commodity thing, though, true, Apple can (and did) add Apple specific functionality -- stuff like their early audio out to AirPort Express, or the USB to a printer/backup drive are further examples of your point.
(I recently tried an 802.11ax Nighthawk, which I returned because it's a piece of garbage along multiple dimensions, but I was amazed that it still couldn't get something I'd consider trivial by now, namely the attached USB disk, to work properly.)

It's possible that Apple thought WiFi router functionality was well enough handled by the market that there was little customer benefit to continuing to spend effort there (cf., e.g., them getting out of the printer business, or their short-lived cameras). They got out of displays, then returned when they concluded that, at least at the high end, what the market offered wasn't good enough; they could likewise return to routers. But my guess is that they consider a company like Amplifi to be selling something good enough, with a close enough Apple experience, that it's not worth doing?

BTW, lucky you that all your AirPort base stations still work! I have enough of them that I could do something like that, but unfortunately they've all died in one aspect of functionality or another! This one has 2.4 GHz not working, that one has 5 GHz not working, this other one has all the ethernet ports not working, ...
 

soresu

Golden Member
Dec 19, 2014
1,411
584
136
since sneaker net is still important for some people.
Que?
Anyhow, I've been pleasantly surprised at just how quick these cloud solutions have become commonplace. Apple makes a fair amount off their various services now, and I've been suckered into getting a 2 TB iCloud plan, so my 45000 pictures and videos are all synced across my various devices - iPhone, iPad Pro, iMac, MacBook, as are my important documents, without using much local storage (except on my iMac, where I keep a local copy).
A 2TB SSD is actually pretty cheap nowadays (relatively speaking compared to a few years ago).

Even 4TB 2.5-inch SATA SSDs are in the range of what I would call affordable now - I'd get more use out of one than a £400-500 gfx card anyway.

Unless you have serious concerns about backups and data security, I don't see the point of relying on cloud storage - it's all well and good if you are at home with that potential FTTH connection, but as soon as you go on the move you are hit by mobile data caps and/or the dross of public WiFi.

The concept is sound; it's just the mobile access that sucks - that's the fault of the life-sucking telecom giants, though. I don't even see the point in having 5G when 4G can easily blow through your mobile plan's data cap in minutes, to be honest.
 

Eug

Lifer
Mar 11, 2000
22,826
368
126
Sneaker net, as in colleagues walking over from their office to mine with a USB drive in hand.

Given that unless you have serious concerns about backups and data security I don't see the point of relying on cloud storage - it's all well and good if you are at home with that potential FTTH connection, but as soon as you go on the move you are hit by mobile data caps and/or the dross of public wifi.

The concept is sound it's just the mobile access that sucks - that's the fault of the life sucking telecomm's giants though, I don't even see the point in having 5G when 4G can easily break your mobile plans data cap in minutes to be honest.
I have multiple different types of backups, and I now have complete cloud support added to the mix.

My reasoning to get 5G is to hopefully get better consistency with decent speeds in congested areas. I'm not so concerned about maximum speeds. Plus, I keep my devices for a long time. If I buy a 5G device in 2020, I will still likely have that same device in 2023.

My cellular plan is 15 GB plus extra per month, which should suffice for my usage.
 

soresu

Golden Member
Dec 19, 2014
1,411
584
136
My cellular plan is 15 GB plus extra per month, which should suffice for my usage.
This is precisely why I hate mobile service providers; I have friends who regularly use 1 TB+ a month on their landlines with nary a peep from the ISP.
 

LightningZ71

Senior member
Mar 10, 2017
496
411
106
I hate to break it to those of you who are patiently waiting for FTTH, but unless you live in a sparsely populated area, it looks like some of the big ISPs are halting their fiber roll-outs and choosing to use a wireless solution. To give you an example, AT&T was laying fiber in my city and the suburbs. About a year ago, they halted the project, with dark fiber strung up and tunneled in several neighborhoods. I met one of the contractors who was working on the project, and he told me that they have decided to abandon FTTH and are instead moving to a neighborhood "5G"-based solution, with wireless access points at every other corner or so and customers renting wireless modems. This reduces their fixed maintenance costs, as they only have to maintain 5% of the fiber plant (only the neighborhood haul to the WAPs) and don't have to handle the last mile to the home. I had this later confirmed by an actual AT&T tech who was doing a commercial install at a building I support.

As for Apple, and the point of this thread, I agree that we likely won't see any "Pro" products before the third quarter of next year at the earliest. Anything below that level can be handled by whatever the A14 looks like, be it the base 2+4 if they retain that, or a 4+4 for a X/Z edition.
 

awesomedeluxe

Member
Feb 12, 2020
53
16
41
I'd classify that as a chiplet option, and I'd agree it's the highest-likelihood item on my list of "medium-term" solutions. But will we see it this year? That's where we might differ, because my expectation is that this year we get laptops through the mini up to the iMac, but not iMac Pro and Mac Pro levels. This is kind of a strange learning year where the priority is getting it right rather than getting it as cheap as possible. In that context, does it make sense to throw in the additional complication of chiplets, rather than sticking with known technology (albeit perhaps expanded to a rather larger die, perhaps even up to 250 mm^2 or so) for this round of learning?

Perhaps where we differ most strenuously (or perhaps where we each misunderstood each other) is that I don't see iMac Pro and Mac Pro happening this year, so for them I think we have even less on which to base our predictions; we can recalibrate for next year once we see what ships this year.
I actually don't see Apple getting a lot of ARM machines out in 2020, including either of the models you mentioned in the last paragraph. I suspect Apple will be running up against TSMC's capacity because I think they want all of this on N5 or better and N5 will be very busy making A14 chips for the iPhone 12. I'm open to lots of ideas, but if I were a betting man I'd put my chips (hah) on the MBP13 and the Mac Mini.

I think they will both use the 8+4 design described by Bloomberg, with 16 graphics cores and a pin out for memory. We'll call it the A14Z. I'd guess the A14Z and the A14 are the only SoCs this year.

The new MBP13 would not require a chassis overhaul, but given that one is overdue, I wouldn't consider a slimmer design out of the question. Perf cores can clock up to ~3 GHz. A pin-out to LPDDR5 is all that is necessary for memory. This could trounce Tiger Lake by large enough margins to draw tech-press headlines - better multicore, better single-core, better graphics.

The Mac Mini can use the same part. Cores are tuned up until they're using nearly 4 W/core fully loaded, and of course it's hooked up to high-speed DDR5. Superior multicore performance to its hexacore predecessor, and the best graphics performance the line has seen by a long shot. Single-core is a question mark; if you buy this benchmark, things will be fine, but I'm still a little worried about how Firestorm cores will scale up against things with 5 GHz boost clocks.

I hope we will see chiplet designs in 2021 starting on N5P. That would be awesome, but maybe too optimistic. The 16" is probably the best candidate. I'll be a little disappointed if it is using an AMD graphics solution. The iMac could go either way, and I wouldn't be surprised if it has AMD Inside. Taking on AMD's Radeon Pro 580X successor with a chiplet design is a little ambitious.
 

Eug

Lifer
Mar 11, 2000
As for Apple, and the point of this thread, I agree that we likely won't see any "Pro" products before the third quarter of next year at the earliest. Anything below that level can be handled by whatever the A14 looks like, be it the base 2+4 if they retain that, or a 4+4 for an X/Z edition.
I guess it depends on what you mean by "Pro", but analyst extraordinaire claims the first Arm Mac will be a 13" MacBook Pro, with the MacBook Air to follow. He claims that production of the MBP will begin Q4 2020, with the MacBook Air to follow shortly after with production beginning Q4 2020 or Q1 2021. Note however, Apple often doesn't release a product until several months after production starts, for obvious reasons.

I think a 4+4 MBP would be fine at the 13" (or 14"?) size, and I'm personally predicting the MacBook Air (or MacBook?) will be 2+4. I do think higher end iMacs and higher end MacBook Pros will be 6+4 and/or 8+4 though.
 

awesomedeluxe

Member
Feb 12, 2020
I guess it depends on what you mean by "Pro", but analyst extraordinaire claims the first Arm Mac will be a 13" MacBook Pro, with the MacBook Air to follow. He claims that production of the MBP will begin Q4 2020, with the MacBook Air to follow shortly after with production beginning Q4 2020 or Q1 2021. Note however, Apple often doesn't release a product until several months after production starts, for obvious reasons.

I think a 4+4 MBP would be fine at the 13" (or 14"?) size, and I'm personally predicting the MacBook Air (or MacBook?) will be 2+4. I do think higher end iMacs and higher end MacBook Pros will be 6+4 and/or 8+4 though.
I appreciate tempered expectations but isn't this a little too conservative? The iPad has four performance cores... why would the Air only use 2? Not to mention the current Air is available in a four-core configuration...? No one is going to accept Icestorm cores as a substitute for Sunny Cove cores.

The MBP13 also has a 28W SoC. These cores don't need more than 2W apiece, and it's not like you need to be using all of them all of the time.
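A minimal sketch of that claim, assuming the post's ~2 W-per-core figure and a nominal allowance for the efficiency cores (both guesses, not measurements):

```python
# Does an 8-perf-core part fit the MBP13's existing 28 W SoC budget?
BUDGET_W = 28.0          # current MBP13 SoC power budget
perf_w = 8 * 2.0         # 8 performance cores at ~2 W apiece (guess)
eff_w = 4 * 0.5          # 4 efficiency cores (assumed allowance)

headroom_w = BUDGET_W - perf_w - eff_w
print(f"Left for GPU/uncore: {headroom_w:.0f} W")  # Left for GPU/uncore: 10 W
```

Under those assumptions even an all-core load leaves double-digit watts for everything else on the die.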
 

Eug

Lifer
Mar 11, 2000
I appreciate tempered expectations but isn't this a little too conservative? The iPad has four performance cores... why would the Air only use 2?
Nitpick, but the iPad only has 2 performance cores. It's the iPad Pro that has 4 performance cores.

Not to mention the current Air is available in a four-core configuration...? No one is going to accept Icestorm cores as a substitute for Sunny Cove cores.
Well, it depends on who your customers are. MacBook Air customers aren't typically performance junkies, and most of them don't even know what CPUs are in their machines.

Plus, if the A14 core count followed that of the A13, it would actually have 6 cores, not 2. Yeah, only 2 of them are performance cores, but if Apple decides core count actually matters for marketing purposes, they'll market it appropriately.

Don't get me wrong. I would love to see Macs start at iPad Pro speed, but I am just not optimistic that will happen. Just as importantly, I don't think there is really any necessity for it to happen from a performance perspective.

Going back to my first statement, the iPad sells just fine with a two-performance-core part, even for the mid-tier version, the iPad Air. A12 performance is quite decent for that category, and A14 would be much, much faster. And as mentioned previously, my wife is quite pleased with the performance of her A10 iPad. I bought her that machine in 2019, despite its SoC being a 2016 part; I waited that long because at 3 GB, it has 50% more RAM than its 2017 and 2018 predecessors. The SoC speed is of secondary concern IMO.

The other possibility of course is Apple could sell older parts. Maybe you are right and they will use a 4 performance core SoC in the MacBook/MacBook Air. However, that SoC could be A12Z, but they would put A14X in the lower tier 13/14" MacBook Pros.
 

awesomedeluxe

Member
Feb 12, 2020
Nitpick, but the iPad only has 2 performance cores. It's the iPad Pro that has 4 performance cores.


Well, it depends on who your customers are. MacBook Air customers aren't typically performance junkies, and most of them don't even know what CPUs are in their machines.

Plus, if the A14 core count followed that of the A13, it would actually have 6 cores, not 2. Yeah, only 2 of them are performance cores, but if Apple decides core count actually matters for marketing purposes, they'll market it appropriately.

Don't get me wrong. I would love to see Macs start at iPad Pro speed, but I am just not optimistic that will happen. Just as importantly, I don't think there is really any necessity for it to happen from a performance perspective.

Going back to my first statement, the iPad sells just fine with a two-performance-core part, even for the mid-tier version, the iPad Air. A12 performance is quite decent for that category, and A14 would be much, much faster. And as mentioned previously, my wife is quite pleased with the performance of her A10 iPad. I bought her that machine in 2019, despite its SoC being a 2016 part; I waited that long because at 3 GB, it has 50% more RAM than its 2017 and 2018 predecessors. The SoC speed is of secondary concern IMO.

The other possibility of course is Apple could sell older parts. Maybe you are right and they will use a 4 performance core SoC in the MacBook/MacBook Air. However, that SoC could be A12Z, but they would put A14X in the lower tier 13/14" MacBook Pros.
Oh man, an A12Z!! Gracious. I'm hoping you wind up pleasantly surprised by what Apple rolls out instead of me being cripplingly disappointed.

IMO, Apple will want to update the iPad Pro 1H 2022 with a new 4+4-and-8 SoC anyway. So why not use one they develop for the Air in 2021? Hell, I don't even think you need an optional pin out for the Air's SoC. If the Galaxy S20 can layer on up to 12GB of LPDDR5, I'm sure this theoretical A14X could do the same.

And this is appropriate for the price. The 11" and 12.9" iPad Pros start at $800 and $1,000 respectively; the Air starts at $1,000. The two-core iPad you brought up starts at $400.

Now, I don't disagree that an A14 would be viable. It won't get slaughtered by Tiger Lake, but it won't be a triumphant success story either. And if AMD is shipping a Renoir U successor in any kind of volume, well, good night.

Why take the risk when you could be dominating the headlines with stories of how your A14X takes the competition to the cleaners? Isn't that what you want every headline for your new machines to say, every year, forever and ever, just like the iPhone? That's easily achievable, and it's good business to be the best.

You'd still be bringing home a hefty margin on a $1,000 machine. It's not like your A14X is costing more per part than Intel wanted, and you're perfectly primed to reuse your 4+4 SoC in two other products the following year.
 

Eug

Lifer
Mar 11, 2000
Oh man, an A12Z!! Gracious. I'm hoping you wind up pleasantly surprised by what Apple rolls out instead of me being cripplingly disappointed.

IMO, Apple will want to update the iPad Pro 1H 2022 with a new 4+4-and-8 SoC anyway. So why not use one they develop for the Air in 2021? Hell, I don't even think you need an optional pin out for the Air's SoC. If the Galaxy S20 can layer on up to 12GB of LPDDR5, I'm sure this theoretical A14X could do the same.

And this is appropriate for the price. The 11" and 12.9" iPad Pros start at $800 and $1,000 respectively; the Air starts at $1,000. The two-core iPad you brought up starts at $400.

Now, I don't disagree that an A14 would be viable. It won't get slaughtered by Tiger Lake, but it won't be a triumphant success story either. And if AMD is shipping a Renoir U successor in any kind of volume, well, good night.

Why take the risk when you could be dominating the headlines with stories of how your A14X takes the competition to the cleaners? Isn't that what you want every headline for your new machines to say, every year, forever and ever, just like the iPhone? That's easily achievable, and it's good business to be the best.

You'd still be bringing home a hefty margin on a $1,000 machine. It's not like your A14X is costing more per part than Intel wanted, and you're perfectly primed to reuse your 4+4 SoC in two other products the following year.
It was mentioned earlier in the thread but the iPad Pro starts with 128 GB (and will likely have the same 128 GB in the next iteration), and does not include a keyboard or trackpad. That Magic Keyboard costs $300-350. Also, the $800 model is only 11". The more comparable size is the 12.9" model.

The MacBook Air comes with not only an integrated keyboard and trackpad, it also comes with 256 GB storage, and the latest model even includes a True Tone screen and 2 Thunderbolt ports. And if you upgrade to quad-core, you also get 512 GB storage. So:

MacBook Air dual-core i3
256 GB storage
Retina screen with True Tone
8 GB RAM
Magic Keyboard with Force Touch trackpad
Touch ID
Two Thunderbolt 3 ports
US$999

MacBook Air quad-core i5
512 GB storage
Retina screen with True Tone
8 GB RAM
Magic Keyboard with Force Touch trackpad
Touch ID
Two Thunderbolt 3 ports
US$1299

iPad Pro 12.9"
256 GB storage
Retina screen with True Tone and touch support
6 GB RAM
Magic Keyboard with trackpad (not Force Touch)
Face ID
One USB-C 3 port and one charging port (in the Magic Keyboard)
$1448

iPad Pro 12.9"
512 GB storage
Retina screen with True Tone and touch support
6 GB RAM
Magic Keyboard with trackpad (not Force Touch)
Face ID
One USB-C 3 port and one charging port (in the Magic Keyboard)
$1648

IOW, to get a similarly configured iPad Pro, it costs $349 to $449 more than the MacBook Air. At entry level, that $449 represents a whopping 45% increase in price.

And ironically, the iPad Pro 12.9" with Magic Keyboard actually weighs more than the MacBook Air.
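For reference, the deltas quoted above fall out directly from the listed prices; a quick check using only the configurations in this post:

```python
# Price gap between a MacBook Air and a similarly configured
# iPad Pro 12.9" + Magic Keyboard, per the configurations above (USD).
macbook_air = {"256 GB": 999, "512 GB": 1299}
ipad_pro = {"256 GB": 1448, "512 GB": 1648}

for cfg, air_price in macbook_air.items():
    delta = ipad_pro[cfg] - air_price
    pct = 100 * delta / air_price
    print(f"{cfg}: iPad Pro costs ${delta} more ({pct:.0f}%)")
# 256 GB: iPad Pro costs $449 more (45%)
# 512 GB: iPad Pro costs $349 more (27%)
```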

I suppose yet another possibility is Apple could sell a base model MacBook Air with A12 or A14, but the upgraded MacBook Air model with 512 GB storage would have A12X/Z or A14X.

I personally wouldn't want either though. I'm too spoiled by my 12" MacBook and don't want anything bigger.
 

awesomedeluxe

Member
Feb 12, 2020
It was mentioned earlier in the thread but the iPad Pro starts with 128 GB (and will likely have the same 128 GB in the next iteration), and does not include a keyboard or trackpad. That Magic Keyboard costs $300-350. Also, the $800 model is only 11". The more comparable size is the 12.9" model.

The MacBook Air comes with not only an integrated keyboard and trackpad, it also comes with 256 GB storage, and the latest model even includes a True Tone screen and 2 Thunderbolt ports. And if you upgrade to quad-core, you also get 512 GB storage. So:

MacBook Air dual-core i3
256 GB storage
Retina screen with True Tone
8 GB RAM
Magic Keyboard with Force Touch trackpad
Touch ID
Two Thunderbolt 3 ports
US$999

MacBook Air quad-core i5
512 GB storage
Retina screen with True Tone
8 GB RAM
Magic Keyboard with Force Touch trackpad
Touch ID
Two Thunderbolt 3 ports
US$1299

iPad Pro 12.9"
256 GB storage
Retina screen with True Tone and touch support
6 GB RAM
Magic Keyboard with trackpad (not Force Touch)
Face ID
One USB-C 3 port and one charging port (in the Magic Keyboard)
$1448

iPad Pro 12.9"
512 GB storage
Retina screen with True Tone and touch support
6 GB RAM
Magic Keyboard with trackpad (not Force Touch)
Face ID
One USB-C 3 port and one charging port (in the Magic Keyboard)
$1648

IOW, to get a similarly configured iPad Pro, it costs $349 to $449 more than the MacBook Air. At entry level, that $449 represents a whopping 45% increase in price.

And ironically, the iPad Pro 12.9" with Magic Keyboard actually weighs more than the MacBook Air.

I suppose yet another possibility is Apple could sell a base model MacBook Air with A12 or A14, but the upgraded MacBook Air model with 512 GB storage would have A12X/Z or A14X.

I personally wouldn't want either though. I'm too spoiled by my 12" MacBook and don't want anything bigger.
Looking at your price breakdown I think we could make the case either way for whether or not the Air is "worth" an A14X. It may come down to timing; if Apple wanted to get the Air out this year the A14 is readily available and the specter of Cezanne U is probably half a year out. I'm less certain about using the A12Z as a CPU/GPU upgrade path - I think this would be hard to position as an "upgrade" for a lot of reasons. You can still sell the user more RAM and HD space for egregious amounts. You probably need an optional pinout on the A14 which strikes me as a little weird but Apple can figure it out.

If Apple does want the A14 in the Air, there was a scenario that occurred to me that makes it a lot more palatable. Up until now I've been assuming the A14 uses the same 2+4-and-4 design as the A13, perf cores clocked to max, and while this is still what I would consider most likely I don't think it has to be true.

Apple could add two more GPU cores and make the A14 2+4-and-6. I don't know exactly how much power each GPU core currently uses, but given that system active power tops out around 6W with an iPhone 11 fresh out of the freezer, I'd assume less than 1W at full speed. N5 offers considerably more density and 22% power savings over N7P. My back-of-the-envelope math says that if you go full efficiency with the graphics cores and clock the perf cores just a little less aggressively than N5 allows, say maybe 5% instead of 7%, you can get to the same place with two more GPU cores in the mix.
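That math can be sketched as follows; every input is an assumption from this post (~1 W per GPU core, 22% iso-frequency power savings on N5), so treat it as illustration, not data:

```python
# Reinvesting N5's ~22% power savings into two extra GPU cores.
GPU_CORE_W_N7P = 1.0              # assumed per-core GPU power on N7P
N5_SCALE = 1 - 0.22               # ~22% iso-frequency power savings on N5

a13_gpu_w = 4 * GPU_CORE_W_N7P               # four GPU cores on N7P
freed_w = a13_gpu_w - a13_gpu_w * N5_SCALE   # saved by porting them to N5
extra_w = 2 * GPU_CORE_W_N7P * N5_SCALE      # cost of two added N5 cores

print(f"freed by N5 port: {freed_w:.2f} W")        # 0.88 W
print(f"two extra GPU cores: {extra_w:.2f} W")     # 1.56 W
print(f"to claw back from perf clocks: {extra_w - freed_w:.2f} W")  # 0.68 W
```

The ~0.7 W shortfall is what the slightly-less-aggressive perf-core clocks would have to cover.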

I don't think Apple would go this route; I think a phone has better things it can do with the extra margin. But it would be ideal for the Air, which has more pixels to push and is going to appreciate the extra GPU cores when the user opens Chrome or Photoshop.
 

Eug

Lifer
Mar 11, 2000
If Apple does want the A14 in the Air, there was a scenario that occurred to me that makes it a lot more palatable. Up until now I've been assuming the A14 uses the same 2+4-and-4 design as the A13, perf cores clocked to max, and while this is still what I would consider most likely I don't think it has to be true.

Apple could add two more GPU cores and make the A14 2+4-and-6. I don't know exactly how much power each GPU core currently uses, but given that system active power tops out around 6W with an iPhone 11 fresh out of the freezer, I'd assume less than 1W at full speed. N5 offers considerably more density and 22% power savings over N7P. My back-of-the-envelope math says that if you go full efficiency with the graphics cores and clock the perf cores just a little less aggressively than N5 allows, say maybe 5% instead of 7%, you can get to the same place with two more GPU cores in the mix.

I don't think Apple would go this route; I think a phone has better things it can do with the extra margin. But it would be ideal for the Air, which has more pixels to push and is going to appreciate the extra GPU cores when the user opens Chrome or Photoshop.
Doesn't A13 already have better GPU performance than most integrated GPUs?

I'm asking because I don't follow the GPU side as closely, but I thought that was the case. However, I do know that I have no complaints about the GPU performance of my A10X at 2224x1668, and my wife has no complaints about the GPU performance of even her much slower A10 at 2160x1620. The MacBook Air is 2560x1600, which is only 10% more pixels than my iPad Pro 10.5.
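The pixel comparison is easy to verify from the resolutions mentioned above:

```python
# Pixel counts for the panels discussed in this post.
panels = {
    "MacBook Air":   (2560, 1600),
    "iPad Pro 10.5": (2224, 1668),
    "iPad (A10)":    (2160, 1620),
}
pixels = {name: w * h for name, (w, h) in panels.items()}

ratio = pixels["MacBook Air"] / pixels["iPad Pro 10.5"]
print(f"MacBook Air: {100 * (ratio - 1):.0f}% more pixels than iPad Pro 10.5")
# MacBook Air: 10% more pixels than iPad Pro 10.5
```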
 
