
Discussion Apple Silicon SoC thread


Eug

Lifer
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24,576 concurrent threads
2.6 teraflops
82 gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4
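The 2.6-teraflops figure above is consistent with the usual ALU arithmetic. A back-of-the-envelope check — note that the 128 EUs come from the spec list, but the ALUs-per-EU count and the GPU clock are community estimates, not Apple-published numbers:

```python
# Back-of-the-envelope check of the quoted 2.6 teraflops (FP32).
# 128 EUs is from the spec list above; 8 ALUs per EU and a ~1.278 GHz
# clock are community estimates (Apple publishes neither).
eus = 128
alus_per_eu = 8           # -> 1024 FP32 ALUs total (assumed)
clock_ghz = 1.278         # assumed GPU clock
flops_per_alu_cycle = 2   # a fused multiply-add counts as 2 FLOPs

tflops = eus * alus_per_eu * clock_ghz * flops_per_alu_cycle / 1000
print(f"{tflops:.2f} teraflops")  # ~2.62, matching the quoted 2.6

# The 24,576-thread figure also falls out of the same numbers:
threads_per_alu = 24576 // (eus * alus_per_eu)
print(threads_per_alu)  # 24 threads in flight per ALU
```

The same math scaled to 7 active cores gives the throughput of the cut-down variant.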

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options: 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from maybe slight clock speed differences occasionally).

EDIT:

[Image: Screen-Shot-2021-10-18-at-1.20.47-PM.jpg]

M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


M2
Second-generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, H.265 (HEVC), and ProRes
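The "up to 100 GB/s" figure above lines up with LPDDR5-6400 on a 128-bit bus. A quick sanity check — the bus width is an assumption based on common reporting, not an Apple-published spec:

```python
# Sanity check of the quoted "up to 100 GB/s" unified memory bandwidth.
# LPDDR5 is from the spec list above; 6400 MT/s and the 128-bit bus
# width are assumptions based on common reporting for the base M2 die.
transfers_per_sec = 6400e6                # LPDDR5-6400: 6400 MT/s
bus_width_bits = 128
bytes_per_transfer = bus_width_bits // 8  # 16 bytes moved per transfer

gb_per_s = transfers_per_sec * bytes_per_transfer / 1e9
print(f"{gb_per_s:.1f} GB/s")  # 102.4 -> marketed as "100 GB/s"
```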

M3 Family discussion here:


M4 Family discussion here:


M5 Family discussion here:

 
Some people here might find this interesting:
<a href="https://stack.int.mov/a-reverse-engineers-anatomy-of-the-macos-boot-chain-security-architecture/">https://stack.int.mov/a-reverse-engineers-anatomy-of-the-macos-boot-chain-security-architecture/</a>

It's an EXTREMELY detailed description of every step of the Apple boot process. It's astonishing (to me) how much effort they have put into this, and how many layers and details are involved. As far as I know it "works", in the sense that there have been no exploits against the elements protected this way. Of course there are exploits, from the inevitable phishing/social engineering to bugs in Safari, or Messages, or even the OS proper. But nothing catastrophic (e.g., page table manipulation).
And of course they're not done; I assume at some point things like Safari will be refactored to further limit bug propagation, as (I believe) has already happened to some extent with Messages. (The first version was not great, but the second version, based on "lessons learned", so far seems to be holding up?)
 
I know someone who was part of the design/development of Apple Pay. What they have told me is that Apple has LONG believed it can only achieve these security benefits through vertical control. I mean, the biggest takeaway from the Jobs-led resurgence of Apple was that they can't trust their industry partners; they can only trust what they control, because their partners' priorities were not their priorities, and if their partners had other customers, security was always going to be the lowest common denominator across all of those customers rather than whatever standard Apple had (see the notes in that document regarding how far apart AS macOS and Intel macOS are). It's why Apple Pay got developed the way it did, and why Apple contributed quite a bit to the EMV contactless standard.

And in the post-Jobs era, Apple's primary focus was user trust, and security got a much bigger focus in the company. Cook pushed that even further, seeing it as a market advantage.
 

That is always the case, and it has been proven many times. How often do you read that some big company had hackers inside their network, and it turned out they got in via a third-party supplier with inadequate security? Sure, you might pay lots of attention to your IT outsourcer, but it might not be them; it could be corporate support for some old large-format dye-sub printer in a basement somewhere, with direct access to the PC attached to it to maintain the printer. (Yeah, there's a reason I'm so specific: I once wrote an RCA for exactly that, and it turned out their outside security auditor had flagged it every year for almost a decade but was ignored, because the overall audit score was very high and that's all management cared about.)

I know some people like to argue that Apple is no different from anyone else and compromises your privacy just as much as Google, even though Apple makes a pittance of its overall profit from advertising while Google makes over 100% of its profit from advertising (or at least it still did as of a few years ago; maybe their cloud business is finally profitable enough to overcome the losses in all their non-advertising businesses). But EVEN IF you believed Apple is no different from the rest: if you're in the Apple ecosystem, they are so vertically integrated that if they are stealing your data, they are the ONLY ONE doing so. If you have a Samsung phone bought from a carrier, then you have not just Samsung but also Google and the carrier in a position to compromise your privacy/personal data. If you have a Windows PC with kernel-level drivers for your CPU, your motherboard, your GPU, maybe even your SSD, then you have a whole host of companies in a position to do so.

Now sure, you can argue that just because I'm running Nvidia drivers with their hooks deep in the Windows kernel doesn't mean they are going to steal my data. Hopefully not, if for no other reason than that their brand would take a huge hit if they were caught doing that for questionable potential gains. But the same reasoning applies to Apple: if they were caught, it would be a massive black eye for their brand and their privacy marketing that would vastly outweigh any minor increase in advertising revenue the added data might enable.
 
Good stuff. Hopefully they implement it in iPadOS and macOS soon. I was hoping they would with the M5, since it shares the same architecture, but I'm not aware that they have.
 
Isn't this available on the latest macOS?
I would’ve thought so, but I haven’t seen it specifically mentioned outside of iOS 26 and the iPhone 17 series.

I would be glad to be wrong though.

To me, it would make sense for all the M5 devices to have it enabled, since the M5 is based on the A19. And it would be kind of pointless to have those protections in only one device and not all.
 
 
Hmm... interesting. Well, that is helpful, thank you. I guess then, by the transitive property, we can deduce that the M5 in the iPad Pro also has MIE/MTE.
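For anyone wondering what MIE/MTE actually buys: in Arm's Memory Tagging Extension, each 16-byte granule of memory carries a small tag, a matching tag rides in the pointer's unused top bits, and any load or store with a mismatched tag faults. Here is a toy Python simulation of just the concept (not Apple's implementation; the addresses and helper names are made up for illustration):

```python
# Toy model of ARM MTE-style memory tagging (the mechanism MIE builds on).
# Real MTE uses 4-bit tags on 16-byte granules, with the tag mirrored in
# the pointer's unused top bits; this only illustrates the idea.
import random

GRANULE = 16
memory_tags = {}  # granule base address -> 4-bit tag

def tagged_alloc(addr, size):
    """Assign one random tag to every granule of an allocation and
    return a (tag, address) pair standing in for a tagged pointer."""
    tag = random.randrange(16)
    for g in range(addr, addr + size, GRANULE):
        memory_tags[g] = tag
    return (tag, addr)  # real hardware packs the tag into pointer top bits

def load(ptr, offset=0):
    """Check the pointer's tag against the target granule's tag,
    the way MTE does on every load/store."""
    tag, addr = ptr
    granule = (addr + offset) // GRANULE * GRANULE
    if memory_tags.get(granule) != tag:
        raise MemoryError("tag mismatch: out-of-bounds or use-after-free")
    return "ok"

p = tagged_alloc(0x1000, 32)  # 32-byte object: granules 0x1000 and 0x1010
print(load(p, 0))             # in-bounds access passes the tag check
# load(p, 32) would trap: granule 0x1020 carries no matching tag
```

This is why use-after-free and linear overflows, the bread and butter of phone-cracking exploit chains, get much harder with tagging enforced in hardware.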
It's interesting to see Cellebrite behind an Apple employee presentation. I haven't followed them in recent years; Cellebrite's operations seemed quite adversarial years ago. Has that changed?
I noticed that too, which I found quite amusing. I’m not sure there’s a correlation there, but I don’t have any evidence for that.
 
They are a sponsor of that conference. It is hilarious that Apple is there presenting technology that delivers a MASSIVE blow to their ability to offer their "services" to law enforcement trying to break into iPhones equipped with MIE! 😂
 
Meanwhile...

Seems the MacBook with an iPhone processor has a better chance of happening now. The issue, of course, is the GPU, which is on the tier of an Nvidia GTX 1050 notebook GPU. Price could also be a factor if M1 MacBooks with better GPUs are available at that price.
 

What difference does its "weaker" GPU make? It is massively overpowered for handling a GUI, it handles plenty of games on iOS, and it would run all the same games just as well (better, in fact, since the laptop form factor is better suited to passive cooling, so it will throttle less).

Could it run the kind of games hardcore gamers here want to run? No, of course not, but people won't be buying this for hardcore gaming. People don't buy Macs in general for hardcore gaming, and most of those games aren't available on macOS anyway.
 


The X Plus has an even worse GPU, so what? And it is not clear whether it will really be the A18 Pro or possibly the A19 Pro (the version with a 5-core GPU like in the iPhone Air). The A18 Pro was already comparable to the M1 in terms of raw GPU performance. And the presence of RT and dynamic caching takes it to a level above the M1.
 
The A19 Pro is the best-case scenario; its GPU performs like a GTX 1650, which is better.
 
As long as the A18 Pro can handle the macOS GUI and an external 4K monitor without lag, nobody truly looking at this MacBook model will care much about GPU performance.

My wife has an M4 MacBook Air, and it is way, way overpowered for what she does, which is surf, stream, email, light office applications, and look up recipes, etc. However, I paid edu pricing and got free AirPods in Apple’s yearly edu promotion, so no complaints about the pricing.
 
I don't see why it couldn't? Its single-thread performance isn't much below the M4's, and its multi-thread performance is at M1 level. That should be plenty for those activities.
 
That's basically my point: CPU performance is actually great (amazing ST and good MT), and those complaining about the GPU performance of an A18 Pro in an entry-level MacBook are barking up the wrong tree, as the A18 Pro's Metal speed is in roughly the same performance class as the M1's.

The reason I bought an M4 over an M1 or M2 for my wife's MacBook Air was just future OS support (the M1 and M2 are old models, 2020 and 2022 respectively) and the form factor (the M1 Air is the previous form factor). It had absolutely nothing to do with CPU or GPU performance. Also, although unnecessary for my wife's usage, the fact that the M3 and M4 have hardware AV1 decode was a bonus, and the A18 Pro also supports hardware AV1 decode. Apple Intelligence is also supported.

The only things I really wonder about are whether the entry-level A18 Pro MacBook would support USB 4 / Thunderbolt or only USB 3, what base memory it would ship with, and battery life. USB 4 / Thunderbolt would be preferred, and 12 GB RAM may be the sweet spot at a US$799 retail MSRP (cheaper on sale or with education pricing), although 16 GB would be even better. I also wonder if this model would be eligible for the yearly education promotion. If so, going forward I will likely buy the A-series MacBooks over the MacBook Air, because an edu-discounted one with the promotional free gift / gift card each spring would be a steal. Even if battery life took a bit of a hit, it would likely still be good, given that battery life on the current MacBook Airs is very good.
 

I think the simplest guess is that it will have exactly the specs of the SoC we see in the phones. So USB 3, not USB 4 or TB. Memory will be 8 GB if it is the A18 Pro, 12 GB if it is the A19 Pro. No chance of 16 GB, because stuff like that is how Apple segments products, rather than by lower clock rates like PC OEMs. They used 8 GB as the base for Apple Silicon Macs for a while, so we know that's just fine for the target market (people with usage models like your wife's).

I think it will be priced below $799: no higher than $699, and I'd say there's a 50/50 shot it is below even that. If they want to meaningfully increase Mac unit sales, they need a real difference in price. I've said before why I believe the BOM of a MacBook isn't all that different from the BOM of a phone, and they sell the 16e for $599. They could make the MacBook work at that price if they really wanted to, though I think that's what they see as the "bottom" price, for special edu discounts and closeouts when they rev it to a new SoC, and its normal price will be more like $649 or $699. To hit that, you don't get more RAM, TB support, or other "nice to haves" that the target market doesn't need or care about.
 