Discussion: Apple Silicon SoC thread


Eug

Lifer
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24,576 concurrent threads
2.6 teraflops
82 gigatexels/s
41 gigapixels/s
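
For what it's worth, those GPU figures hang together arithmetically. A quick sanity check (the per-core ALU count and the ~1.28 GHz clock are third-party estimates, not Apple-published figures):

```python
# Consistency check of the published M1 GPU numbers. The per-core ALU
# count (128) and ~1.278 GHz clock are assumptions from third-party
# analysis, not figures Apple publishes.
cores = 8
alus_per_core = 128                 # assumed
clock_ghz = 1.278                   # assumed
tflops = cores * alus_per_core * 2 * clock_ghz / 1000   # 2 ops/cycle (FMA)
print(f"{tflops:.2f} TFLOPs")            # ~2.62, matching the 2.6 above
print(24576 / (cores * alus_per_core))   # 24 threads in flight per ALU
print(82 / clock_ghz)                    # ~64 texture units implied
print(41 / clock_ghz)                    # ~32 pixel-output units implied
```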

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18-hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20-hour video playback battery life

Memory options: 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices, aside from occasional slight clock speed differences.

EDIT:

[Screenshot: M1 Pro and M1 Max configurations]

M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


M2
Second-generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, HEVC (H.265), and ProRes
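
The headline bandwidth and teraflops figures above are consistent with a 128-bit LPDDR5-6400 bus and a ~1.4 GHz GPU clock (both assumptions; Apple publishes neither):

```python
# Where the M2's headline numbers plausibly come from. The 128-bit
# LPDDR5-6400 bus and ~1.4 GHz GPU clock are assumptions, not
# Apple-published figures.
bus_bits = 128                      # assumed bus width
transfers_per_s = 6400e6            # LPDDR5-6400, assumed
print(bus_bits / 8 * transfers_per_s / 1e9)   # 102.4 GB/s -> "100 GB/s"

cores, alus_per_core, clock_ghz = 10, 128, 1.398   # clock assumed
print(cores * alus_per_core * 2 * clock_ghz / 1000)  # ~3.58 TFLOPs
```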

M3 Family discussion here:

 
Last edited:
How hard would it be for them to come out with their own dGPUs that work in tandem with their iGPUs? By locking out other vendors, they ensure maximum profits for themselves.
 

Heartbreaker

Diamond Member
It seems odd that it won't have dGPU support. I get that their iGPUs are plenty powerful, but that seems short-sighted, as there are some very powerful pro-series cards with specific purposes. However, I can see other uses for PCIe slots, such as additional NVMe capacity, using CXL for RAM expansion, and additional I/O capabilities.

They might still have a specialty GPU card in the desktop Mac Pro in the future, but it would be a special compute accelerator, not a general purpose GPU, and it would probably be an Apple Silicon based GPU.
 

Eug

Lifer
They might still have a specialty GPU card in the desktop Mac Pro in the future, but it would be a special compute accelerator, not a general purpose GPU, and it would probably be an Apple Silicon based GPU.
I wouldn't be surprised if that and/or another Apple-branded PCIe card gets announced at the same time as the Mac Pro (although not necessarily released immediately). That would make sense from a marketing standpoint anyway.

BTW, there is some debate as to the specs of the new Mac Pro SoC.

The original rumour was 40 CPU cores, comprising 32 performance and 8 efficiency cores, from the claimed specs of the dev box. However, subsequent supply chain rumours suggest 12 CPU cores for the M2 Max, which (at four Max dies) would imply up to 48 CPU cores. I'm not sure whether those extra 8 CPU cores would be performance or efficiency cores, but I'm guessing the former if true. Maybe it's both, with the 40-core part as a binned variant of the 48.

Graphics cores have been rumoured to be up to 152, which would imply a full 160-core design, with 152 as the binned variant.

Memory has been suggested to be either 256 GB or 384 GB maximum. The latter would require 12 GB DRAM chips. If limited to 8 GB DRAM chips, it would be 256 GB.
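
For what it's worth, the arithmetic behind those rumours (all speculation, assuming the part is four M2 Max dies stitched together):

```python
# Rough arithmetic behind the Mac Pro rumours, assuming a four-die
# M2 Max package (speculation; nothing here is confirmed).
dies = 4
print(dies * 12)            # 48 CPU cores max; the rumoured 40 = binned
print(dies * 40)            # 160 GPU cores if 40 per die; 152 = 2 disabled per die
print(384 // 12, 256 // 8)  # both capacities imply 32 DRAM packages
```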
 

Roland00Address

Platinum Member
When they released the Mac Studio, Apple actually made a point of saying the Mac Pro is still coming (which is unusual for Apple), although they also conspicuously refrained from mentioning the two-year time frame this time around.

Cool, thanks for all that; either I missed it or I forgot. Thank you for your passion and expertise, Eug 🙂
 

Doug S

Platinum Member
They might still have a specialty GPU card in the desktop Mac Pro in the future, but it would be a special compute accelerator, not a general purpose GPU, and it would probably be an Apple Silicon based GPU.


How would you connect it? I highly doubt the Mac Pro has x16 or even x8 slots, when multiple x4 slots are all anyone needs once you don't have dGPUs. I've said all along it would probably have something like four x4 slots and that's it, to connect high-speed fiber networking/storage cards. A compute accelerator has the same issue: if you want to get data in/out quickly you want unified memory; if you don't (because it has vast local memory), then an x4 connection should be just fine.

Apple's direct sharing of the entire memory hierarchy between the CPUs, GPU, NPU, etc. has some advantages, but one of the big disadvantages is that a discrete GPU can't participate in that. Well, it could, I suppose, in the sense that you can map PCIe memory into a place where it is accessible by the CPU, but access speeds and latencies would be terrible compared to iGPU memory. Applications would all assume fast access (including remote snooping into other domains), so that dGPU might perform worse than the iGPU even if on paper it was much faster. Devs would ignore this sort of halo market of a few thousand people who bought a dGPU for their Mac Pro unless they have a very niche app that was marketed only to them and cost so much it would be worth it - so the only people who would want a dGPU are those who would buy this niche software. Sort of a chicken-and-egg situation.
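
To put rough numbers on that bandwidth gap (the PCIe figures are theoretical line rates; 800 GB/s is Apple's published M1 Ultra memory bandwidth):

```python
# Order-of-magnitude link comparison. PCIe figures are raw line rate
# with 128b/130b encoding; real-world throughput is lower.
pcie4_lane_gbs = 16e9 * 128 / 130 / 8 / 1e9   # ~1.97 GB/s per PCIe 4.0 lane
for lanes in (4, 16):
    print(f"PCIe 4.0 x{lanes}: {pcie4_lane_gbs * lanes:.1f} GB/s")
print("M1 Ultra unified memory: 800 GB/s")    # ~25x even a full x16 link
```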

The other problem is that in order to create a dGPU faster than the iGPU Apple would have to create a new die - a 'Max' die that cuts out the CPUs, NPU, etc. and fills all that space with more GPU cores. The cost of the mask set versus the tiny addressable market size says that's not even remotely a viable option. OK, you say, then Apple should support third-party cards so people can plug an Nvidia card in there - assuming Nvidia cares to write and properly maintain a driver for it, when the only market is Mac Pro owners, which is not a large market! Same issues as above, PLUS it uses a different rendering model, so it would be further handicapped when running software written for the Mac (though I suppose helped by poorly ported software that expects the PC model of rendering and thus handicaps GPUs built around Apple's deferred rendering... which seems to be quite a lot of it, unfortunately).
 

LightningZ71

Golden Member
CXL over PCIe 5.0 x16 might be enough to make an external card work well if Apple demands full memory coherence, but I don't think Apple is putting that many resources into it; that's a pretty big stretch from where they are now. If they don't care about memory coherence, they could always use any of several programming tricks to make a dGPU work, but that would take a willingness to accept a variation from their base programming model. I wouldn't be shocked if it were just a set of x4 4.0 slots, but for what some Mac Pros are used for, I'd expect at least one x16 4.0 or 5.0 slot that can support a PCIe card hosting 4x M.2 SSDs, to give the storage throughput that such a beast of an iGPU would demand.
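
Rough lane math for those configurations (theoretical line rates with 128b/130b encoding; real-world throughput is lower):

```python
# Lane math for the slot configurations being discussed.
def link_gbs(gt_per_s, lanes):
    return gt_per_s * 128 / 130 / 8 * lanes   # GB/s

print(link_gbs(32, 16))      # PCIe 5.0 x16: ~63 GB/s (what CXL would ride on)
print(link_gbs(16, 4) * 4)   # 4x M.2 drives at 4.0 x4 each: ~31.5 GB/s total,
                             # i.e. a full 4.0 x16 slot's worth of lanes
```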
 

Heartbreaker

Diamond Member
Apple's direct sharing of the entire memory hierarchy between the CPUs, GPU, NPU, etc. has some advantages, but one of the big disadvantages is that a discrete GPU can't participate in that. Well, it could, I suppose, in the sense that you can map PCIe memory into a place where it is accessible by the CPU, but access speeds and latencies would be terrible compared to iGPU memory. Applications would all assume fast access (including remote snooping into other domains), so that dGPU might perform worse than the iGPU even if on paper it was much faster. Devs would ignore this sort of halo market of a few thousand people who bought a dGPU for their Mac Pro unless they have a very niche app that was marketed only to them and cost so much it would be worth it - so the only people who would want a dGPU are those who would buy this niche software. Sort of a chicken-and-egg situation.

This isn't a generic dGPU; it's a specific compute resource, where you basically batch off some kind of big parallel compute job to run on the accelerator card. It will have its own low-latency memory, so it doesn't need super-low-latency access to system memory.

The other problem is that in order to create a dGPU faster than the iGPU Apple would have to create a new die - a 'Max' die that cuts out the CPUs, NPU, etc. and fills all that space with more GPU cores. The cost of the mask set versus the tiny addressable market size says that's not even remotely a viable option.

Chiplets, and they remain viable for MANY years because you really aren't competing with anything here. It's a niche product that you might build using multiples of the same chiplet for 5 years. So one mask in 5 years isn't a big issue.

Not saying this will happen. Just saying it could happen, if Apple feels the need for more GPU compute than they get in the SoC approach.
 

MadRat

Lifer
If memory access to the GPU were critical, then wouldn't you just place your first memory block for the PCI/GPU? Anything after that would be operating PCI resources/memory for the rest of the system. Sounds an awful lot like how the C-64 was configured, only on a much larger scale and using a standard system interface instead of a proprietary one.
 

Mopetar

Diamond Member
Yeah, 1P/3E would make the chip quite weak compared to its predecessor. It's still surprising that Apple is binning like this. The E-cluster occupies a very small area; how many chips do they really have with a defective 4th E core?

It's probably not outright defective, but a case where there are one or more leaky E-cores that aren't stable at the voltages Apple wants to use for the clock speeds they're targeting.

Those can either go to a desktop part where the voltages can be relaxed a little bit, or, if it's an especially bad or actually defective part, they can just make a catch-all bin with a minimum specification that most otherwise-unfit chips can fall into.
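
A toy first-order model of that tradeoff (arbitrary units and made-up coefficients, purely to illustrate the reasoning):

```python
# First-order illustration of why a leaky core can still be sold in a
# part with a bigger power budget (made-up constants, not Apple data).
def core_power(v, f_ghz, leak):
    dynamic = v**2 * f_ghz     # P_dyn ~ C*V^2*f, capacitance normalized to 1
    leakage = leak * v         # static power grows with V and with leakiness
    return dynamic + leakage

print(core_power(0.90, 3.2, 0.5))   # healthy die: fits the tight power budget
print(core_power(0.90, 3.2, 1.5))   # leaky die: too hot at the same V/f
print(core_power(0.85, 2.9, 1.5))   # same leaky die at relaxed V/f, or give it
                                    # a desktop's thermal headroom instead
```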
 

jpiniero

Lifer
I'm curious, where are you reading N3E is too late for the 2023 iPhones? FWIW, Nikkei disagrees with you.

TSMC said in their earnings call (which was Oct 13th) that N3 would be a "mid single digit" percentage of revenue in 2023. They also stuck to the +1 year for N3E, but said they might pull it in by a quarter or two... even that, though, would be too late for the 2023 iPhone.
 

Eug

Lifer
Yeah, 1P/3E would make the chip quite weak compared to its predecessor. It's still surprising that Apple is binning like this. The E-cluster occupies a very small area; how many chips do they really have with a defective 4th E core?
I just ordered the 2021 Apple TV, but I'm wondering if I should have just spent the extra fifty bucks to get the 2022, mainly because of the upgraded SoC (A12 --> A15) and extra RAM (3 GB --> 4 GB). (Apple removed Ethernet support in their 64 GB 2022 model, so to get that you have to jump up to the 128 GB model, which is complete overkill for someone like me who doesn't play games on Apple TV.)

I'm using a 2017 model, and the main reason for the upgrade isn't actually the SoC. It's mainly the remote, because the 2017's remote just sucks. The 2017 uses A10X and has 3 GB RAM.

I don't really know the comparative single-core speeds since I'm not sure if they run the same clock speeds as in the phones and iPads. Also, the 2017 and 2021 have a fan but the 2022 is fanless in a smaller case.
 

Doug S

Platinum Member
I just ordered the 2021 Apple TV, but I'm wondering if I should have just spent the extra fifty bucks to get the 2022, mainly because of the upgraded SoC (A12 --> A15) and extra RAM (3 GB --> 4 GB). (Apple removed Ethernet support in their 64 GB 2022 model, so to get that you have to jump up to the 128 GB model, which is complete overkill for someone like me who doesn't play games on Apple TV.)

I'm using a 2017 model, and the main reason for the upgrade isn't actually the SoC. It's mainly the remote, because the 2017's remote just sucks. The 2017 uses A10X and has 3 GB RAM.

I don't really know the comparative single-core speeds since I'm not sure if they run the same clock speeds as in the phones and iPads. Also, the 2017 and 2021 have a fan but the 2022 is fanless in a smaller case.


I have the 2017 model, which I got on a $99 special after the 2021 was released. I can't really think of any reason I would need/want to upgrade to a newer model. It comes as news to me that mine has a fan - I have never heard it run! I did get the newer remote, and I agree it is better.

Why does ethernet support matter? Even if you have to go through several walls, you don't need more than 20 Mbps or so for 4K streaming. I used to use ethernet more extensively at home, but the only thing I have connected via ethernet now is my PC. Everything else is wireless, even stuff like my Apple TV and TiVo that could be wired and have a jack right next to them. I suppose if you live in a dense apartment-building type situation, wifi could be near useless with all the APs surrounding you.

I'm sure Apple has data on how many Apple TV owners use ethernet vs wifi, and decided ethernet support isn't something many people are using. Those who need/want it should be glad they can still get it; most set tops are wifi-only these days.

Maybe what they could have done if they were looking to reduce cost is remove the power supply and replace the power connector with a USB-C port. Use that to power it, and have the tvOS software support a hub as well as a USB-to-ethernet adapter. Those drivers are already in macOS, so it isn't like that support would cost them anything.
 

Eug

Lifer
Why does ethernet support matter? Even if you have to go through several walls, you don't need more than 20 Mbps or so for 4K streaming. I used to use ethernet more extensively at home, but the only thing I have connected via ethernet now is my PC. Everything else is wireless, even stuff like my Apple TV and TiVo that could be wired and have a jack right next to them. I suppose if you live in a dense apartment-building type situation, wifi could be near useless with all the APs surrounding you.
That is not correct. Apple TV+ averages close to 30 Mbps and peaks at over 40 Mbps. The chart below says 41 Mbps, but I've seen posts stating up to around 45 Mbps with some content, like their show See.


[Image: Apple TV+ bitrate chart for See]

I'm in a house with a decent pseudo-mesh* all-Apple AirPort network. (*Not a real mesh, but behaves sort of like one with seamless handoff, though only for Apple devices.) Nonetheless, I still prefer Ethernet, just because. Also, not that I use it at the moment, but only the Ethernet-endowed units support Thread.
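
For scale, the arithmetic on that peak bitrate is simple:

```python
# What a ~41 Mbps peak stream actually demands (simple unit conversion).
peak_mbps = 41
print(peak_mbps / 8)                  # ~5.1 MB/s sustained from the network
print(peak_mbps / 8 * 3600 / 1000)    # ~18.5 GB per hour of playback
```

Gigabit Ethernet has roughly 24x headroom over that peak; whether a given Wi-Fi link does depends entirely on real-world conditions at the TV.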

BTW, I'll use this as an excuse to post my overkill home setup. :p

[Screenshot: home network diagram]

"Gaz" is dead because it's actually outside and unfortunately something chewed through the outdoor Ethernet cabling. :mad:

---

Anyhow, I've been reading some more reviews, and it seems a few reviewers say the A15 offers a bit of oomph to Apple TV 4K menu navigation over the A12, but others say it's kind of hard to notice. The biggest differences are app loading speed and gaming performance. Apps load a few seconds faster, and games perform MUCH better. However, I don't play any games at all on my Apple TV, so that's moot for me. So maybe I'm just better off saving the 50 bucks with the A12 instead of the A15. My A10X unit with the crappy remote will be demoted to a secondary TV.
 

Mopetar

Diamond Member
I just ordered the 2021 Apple TV, but I'm wondering if I should have just spent the extra fifty bucks to get the 2022, mainly because of the upgraded SoC (A12 --> A15) and extra RAM (3 GB --> 4 GB).

I'm not sure what the extra CPU power would get you. It's going to use a dedicated video decoder for actually playing videos.

The only thing the extra CPU power and RAM might do is make apps and switching between them more fluid.
 

Muadib

Lifer
I just ordered the 2021 Apple TV, but I'm wondering if I should have just spent the extra fifty bucks to get the 2022, mainly because of the upgraded SoC (A12 --> A15) and extra RAM (3 GB --> 4 GB). (Apple removed Ethernet support in their 64 GB 2022 model, so to get that you have to jump up to the 128 GB model, which is complete overkill for someone like me who doesn't play games on Apple TV.)

I'm using a 2017 model, and the main reason for the upgrade isn't actually the SoC. It's mainly the remote, because the 2017's remote just sucks. The 2017 uses A10X and has 3 GB RAM.

I don't really know the comparative single-core speeds since I'm not sure if they run the same clock speeds as in the phones and iPads. Also, the 2017 and 2021 have a fan but the 2022 is fanless in a smaller case.
Then you just need a remote.

 

Eug

Lifer
Then you just need a remote.

Heh. Thanks. I'm going to use the old Apple TV on a different TV.
 

Doug S

Platinum Member
It may help him if Apple decides to launch a newer tvOS that is harder on the older CPUs. Plus, he gets updates for a few more years.


tvOS is basically iOS with some stuff removed, in the same way iPadOS is iOS with some stuff added, and iOS was created by cutting down macOS, née OS X. Like Windows, iOS has gone through its "every release requires more resources than the last" period and, like Windows, has largely stabilized its performance/memory needs. There's really nothing to slow down anyway; the CPU is doing so little unless you're gaming that they could have put in a single little core and people using it for video only would never know.

What you say is true about the updates, but I think they matter far, far less for a set top than for a phone. A phone contains some of my most personal information, and it is constantly taking in information from the outside world that's largely uncontrolled - from messages/SMS that might trigger a bug, to browsing web sites, to running a number of apps. A set top runs a handful of apps, and once an app is being fed video, that stream is handed over to a decoder and bypasses the CPU completely. Hacking a set top would be a lot harder, but also a lot less valuable to an attacker. What's someone going to do to me if they p0wn my Apple TV, make it rickroll me when I'm trying to watch a movie?

I would never consider using a smartphone that wasn't being actively updated, but if Apple drops updates for the 2017 Apple TV in a few years, it won't make me buy a new one. Maybe a few years after that, the Disney or Amazon app won't update because it needs a newer OS, and without the update the app won't work; but by that time I'll have gotten the better part of a decade out of it.
 

ashFTW

Senior member
I just got a 2022/128GB Apple TV 4K; I have a few 2021 models, and one 2017 as well. I don't see much difference (perhaps it's a bit snappier and may be able to download more Aerial screensavers) in user experience compared to the 2021 model it replaced. It's definitely smaller and much lighter. And I'm using Wi-Fi with all but one without issues. I have no need for HDR10+ support as none of my TVs support it, but I'm looking forward to trying out Thread support next year.
 

Eug

Lifer
This is just PR for TSMC and Apple. Sure they will still be using some N4/N5 stuff in 2024 - if nothing else for the 2022 Apple TV which will likely still be sold then - but 90% of their needs will have moved to N3. And by the time the N3 fab that's recently been talked about comes online (assuming it happens) 90% of Apple's needs will have moved to N2.

TSMC has been clear that their state of the art production will remain exclusive to Taiwan, and that's Apple's bread and butter. The only way Apple sources a meaningful portion of its chip needs from the US is if they someday switch to Intel's foundry.
Yeah, Morris Chang, who founded TSMC, reiterated today that Arizona would be N5 in phase 1 and then N3 in phase 2, although the plans for phase 2 have not yet been finalized.



I just got a 2022/128GB Apple TV 4K; I have a few 2021 models, and one 2017 as well. I don't see much difference (perhaps it's a bit snappier and may be able to download more Aerial screensavers) in user experience compared to the 2021 model it replaced. It's definitely smaller and much lighter. And I'm using Wi-Fi with all but one without issues. I have no need for HDR10+ support as none of my TVs support it, but I'm looking forward to trying out Thread support next year.
Thanks, that's helpful. That almost makes me want to take @Muadib's advice and just get a remote for my 2017 unit. But I should probably get more storage than that thing offers. It's 32 GB and runs out of space because of the screensavers. It doesn't affect app installation, though, since it just deletes extra screensavers when I need more space. How much space did you have in your 2021, and did you have a ton of apps?

I take it you don't game at all on the Apple TV. Neither do I, but then again I have young kids who are now starting to like light games. Mind you, those games (e.g. Minecraft, Roblox) aren't even available on Apple TV.

BTW, I did some testing yesterday, and what I had thought were performance-related issues on the 2017 got fixed sometime recently. For example, with some high-bitrate HEVC files, Infuse used to take a long time to stabilize the video image after skipping around the file. On certain titles, after skipping forward I'd sometimes get severe pixelation and colour anomalies for a few seconds before the image finally cleared itself. It was almost as if my 2017 was struggling to keep up. That is gone now. On the flip side, one specific video now doesn't play at all, whereas it played fine 2 months ago. Hmmm... So clearly something has changed, either in the OS or in Infuse, or both.

App loads still aren't fast though. CNN claims that menu speed and app loading are much faster on the 2022:


CNN said:
I’ve been using the 2021 version of the Apple TV 4K as my main streaming device since it was released, and after going through the initial setup process on the new Apple TV 4K, I immediately noticed a difference in how fast the apps loaded and navigation felt. Across the board, there was a noticeable difference.

One area where the speed boost was apparent was when I used Siri to request information, search the App Store or start a binge-watching session. Siri not only loaded faster, but the responses were almost instantaneous. Granted, part of the reason Siri loads faster is due to the new Siri interface that looks more like Siri on the iPad, taking up only a corner of the screen instead of taking over the entire TV.

Another area where the performance just felt faster was how quickly the Apple TV app would open and populate with suggestions of what to watch next.

This variation in reports as to OS and app loading speed reminds me of what people say when an iPhone is several years old but getting the latest OS. Most people think the interface speed is totally OK, but others complain it's become more laggy. This reminds me of my wife's recent desire to upgrade from her iPhone XR, which coincidentally also has an A12 and 3 GB RAM under the hood. Her main reason to upgrade was actually to get the new-fangled camera, not SoC performance. She didn't have any specific problems with the XR's OS navigation speed, but once she got her iPhone 14 Pro Max with the A16 and 6 GB RAM, she said it was obviously noticeably faster. (She's not a tech geek and still uses an iPad 7 with an A10 and 3 GB RAM with no complaints.) I played with both phones, and yes, the A16 Pro Max was indeed noticeably faster even just for navigation, but then again there was nothing wrong with the A12 XR. It was perfectly fine for regular use and surfing. I could use an XR full time in terms of speed, although I wouldn't like its mediocre single-lens camera, and Safari probably refreshes more than I'd like vs. 6 GB iPhones.
 

smalM

Member
TSMC said in their earnings call (which was Oct 13th) that N3 would be a "mid single digit" percentage of revenue in 2023. They also stuck to the +1 year for N3E, but said they might pull it in by a quarter or two... even that, though, would be too late for the 2023 iPhone.
"Actually, it's not at the same time. Right now, we are ramping up by N3. And N3E is supposed to be 1 year apart, but because of the progress
so well, so we might pull in a little bit for 2 or 3 months, that's all." -- C.C Wei, Q3 Earnings Conference