Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
23,881
1,455
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24,576 concurrent threads
2.6 Teraflops (see the quick check at the end of this section)
82 gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from occasional slight clock speed differences).
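For context, here is a quick back-of-the-envelope check on the 2.6-teraflop GPU figure. This is only a sketch: the 8 FP32 ALUs per execution unit and the ~1.278 GHz GPU clock come from third-party reporting, not from Apple's spec sheet:

```python
# Rough sanity check of the M1 GPU's 2.6 TFLOPS figure.
# Assumptions (not Apple-published): 8 FP32 ALUs per EU and a
# ~1.278 GHz clock; an FMA counts as 2 FLOPs.
alus = 128 * 8                    # 128 EUs x 8 ALUs = 1024 FP32 lanes
clock_ghz = 1.278                 # reported M1 GPU clock (assumption)
tflops = alus * 2 * clock_ghz / 1000
print(f"{tflops:.2f} TFLOPS")     # ~2.62, matching the quoted 2.6
```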

EDIT:

[Screenshot: M1 Pro / M1 Max lineup]

M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


M2
Second-generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s (see the quick check at the end of this section)
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K h.264, h.265 (HEVC), ProRes
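The 100 GB/s figure is consistent with LPDDR5-6400 on a 128-bit unified memory bus; here is a minimal check, with the bus width taken from third-party die analyses rather than Apple's spec sheet:

```python
# M2 memory bandwidth sanity check.
# Assumption (not Apple-published): LPDDR5-6400 on a 128-bit bus.
transfers_per_s = 6400e6          # LPDDR5-6400 -> 6.4 GT/s
bus_bytes = 128 // 8              # 128-bit bus -> 16 bytes per transfer
gb_per_s = transfers_per_s * bus_bytes / 1e9
print(f"{gb_per_s:.1f} GB/s")     # 102.4 -> marketed as "100 GB/s"
```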

M3 Family discussion here:


M4 Family discussion here:

 

ikjadoon

Senior member
Sep 4, 2006
235
513
146

To clarify, that is an A18 Pro score: D94AP → iPhone 16 Pro Max.

Dx7AP = Base
Dx8AP = Plus
Dx3AP = Pro
Dx4AP = Pro Max

D64AP - iPhone 13 Pro Max
D74AP - iPhone 14 Pro Max
D84AP - iPhone 15 Pro Max
D94AP - iPhone 16 Pro Max

For A18 scores, the motherboard should be D48AP (iPhone 16 Plus) or D47AP (iPhone 16).

Highest A18, 3420 / 8370: https://browser.geekbench.com/v6/cpu/7714502
Highest A18 Pro, 3409 / 8492: https://browser.geekbench.com/v6/cpu/7714134
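If you want to filter Geekbench listings yourself, here is a minimal sketch of that mapping; the D93AP entry for the iPhone 16 Pro is inferred from the Dx3AP pattern rather than taken from a listing:

```python
# Best-effort map from Geekbench "motherboard" strings to iPhone models.
BOARD_TO_MODEL = {
    "D47AP": "iPhone 16",          # A18
    "D48AP": "iPhone 16 Plus",     # A18
    "D93AP": "iPhone 16 Pro",      # A18 Pro (inferred from Dx3AP = Pro)
    "D94AP": "iPhone 16 Pro Max",  # A18 Pro
    "D64AP": "iPhone 13 Pro Max",
    "D74AP": "iPhone 14 Pro Max",
    "D84AP": "iPhone 15 Pro Max",
}

def identify(board_id: str) -> str:
    return BOARD_TO_MODEL.get(board_id, "unknown board")

print(identify("D94AP"))  # iPhone 16 Pro Max -> an A18 Pro score
```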
 

mvprod123

Member
Jun 22, 2024
148
151
76
So we have proof that the A18 has a new GPU uarch. The regular A18 with a 5-core GPU matches the raw performance of the A17 Pro with a 6-core GPU. The A18 Pro's 6-core GPU is faster than the M1's 8-core GPU. Both use LPDDR5X-7467, which is confirmed by Apple's claim of a 17% improvement in memory bandwidth. I'm curious what changes Apple has made and whether FP32 performance has improved.
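That 17% claim falls straight out of the transfer rates, assuming the bus width is unchanged from the A17 Pro's LPDDR5-6400 (an assumption, since Apple doesn't publish bus widths for A-series chips):

```python
# Does LPDDR5X-7467 vs LPDDR5-6400 give Apple's claimed ~17%?
# Bus width cancels out of the ratio as long as it is unchanged.
print(f"{7467 / 6400 - 1:.1%}")   # 16.7% -> marketed as "17%"
```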
 

FlameTail

Diamond Member
Dec 15, 2021
4,127
2,499
106
mvprod123 said:
So we have proof that the A18 has a new GPU uarch. The regular A18 with a 5-core GPU matches the raw performance of the A17 Pro with a 6-core GPU. The A18 Pro's 6-core GPU is faster than the M1's 8-core GPU. Both use LPDDR5X-7467, which is confirmed by Apple's claim of a 17% improvement in memory bandwidth. I'm curious what changes Apple has made and whether FP32 performance has improved.
So this means M4 Pro and M4 Max will also have this new GPU uarch.
 

SpudLobby

Senior member
May 18, 2022
991
684
106
Read those two paragraphs side by side...

The Apple modem will be "comparable" to the QC modem. It will probably be superior along the dimensions Apple cares about (most importantly energy) and possibly slightly inferior in terms of peak bps.

Either way IT WILL NOT MATTER for anyone except the dick-measuring brigade, for precisely the reasons encapsulated in Eug's post. People are happy to insist that some network metric is the most important thing in the universe -- but revealed preference (when it's your own money you have to spend) shows that most people are perfectly happy with what they have right now...
I honestly won’t be surprised if it’s significantly worse in a technical sense, see: band compatibility, power, reception in poor coverage.

IOW: I would not expect it to be better, but I don’t expect it to be as bad as the Intel modems in, say, 2017-2019, which were notably inferior to Qualcomm’s stuff in iPhones. Similarly, the X55 in the iPhone 12 was much worse than the X60 in the 13, both on 5G feature sets and on power and reception (all due to both the node and the new bands/aggregation stuff).

Even then, it didn’t break the iPhone, and given where we are in 5G I strongly suspect it won’t be any kind of clusterfuck like the Intel modems or the X55.

Nonetheless I think comparing Qualcomm RF to Intel’s CPUs in Apple’s vertical integration attempts is very funny and will age poorly even if Apple modems end up fine.
 

SpudLobby

Senior member
May 18, 2022
991
684
106
So despite the rumors it is the same A18 die. There would be more differences than just one GPU core if they were doing two dies.
Yeah, what’s also interesting then is that it means the A18 Pro features they listed off besides the GPU core are true for the A18 too; they just didn’t bother mentioning that.

That said, they claimed the A18 Pro has larger caches when talking about the CPU specifically. I wonder if this is also true for the A18 and it’s just bad communication?
 

jdubs03

Golden Member
Oct 1, 2013
1,022
683
136
SpudLobby said:
I honestly won’t be surprised if it’s significantly worse in a technical sense, see: band compatibility, power, reception in poor coverage.

IOW: I would not expect it to be better, but I don’t expect it to be as bad as the Intel modems in, say, 2017-2019, which were notably inferior to Qualcomm’s stuff in iPhones. Similarly, the X55 in the iPhone 12 was much worse than the X60 in the 13, both on 5G feature sets and on power and reception (all due to both the node and the new bands/aggregation stuff).

Even then, it didn’t break the iPhone, and given where we are in 5G I strongly suspect it won’t be any kind of clusterfuck like the Intel modems or the X55.

Nonetheless I think comparing Qualcomm RF to Intel’s CPUs in Apple’s vertical integration attempts is very funny and will age poorly even if Apple modems end up fine.
They know what the X75 can do. The X80 comes out in a few months and will be in next year’s devices (rumor is just the Pros). So that really is their target. I doubt they’d want to release something that is over a generation behind the latest component they put in their flagship phone (and that would be two generations behind in only a few months).
Ideally, they wouldn’t release something until it could be on the same level.
 

The Hardcard

Senior member
Oct 19, 2021
255
337
106
jdubs03 said:
They know what the X75 can do. The X80 comes out in a few months and will be in next year’s devices (rumor is just the Pros). So that really is their target. I doubt they’d want to release something that is over a generation behind the latest component they put in their flagship phone (and that would be two generations behind in only a few months).
Ideally, they wouldn’t release something until it could be on the same level.
They can only do what they can. The issue is patents. When Qualcomm started working on CDMA, most of the rest of the industry thought it was physically impossible, or at least not worth the effort. Qualcomm scored big time on that. It is almost impossible to build a modem targeting the US market that both avoids Qualcomm patents and is not inferior in capability and/or power consumption.

You have to really want to avoid Qualcomm’s products to try to find another path, since a company needs tremendous capital to dump on an expensive project that at best won’t pay off for many years.
 

FlameTail

Diamond Member
Dec 15, 2021
4,127
2,499
106
The Hardcard said:
They can only do what they can. The issue is patents. When Qualcomm started working on CDMA, most of the rest of the industry thought it was physically impossible, or at least not worth the effort. Qualcomm scored big time on that. It is almost impossible to build a modem targeting the US market that both avoids Qualcomm patents and is not inferior in capability and/or power consumption.

You have to really want to avoid Qualcomm’s products to try to find another path, since a company needs tremendous capital to dump on an expensive project that at best won’t pay off for many years.
There are many phones in the US (especially low-end ones) which use non-Qualcomm modems, such as Samsung or MediaTek.
 

Doug S

Platinum Member
Feb 8, 2020
2,846
4,840
136
SpudLobby said:
Yeah, what’s also interesting then is that it means the A18 Pro features they listed off besides the GPU core are true for the A18 too; they just didn’t bother mentioning that.

That said, they claimed the A18 Pro has larger caches when talking about the CPU specifically. I wonder if this is also true for the A18 and it’s just bad communication?

Hard to believe they'd do a custom die just to mess with the cache size. How could it be worth a second die just to save a couple of mm^2? Whatever they may have done, it doesn't appear to show up in GB 6.3 results.

Hypothetically, if they wanted to bin on the P-core shared L2 or on the SLC, adding some fuses they could blow to chop it down by 25% would let them bin on that as well as on the GPU core. I mean, that's more chip area to bin on, but they're still likely to be cutting out working cache and GPU cores on 90%+ of the non-Pros whatever they're binning on.

I've always wondered if they will bin on power between the Pro and non-Pro lines. Even at Apple's fixed frequency there will be dies that run at a lower voltage. They seem to use a fixed voltage along with the fixed frequency - at least I think there's good evidence they have done this so far - but if they wanted to make the Pro line "better" to help justify the additional cost, they could give it the SoCs that operate at lower power. Makes sense given that it has to power an extra GPU core (and maybe a little extra cache?).
 
Jul 27, 2020
20,509
14,208
146
Enjoying a bit of drama unfolding in a Teams meeting with participants (including directors and CEOs) from all over the country. This one dude who was supposed to do a presentation about his property company is using a frickin' iPad, and he doesn't know how to use it or its limitations with Teams. So after wasting about 10 minutes trying to get his act together, he is now going to share the presentation with someone who has an actual computing device so he can do the voiceover. Such amazing and "free" publicity for the iPad.

I bet the presentation would've gone better if he had let his 5-year-old kid help :D
 

poke01

Platinum Member
Mar 8, 2022
2,390
3,155
106
Enjoying a bit of drama unfolding in a Teams meeting with participants (including directors and CEOs) from all over the country. This one dude who was supposed to do a presentation about his property company is using a frickin' iPad, and he doesn't know how to use it or its limitations with Teams. So after wasting about 10 minutes trying to get his act together, he is now going to share the presentation with someone who has an actual computing device so he can do the voiceover. Such amazing and "free" publicity for the iPad.

I bet the presentation would've gone better if he had let his 5-year-old kid help :D
The iPad is a tablet running a tablet OS; that guy should have got a Surface Pro or a MacBook. There's a reason why Apple still sells MacBooks.

Must have been funny, lol