Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24576 concurrent threads
2.6 Teraflops (see the quick arithmetic check below)
82 Gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from the occasional slight clock speed difference).
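
As a sanity check on the GPU numbers above, here is a minimal sketch of where the 2.6 teraflops figure comes from. The 8 FP32 ALUs per execution unit and the ~1.28 GHz GPU clock are commonly reported third-party figures rather than Apple-published specs, so treat this as an estimate, not an official derivation:

```c
/* How the M1 GPU's 2.6 TFLOPS figure falls out of the numbers in the list
 * above.  The ALUs-per-EU count and the GPU clock are third-party reported
 * values (assumptions), so this is a sanity check only. */
#include <stdio.h>

int main(void)
{
    const int    eus          = 128;     /* from the spec list above */
    const int    alus_per_eu  = 8;       /* assumed: 128 EUs x 8 = 1024 FP32 lanes */
    const double clock_ghz    = 1.278;   /* reported M1 GPU clock (assumption) */
    const int    flops_per_op = 2;       /* fused multiply-add = 2 FLOPs per cycle */

    double tflops = eus * alus_per_eu * (double)flops_per_op * clock_ghz / 1000.0;
    printf("estimated FP32 throughput: %.2f TFLOPS\n", tflops);   /* ~2.62 */
    return 0;
}
```

128 EUs × 8 ALUs × 2 FLOPs (FMA) × ~1.28 GHz works out to roughly 2.62 TFLOPS, which lines up with Apple's headline number.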

EDIT:


M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


M2
Second-generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, HEVC (H.265), ProRes, and ProRes RAW

M3 Family discussion here:


M4 Family discussion here:

 

Nothingness

Diamond Member
Jul 3, 2013
Both, although there isn't really a way to find out the actual pipeline length with software methods. People mostly use branch misprediction penalties to estimate the pipeline depth, but recent processors have heavily optimized the latency of branch instructions (by bypassing, for example). In addition, there is really no way to tell the back-end length by looking at the numbers. Also, getting an accurate branch penalty in cycles is very difficult, because even on a random pattern the branch predictor can be partly correct.
Oh yes. Branch predictors have become incredibly complex. I know several engineers working on bpred in my company; they are very bright and come up with crazy ideas to remove every source of misprediction. There's still a lot of active work in that area, and for good reason: if you don't get instructions in time, you can't do anything at all with all your ALUs and data prefetchers :)
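
For anyone curious how the "estimate the pipeline depth from the branch penalty" trick works in practice, here is a rough sketch in C (not from either post above, just an illustration). It times a data-dependent branch over an unpredictable pattern versus a trivially predictable one; the per-branch difference approximates the misprediction penalty, with all the caveats already mentioned: the predictor may partially learn a "random" pattern, and the compiler may turn the branch into a conditional move, so check the generated assembly (or build with -O1).

```c
/* Rough estimate of the branch misprediction penalty, which roughly tracks
 * the front-end depth of the pipeline.  A sketch, not a rigorous benchmark:
 * real measurements need repeated runs, cache warm-up, and a look at the
 * generated code. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)

static double run(const unsigned char *data)
{
    struct timespec t0, t1;
    volatile long sum = 0;              /* volatile so the loop isn't optimized away */
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (long i = 0; i < N; i++) {
        if (data[i] & 1)                /* data-dependent branch */
            sum += data[i];
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);
    return (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
}

int main(void)
{
    unsigned char *rnd = malloc(N), *pred = malloc(N);
    for (long i = 0; i < N; i++) {
        rnd[i]  = (unsigned char)(rand() & 0xff); /* ~50% taken, hard to predict */
        pred[i] = (unsigned char)(i & 1);         /* ~50% taken, trivially predicted */
    }
    double t_rnd = run(rnd), t_pred = run(pred);
    /* Roughly half the random branches mispredict, so divide the extra time
     * by N/2.  Multiplying by the core clock in GHz (e.g. ~3.2 for an M1
     * P-core, an assumption) gives an approximate penalty in cycles. */
    double penalty_ns = (t_rnd - t_pred) / (N / 2.0);
    printf("approx. misprediction penalty: %.2f ns (~%.1f cycles at 3.2 GHz)\n",
           penalty_ns, penalty_ns * 3.2);
    free(rnd);
    free(pred);
    return 0;
}
```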
 

junjie1475

Junior Member
Apr 9, 2024
but note some schedulers now dispatch to several units).
The reason I put the dotted lines in the scheduler blocks is that I didn't have enough time to set up all the test patterns and figure out each scheduler's entry count (only 2 days). But Maynard Handley pointed out that two schedulers that sit side by side can share instructions in certain cases for better load balancing (this has been known since the M1).
 

SpudLobby

Senior member
May 18, 2022
Performance uplift is something that needs further qualification. If you achieve an amazing 30% IPC uplift, but have a 20% clock regression it's nowhere near as impressive. The same is true if AMD has that 40% improvement, but it's only achieved by throwing so much power at the chip that even Intel would blush.

Apple could probably get a 40% uplift if they had a new, better design on a new, better node and they gave it a bigger power budget. Whether all of those stars align is another matter, but I wouldn't want to see them sacrifice on their power efficiency just to get to a 40% number when I'd be more than happy with a 15% gain if they achieved it with even less power than they use now.

Yep, same view. Or even a 10% gain at iso-power would be fine by me at this point, since they've pushed power up too much (even if energy efficiency is still good, on phones those peaks are a bit too high).
 

FlameTail

Diamond Member
Dec 15, 2021
Bombshell Bloomberg report just dropped.
Apple M4 series to arrive in late 2024.


Gurman says that the entire Mac lineup is slated to get the M4 across late 2024 and early 2025
The iMac, low-end 14-inch MacBook Pro, high-end 14-inch MacBook Pro, 16-inch MacBook Pro, and Mac mini machines will be updated with M4 chips first, followed by the 13-inch and 15-inch MacBook Air models in spring 2025, the Mac Studio in mid-2025, and the Mac Pro later in 2025.
Congrats to those who speculated that M4 family will release ~12 months after M3 family. You were right.
Apple is said to be nearing production of the M4 processor, and it is expected to come in at least three main varieties. Chips are codenamed Donan for the low-end, Brava for the mid-tier, and Hidra for the top-end. The Donan chip will be used in the entry-level MacBook Pro, the ‌MacBook Air‌ machines, and the low-end ‌Mac mini‌, and the Brava chips will be used in the higher-end MacBook Pro and the higher-end ‌Mac mini‌.

The Hidra chip is designed for the ‌Mac Pro‌, which suggests it is an "Ultra" or "Extreme" tier chip. As for the ‌Mac Studio‌, Apple is testing versions with an unreleased M3-era chip and a variation of the M4 Brava processor that would presumably be higher tier than the M4 Pro and M4 Max "Brava"
So,
Donan = M4
Brava = M4 Pro / M4 Max (?)
Hidra = M4 Ultra / M4 Extreme (?)
M4 versions of the Mac desktops could support as much as 512GB Unified Memory, which would be a marked jump over the current 192GB limit
But what bus width? 1024 bit (M4 Ultra) or 2048 bit (M4 Extreme)?
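
For reference, peak bandwidth is just bus width times transfer rate. The sketch below runs the numbers; the M1/M2/M3 Max rows match Apple's published figures, while the 1024-bit and 2048-bit rows assume LPDDR5X-8533, which is purely speculative for M4-class parts:

```c
/* Back-of-the-envelope peak bandwidth: GB/s = (bus width in bytes) x (GT/s).
 * The first three rows match Apple's published numbers; the last two are
 * hypothetical M4 Ultra/Extreme configs (bus width and LPDDR5X speed are
 * assumptions, not confirmed specs). */
#include <stdio.h>

static double peak_gbps(int bus_bits, double mt_per_s)
{
    return bus_bits / 8.0 * mt_per_s / 1000.0;
}

int main(void)
{
    printf("M1     (128-bit LPDDR4X-4266): %7.1f GB/s\n", peak_gbps(128,  4266));
    printf("M2     (128-bit LPDDR5-6400):  %7.1f GB/s\n", peak_gbps(128,  6400));
    printf("M3 Max (512-bit LPDDR5-6400):  %7.1f GB/s\n", peak_gbps(512,  6400));
    printf("1024-bit LPDDR5X-8533 (?):     %7.1f GB/s\n", peak_gbps(1024, 8533));
    printf("2048-bit LPDDR5X-8533 (?):     %7.1f GB/s\n", peak_gbps(2048, 8533));
    return 0;
}
```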
 

Eug

Lifer
Mar 11, 2000
Congrats to those who speculated that M4 family will release ~12 months after M3 family. You were right.
It's premature to come to that conclusion. Gurman is often correct with hardware release speculation, but lately he's been repeatedly off with his timing. For example, the OLED iPad Pros have been predicted for several months now and we still don't have them. His new claim is that the iPad Pros are delayed due to OLED manufacturing issues. That may or may not be true, but the bottom line is that the timing predictions haven't panned out.

BTW, the other thing I wonder about is the Mac Studio. Are we expecting an M3 Ultra this June then? And what about the Mac Pro? Then there's the Mac mini, which was never updated with the M3 and M3 Pro, even though those chips have been out for 6 months. The Mac mini was last updated in January 2023, so I'm thinking it may skip the M3 series entirely.

Anyhow, my gut feeling is that the chip cycle will vary, anywhere from 12 to 18 months or so, based on a number of factors, like how fast TSMC can move. As in, there isn't going to be a rigid yearly cycle, at least not for all the chips (e.g. Ultra), but there isn't going to be a rigid 18-month cycle either.
 

mikegg

Golden Member
Jan 30, 2010
Bombshell Bloomberg report just dropped.
Apple M4 series to arrive in late 2024.




Congrats to those who speculated that M4 family will release ~12 months after M3 family. You were right.

So,
Donan = M4
Brava = M4 Pro / M4 Max (?)
Hidra = M4 Ultra / M4 Extreme (?)

But what bus width? 1024 bit (M4 Ultra) or 2048 bit (M4 Extreme)?
I was the first to call the 1 year cadence. It was too obvious.

Unfortunately, many people still can't believe it.

Two of the three M generations have closely followed the A series by about a month. The lone off year was a crazy COVID year. Suddenly, people think it's 18 months.

All Apple executive public interviews have hinted at yearly updates.

But nope. Not good enough for many posters here.
 

Doug S

Platinum Member
Feb 8, 2020
People are missing the most interesting detail here, the top memory config of 512 GB.

If that's true it implies several things.

1) Apple is switching from 12Gb DRAMs to 16Gb DRAMs, since you can't make 512 GB with 12Gb parts
2) If they are going to 16Gb DRAMs that likely means LPDDR5X
3) The step up from 192 GB to 512 GB means either they are using denser packages (so the max memory possible for all models goes up) or M4 finally brings the fabled "Extreme"

If they are going LPDDR5X with Apple Silicon I'll bet that's one of the differences between A18 and A18 Pro, with the regular A18 sticking with LPDDR5 for cost reasons. Which would mean for the very first time that the Pro iPhones will be faster. Probably not a whole lot, but measurable at least in tasks requiring a lot of memory bandwidth (unless there is also a clock rate difference, cache size difference, etc. as well)
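
A quick back-of-the-envelope check of the 12Gb-vs-16Gb point: assuming (purely for illustration) a 2048-bit bus built from 32 x 64-bit LPDDR packages with 8-die stacks, 16Gb dies land on exactly 512 GB while 12Gb dies top out at 384 GB. The package and stack layout here is an assumption, not a known Apple configuration:

```c
/* Quick check of the 12Gb-vs-16Gb reasoning above, under an assumed layout:
 * a hypothetical 2048-bit bus made of 32 x 64-bit LPDDR packages, 8 dies per
 * stack.  Only the die-density comparison is the point. */
#include <stdio.h>

int main(void)
{
    const int    packages     = 32;           /* assumed: 2048-bit bus / 64-bit packages */
    const int    dies_per_pkg = 8;            /* assumed stack height */
    const double die_12gb     = 12.0 / 8.0;   /* 12 Gbit = 1.5 GB per die */
    const double die_16gb     = 16.0 / 8.0;   /* 16 Gbit = 2.0 GB per die */

    printf("max with 12Gb dies: %.0f GB\n", packages * dies_per_pkg * die_12gb); /* 384 GB */
    printf("max with 16Gb dies: %.0f GB\n", packages * dies_per_pkg * die_16gb); /* 512 GB */
    return 0;
}
```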
 

FlameTail

Diamond Member
Dec 15, 2021
1) Apple is switching from 12Gb DRAMs to 16Gb DRAMs, since you can't make 512 GB with 12Gb parts
Doesn't M3 Max already do that? It has 128 GB Max RAM.
3) The step up from 192 GB to 512 GB means either they are using denser packages (so the max memory possible for all models goes up) or M4 finally brings the fabled "Extreme"
Aren't they already using the densest packages possible with LPDDR5, for M3 Max?
 

SpudLobby

Senior member
May 18, 2022
People are missing the most interesting detail here, the top memory config of 512 GB.

If that's true it implies several things.

1) Apple is switching from 12Gb DRAMs to 16Gb DRAMs, since you can't make 512 GB with 12Gb parts
2) If they are going to 16Gb DRAMs that likely means LPDDR5X
3) The step up from 192 GB to 512 GB means either they are using denser packages (so the max memory possible for all models goes up) or M4 finally brings the fabled "Extreme"

If they are going LPDDR5X with Apple Silicon I'll bet that's one of the differences between A18 and A18 Pro, with the regular A18 sticking with LPDDR5 for cost reasons. Which would mean for the very first time that the Pro iPhones will be faster.
They did this with the A16 & LPDDR5-6400 vs the A15 & LPDDR4X-4266 in the iPhone 14 Pro and iPhone 14 respectively — the Pros had 50% more memory bandwidth.

This would be the first time, though, that both get the same number branding while the base models get a “new” SoC and the Pros a Pro version with different RAM, yeah.

Probably not a whole lot, but measurable at least in tasks requiring a lot of memory bandwidth (unless there is also a clock rate difference, cache size difference, etc. as well)
 

Doug S

Platinum Member
Feb 8, 2020
iPhone 16 Pro/16 Pro Max = A18 Pro
iPhone 16/16 Plus = A18 Bionic

This makes perfect sense.

I guess at some point Apple has decided the cost of doing two iPhone SoCs per year is made up for by the savings on the lower end one - using cheaper LPDDR, a cheaper process, and/or a smaller die.

I don't think they will do it the same way every year. This year it seems it makes sense to make two different designs on N3E, so they can't save money on a cheaper process. If they both use the same LPDDR (or the controllers can be made compatible with both, not sure about the compatibility of 5/5X on the controller side, anyone know?) maybe it is the same die but some cores are disabled on the cheaper one. Maybe they bin for slightly higher clock speeds on the more expensive one. We don't really know yet.

When they hit the N2 generation (which I think will happen in 2026 not next year like I see analysts claiming) I predict the non-Pro version will remain on the N3 family, because there will be a cost step up for N2. The following year maybe they bring LPDDR6 for the Pro and Apple Silicon versions and the non-Pro iPhone sticks with 5X. I know LPDDR6's biggest fan @Tigerick will claim Apple is going to adopt it before 2027 but I'll go on record as betting heavily against that lol

Apple is breaking new ground here by splitting the iPhone SoCs; it will be interesting to see how that plays out. My hunch is that they may want to avoid inflationary pressure to raise prices on the base iPhone, so shaving a bit off the SoC/DRAM cost will help them keep its price steady. The fact that their iPhone sales mix goes more Pro every year tells me they have some pricing freedom at the high end, so keeping the base model price the same gives them cover for adding more features/cost on the higher models.
 

Eug

Lifer
Mar 11, 2000
I have a grim feeling that Apple will do the bare minimum 8 GB -> 12 GB, and not 8 GB -> 16 GB.

Unless the extra 4 GB of 16 GB is so crucial to AI...
I’m not convinced Apple will upgrade from 8 GB base this year (since 8 GB still actually works fine for light usage), so if they do increase the base memory to 12 GB, that would be a very pleasant and welcome surprise. That would satisfy the needs of entry level users.

But like I said, I’m not convinced they will do even that, and will just stick with 8 GB for this generation. However, at entry level, few care about AI anyway.
 

Doug S

Platinum Member
Feb 8, 2020
I’m not convinced Apple will upgrade from 8 GB base this year (since 8 GB still actually works fine for light usage), so if they do increase the base memory to 12 GB, that would be a very pleasant and welcome surprise. That would satisfy the needs of entry level users.

But like I said, I’m not convinced they will do even that, and will just stick with 8 GB for this generation. However, at entry level, few care about AI anyway.

Yeah, the people who are buying a new PC for the AI hype aren't going to be buying a base model, whether they're buying a Mac or a PC.
 

dr1337

Senior member
May 25, 2020
Yeah, the people who are buying a new PC for the AI hype aren't going to be buying a base model, whether they're buying a Mac or a PC.
But they spend so much time advertising their NPU and all of their AI support that it looks weird to any borderline-informed customer that Apple holds back so hard on RAM. My entry-level 4 GB 3050 Lenovo, which I sought out to try AI on modern Nvidia GPUs, is faster in Stable Diffusion than any M1 benchmark I've seen. And you know, it just so happens that laptop came with a total of 12 GB of RAM between the GPU and CPU... heh

Some portion of the market definitely buys new personal computers without considering specifications or price. But I think a pretty strong majority of computer buyers aren't interested in purchasing a new machine unless it has new features. If your computer already works and is fast enough, why would you spend more money unless you have A LOT of cash to burn?

Apple is interested in keeping their stake in high-end markets. They purposely pair low RAM and storage in the 'entry model' in order to upsell SKUs.

Considering 8 GB was standard for a MacBook Pro 10 years ago and it's still standard now shows a regression in Apple's consumer friendliness. There is absolutely no logical reason whatsoever to sell a laptop marketed as "Pro" that only comes with 8 GB of total RAM in 2024, unless you're specifically abusing a captive market.

They are literally the 2nd most valuable tech company in the world right now; if anyone has enough money to make 16 GB standard for a PC, it's Apple. They don't need those margins to retain 2nd place, and yet companies a tenth their size don't have as extreme a margin on RAM. There is no logical argument for their insane price gouging and heinously anti-consumer behavior. Rather, the current board/CEO are clawing at everything they can in order to maintain the stock price, and passing that detritus on to the average consumer, who (they hope) is too ignorant to notice.

And then they can come around and announce a new M4 or M5 that comes with a 12/16 GB base model and get tons of praise and headlines. It's really not that hard to see Apple's strategy as we move through the 2020s.
 

Doug S

Platinum Member
Feb 8, 2020
But they spend so much time advertising their NPU and all of their AI support that it looks weird to any borderline-informed customer that Apple holds back so hard on RAM. My entry-level 4 GB 3050 Lenovo, which I sought out to try AI on modern Nvidia GPUs, is faster in Stable Diffusion than any M1 benchmark I've seen. And you know, it just so happens that laptop came with a total of 12 GB of RAM between the GPU and CPU... heh

Some portion of the market definitely buys new personal computers without considering specifications or price. But I think a pretty strong majority of computer buyers aren't interested in purchasing a new machine unless it has new features. If your computer already works and is fast enough, why would you spend more money unless you have A LOT of cash to burn?

Apple is interested in keeping their stake in high-end markets. They purposely pair low RAM and storage in the 'entry model' in order to upsell SKUs.

Considering 8 GB was standard for a MacBook Pro 10 years ago and it's still standard now shows a regression in Apple's consumer friendliness. There is absolutely no logical reason whatsoever to sell a laptop marketed as "Pro" that only comes with 8 GB of total RAM in 2024, unless you're specifically abusing a captive market.

They are literally the 2nd most valuable tech company in the world right now; if anyone has enough money to make 16 GB standard for a PC, it's Apple. They don't need those margins to retain 2nd place, and yet companies a tenth their size don't have as extreme a margin on RAM. There is no logical argument for their insane price gouging and heinously anti-consumer behavior. Rather, the current board/CEO are clawing at everything they can in order to maintain the stock price, and passing that detritus on to the average consumer, who (they hope) is too ignorant to notice.

And then they can come around and announce a new M4 or M5 that comes with a 12/16 GB base model and get tons of praise and headlines. It's really not that hard to see Apple's strategy as we move through the 2020s.

They will provide enough RAM to make their AI run well. Apple gets a lot out of their RAM due to aggressive use of page compression, and Safari isn't as much of a memory hog as Chrome.

Don't discount the impact of being 64-bit only either. Windows systems have to keep both 32-bit and 64-bit libraries resident in memory for most people because they run a mix of 32-bit and 64-bit software. (Off topic, but are all parts of Windows 11 64-bit now? I remember that for years some of the Administrative Tools still had the old NT 4.0 look long after the move to XP and even 7, so I wouldn't be surprised if they're still 32-bit executables.)

That said, I agree Apple needs to move on from 8 GB. They may run a lot better with 8 GB than a PC does, but that's no excuse given that even their low-end Macs are sold as premium products.
 

poke01

Platinum Member
Mar 8, 2022
But they spend so much time advertising their NPU and all of their AI support that it looks weird to any borderline-informed customer that Apple holds back so hard on RAM. My entry-level 4 GB 3050 Lenovo, which I sought out to try AI on modern Nvidia GPUs, is faster in Stable Diffusion than any M1 benchmark I've seen. And you know, it just so happens that laptop came with a total of 12 GB of RAM between the GPU and CPU... heh

Some portion of the market definitely buys new personal computers without considering specifications or price. But I think a pretty strong majority of computer buyers aren't interested in purchasing a new machine unless it has new features. If your computer already works and is fast enough, why would you spend more money unless you have A LOT of cash to burn?

Apple is interested in keeping their stake in high-end markets. They purposely pair low RAM and storage in the 'entry model' in order to upsell SKUs.

Considering 8 GB was standard for a MacBook Pro 10 years ago and it's still standard now shows a regression in Apple's consumer friendliness. There is absolutely no logical reason whatsoever to sell a laptop marketed as "Pro" that only comes with 8 GB of total RAM in 2024, unless you're specifically abusing a captive market.

They are literally the 2nd most valuable tech company in the world right now; if anyone has enough money to make 16 GB standard for a PC, it's Apple. They don't need those margins to retain 2nd place, and yet companies a tenth their size don't have as extreme a margin on RAM. There is no logical argument for their insane price gouging and heinously anti-consumer behavior. Rather, the current board/CEO are clawing at everything they can in order to maintain the stock price, and passing that detritus on to the average consumer, who (they hope) is too ignorant to notice.

And then they can come around and announce a new M4 or M5 that comes with a 12/16 GB base model and get tons of praise and headlines. It's really not that hard to see Apple's strategy as we move through the 2020s.
Notice how they included 16GB as a standard config, instead of a built-to-order option, for both the M3 Air and the M3 MacBook Pros in March. I'm 100% sure that was because of the bad press they got last year over the 8GB drama. Now third-party stores will have discounts on the 16GB models as well.

That's why reviewers have to keep pressuring Apple about this. Confront them.
 

FlameTail

Diamond Member
Dec 15, 2021
Notice how they included 16GB as a standard config, instead of a built-to-order option, for both the M3 Air and the M3 MacBook Pros in March. I'm 100% sure that was because of the bad press they got last year over the 8GB drama. Now third-party stores will have discounts on the 16GB models as well.
Very good. Very good.
Hopefully, the next step is the complete elimination of the 8 GB model.

But then, this news came out recently;


So I am not sure...
 

FlameTail

Diamond Member
Dec 15, 2021
People are missing the most interesting detail here, the top memory config of 512 GB.

3) The step up from 192 GB to 512 GB means either they are using denser packages (so the max memory possible for all models goes up)
That 512 GB max memory is going to be a hit. It will be amazing to run AI models.

Right now, if you want to run AI models, you can buy a consumer GPU like the RTX 4090 - but you are limited by its 24 GB of VRAM. If you want a card with more VRAM, you'll have to shell out thousands or tens of thousands of dollars for a card like the A100 or H100.

Because of this, some people are running AI models on their Macs (an M-series Max or Ultra chip configured with a large amount of RAM). Thanks to the unified memory architecture, the NPU and GPU can also make use of all that RAM.

These people will gladly welcome the beefier Neural Engine and increased RAM capacity of the M4.
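
For a rough sense of why that capacity matters for local AI models: the weights alone need roughly parameter count times bytes per parameter, before any KV cache or activations. The model sizes below are illustrative examples, not claims about what any particular Mac will run:

```c
/* Back-of-the-envelope memory footprint for model weights:
 * 1e9 parameters x bytes per parameter is approximately that many GB,
 * before KV cache, activations, or the rest of the system's needs. */
#include <stdio.h>

static double weights_gb(double params_billions, double bytes_per_param)
{
    return params_billions * bytes_per_param;
}

int main(void)
{
    printf("70B  @ fp16:  %5.0f GB\n", weights_gb(70,  2.0));   /* ~140 GB */
    printf("70B  @ 4-bit: %5.0f GB\n", weights_gb(70,  0.5));   /* ~35 GB  */
    printf("180B @ fp16:  %5.0f GB\n", weights_gb(180, 2.0));   /* ~360 GB */
    printf("400B @ 4-bit: %5.0f GB\n", weights_gb(400, 0.5));   /* ~200 GB */
    return 0;
}
```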