Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
24,176
1,816
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24576 concurrent threads
2.6 Teraflops (rough derivation sketched below)
82 Gigatexels/s
41 Gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from maybe slight clock speed differences occasionally).
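
As a rough sanity check on the GPU figures above, here is a minimal back-of-the-envelope sketch. The ~1.278 GHz clock, 8 ALUs per execution unit, 2 FLOPs per ALU per clock (FMA), 64 texture units and 32 ROPs are assumptions commonly cited for the M1, not numbers from the spec list above:

```swift
// Back-of-the-envelope check of the M1 GPU figures; every input below is assumed.
let gpuCores = 8
let executionUnitsPerCore = 16           // 8 * 16 = 128 EUs, as listed
let alusPerEU = 8                        // assumed
let clockGHz = 1.278                     // assumed; commonly reported M1 GPU clock
let flopsPerALUPerClock = 2.0            // fused multiply-add counted as 2 FLOPs

let eus = gpuCores * executionUnitsPerCore
let alus = eus * alusPerEU
let tflops = Double(alus) * flopsPerALUPerClock * clockGHz / 1_000.0
let texelRate = 64.0 * clockGHz          // assumed 64 texture units
let pixelRate = 32.0 * clockGHz          // assumed 32 ROPs

print("EUs: \(eus), ALUs: \(alus)")                                  // 128 EUs, 1024 ALUs
print("~\(tflops) TFLOPS, ~\(texelRate) Gtexels/s, ~\(pixelRate) Gpixels/s")
// ~2.62 TFLOPS, ~81.8 Gtexels/s, ~40.9 Gpixels/s -- close to the quoted 2.6 / 82 / 41
```

If those assumptions hold, the quoted figures all fall out of 1024 ALUs, 64 texture units and 32 ROPs running at the same clock.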

EDIT:

[Attached screenshot, Oct 18, 2021]

M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


Second Generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s (bandwidth arithmetic sketched below)
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K h.264, h.265 (HEVC), ProRes
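
On the memory line above: a minimal bandwidth sketch, assuming the 100 GB/s figure comes from LPDDR5-6400 on a 128-bit unified memory interface (both the speed grade and the bus width are assumptions, not stated in the list):

```swift
// Peak-bandwidth arithmetic under the assumptions stated above.
let transfersPerSec = 6_400_000_000.0   // LPDDR5-6400 = 6400 MT/s (assumed)
let busWidthBits = 128.0                // assumed interface width
let bandwidthGBs = transfersPerSec * (busWidthBits / 8.0) / 1e9
print("Peak bandwidth ≈ \(bandwidthGBs) GB/s")   // ≈ 102.4 GB/s, quoted as "100 GB/s"
```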

M3 Family discussion here:


M4 Family discussion here:


M5 Family discussion here:

 
Last edited:

jdubs03

Golden Member
Oct 1, 2013
1,493
1,076
136
Interesting clue:
Would be very interesting to not see an M5 Pro in the MBP.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,418
33,400
146
Can you take this Off-Topic waste of time somewhere else?
This.

We have both private messaging and ignore features. Use them if you really want to dig in because someone is wrong on the internet, or gets under your skin.
 

poke01

Diamond Member
Mar 8, 2022
4,861
6,193
106
The problem is that even if I ignore them, other people will reply to the off-topic posts and bloat up the thread.

Best thing is to stay on topic. Please only post about Apple Silicon hardware and software.
 
  • Like
Reactions: mvprod123

poke01

Diamond Member
Mar 8, 2022
4,861
6,193
106
Interesting clue:
Would be very interesting to not see an M5 Pro in the MBP.
This is likely for the Studio. I doubt the Pro SKU is going away.
 
  • Like
Reactions: mvprod123

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,418
33,400
146
The problem is that even if I ignore them, other people will reply to the off-topic posts and bloat up the thread.
Let me know, and I will delete the posts causing issues.
Best thing is to stay on topic. Please only post about Apple Silicon hardware and software.
Agreed. But some compare and contrast is to be expected. Report anything you think goes beyond innocuous.
 
  • Like
Reactions: poke01

coercitiv

Diamond Member
Jan 24, 2014
7,484
17,882
136
Nobody is encoding videos with a phone, nobody is running a NAS with a mobile phone. Nobody is seriously programming through a mobile phone, and definitely nobody is running multiple monitors and multi-tasking on a phone.
What percentage of the world population owns a NAS, does programming, or uses multiple monitors? For most people around the world the smartphone is probably the first computing device and the only one they own, and even in developed countries a large number of people use phones for computing in their daily lives.

Phones are mainly SMS/phones, email clients, cameras, and a mobile video player. It's not like you get desktop-quality YouTube videos on a phone without jumping through some unofficial hoops. You can argue that's the only thing people need computers for.
Yes, the majority of people dictate what the common denominator for personal computing is, even if you and I agree a true computing experience is much, much more than that.

Keep in mind the sole purpose of my post was to counter the original claim:
If so why don't you use your phone as computer?

Why doesn't almost anyone at all?
Lately @fastandfurious6 likes to make hyperbolic and unfounded generalized claims about things and people, then retreats to a contextualized safe position that borders on moving the goalposts.

The original claim was "use phone as computer"; it's as broad as it gets. For the average Joe the smartphone is a computerphone.
 

MS_AT

Senior member
Jul 15, 2024
936
1,856
96
The more interesting question is whether Apple would even consider creating dockable iPhones that support a proper desktop UI. I mean, the majority of people in this thread seem to agree that for the majority of users, iPhone computing power is sufficient. The only problem I see is that doing home accounting etc. on a 5-inch screen is tedious without a mouse. Android at least seems to be moving in that direction, but for Apple it seems it could undercut sales of the cheapest MacBook Airs/Mac minis.

I am still of the opinion that flagship smartphones' compute power gets wasted without a dockable option. A smartphone is basically an internet terminal, regardless of whether you are using a browser or an app that's just a frontend to a server somewhere far away. The heaviest computing is probably done on image processing. So non-flagship phones are usually sufficient for the majority of the market when it comes to computing power.

Just my 2 cents.
 

MerryCherry

Junior Member
Jan 25, 2026
19
28
46
What is the power consumption of the M4 Max in an all-core workload like Cinebench? A Google search is yielding numbers anywhere from 50W to 200W.

[Attached chart: Cinebench performance vs. power for Snapdragon X2]
I am curious where M4 Max will sit in this graph.
 

Covfefe

Member
Jul 23, 2025
107
192
76
What is the power consumption of the M4 Max in an all-core workload like Cinebench? A Google search is yielding numbers anywhere from 50W to 200W.

View attachment 137912
I am curious where M4 Max will sit in this graph.
The M4 Max scored 2015 points in Cinebench 2024 in Notebookcheck's testing. Power consumption is 92 watts in the benchmark and 6 watts at idle, for a delta of 86 watts under load.
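
For placing that on a score-versus-power chart, the points-per-watt works out directly from those numbers (whether a given chart uses package power or the load-minus-idle delta is an assumption about the chart, not a claim from Notebookcheck):

```swift
// Efficiency implied by the figures quoted above.
let score = 2015.0           // Cinebench 2024 multi-core (Notebookcheck)
let packageWatts = 92.0      // measured during the benchmark
let idleWatts = 6.0
let grossPtsPerWatt = score / packageWatts                 // ≈ 21.9 pts/W
let netPtsPerWatt = score / (packageWatts - idleWatts)     // ≈ 23.4 pts/W on the 86 W delta
print(grossPtsPerWatt, netPtsPerWatt)
```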
 

MerryCherry

Junior Member
Jan 25, 2026
19
28
46
Well, that tracks with the notion that Oryon is roughly a generation behind Apple in efficiency, and explains why Apple is conspicuously absent from their graphs.
 

jdubs03

Golden Member
Oct 1, 2013
1,493
1,076
136
Well, that tracks with the notion that Oryon is roughly a generation behind Apple in efficiency, and explains why Apple is conspicuously absent from their graphs.
Edit: thinking about it a bit more, we'll probably see similar efficiency or a slight regression, as the boost in E-core performance at the same power draw may not compensate for the increased power draw of the P-cores.
 
Last edited:

The Hardcard

Senior member
Oct 19, 2021
353
448
136
This is likely for the Studio. I doubt the Pro SKU is going away.
Apple has recently turned aggressively toward software and hardware for the desktop AI workstation market. It seems to me there is a strong possibility that they are making new Studios a top priority. The M5 Max and Ultra coming out in February will capture nearly the entire market.

And scarily, they have massive latitude to set pricing. The M4 Max with 128 GB is currently $3500; they could triple that price and still sell every one, with a waiting list. The 512 GB M3 Ultra is $9500. The M5 version could sell with the price quadrupled.
 

GC2:CS

Member
Jul 6, 2018
40
21
81
In theory you are 100% correct based on older rules, however,

in practice: there is nothing on the horizon (other than big LLMs) that mandates big cores or whatever. This is exactly the same reason why the iPhone 13 is almost as good as the iPhone 17 right now. Apple Silicon is way too good. An M2-equivalent phone chip handles everything a phone is used for. Yes, it's a computer with the same core types as the big M-series chips; however, phones are phones and not used as computers.


tl;dr: since the iPhone 13, iPhone CPUs have been too strong and any upgrades don't really offer much in reality.

Think of how PCIe 5.0 SSDs offer double the sequential speeds but in the real world don't offer many practical benefits - this is something virtually everyone in the hardware space says.

Exactly the same thing applies to phone CPUs - there is nothing to reap the benefits, or it isn't used frequently (just like sequential transfer speeds).




The fact you can't understand what I'm trying to convey is not really my problem. I try to use plain and descriptive language.

I have asked the same question many times: what is on the horizon that mandates more computing power per se (except LLMs)?

The same question applies to phones, desktops, laptops, and any similar computing device, as well as to the specific types of apps that traditionally pushed the boundaries, e.g. video games.



To date, nobody has given a real answer. Responses range from nonsense, to generic theory, to old rules that don't apply anymore, to attacks on character.


Basically: there is nothing on the horizon that mandates more computing power per se (except LLMs). Performance requirements are plateauing: required CPU, GPU, RAM, storage... all plateauing.

Well we can mandate ever more performance for the sake of economics.

If there were not a new revolutionary chip or piece of software every year, what else would people work on?

We need to build those AI space datacenters someday after all.
 

poke01

Diamond Member
Mar 8, 2022
4,861
6,193
106
Just saying, people and reviewers DID complain when Apple used the A15 in the iPhone 14 because it was the same chip used in the iPhone 13. They didn't like the feeling; it felt like a recycled phone.

There is a big leap in efficiency going from an iPhone 13 to an iPhone 17 just because of the SoC. The A19 is really efficient because of its architecture and node advancements. It runs cooler and the battery lasts longer, plus there are the ISP and video encoder/decoder advancements.

Why would anyone want a 2021 SoC in a 2026 phone? Apple would be behind its competitors, and since the iPhone literally finances the Mac chips, it's important that the latest IP is used in the iPhone.

(I'm sorry, this is the last post on this matter from me; it's getting stupidly annoying)
 
  • Like
Reactions: oak8292

name99

Senior member
Sep 11, 2010
690
581
136
This article makes zero sense to me. Feels like word salad.

1. Separate GPU and CPU chiplets have various advantages, but lower cost compared to a monolithic SoC is not one of those advantages.

2. M1 and upwards are ALREADY 2.5D packaged by most understandings of the term! (i.e. chips placed side by side, connected by an RDL [a dense network of traces]).
 
  • Like
Reactions: digitaldreamer

dr1337

Senior member
May 25, 2020
547
834
136
What percentage of the world population owns a NAS, does programming, or uses multiple monitors? For most people around the world the smartphone is probably the first computing device and the only one they own, and even in developed countries a large number of people use phones for computing in their daily lives.
I mean, the PC industry by itself is a $250bn yearly market; there are a lot of people using laptops and desktops, more so than folks who buy new every year. Pushing the goalposts to the entire population of the earth is ignorant IMO. How many people have access to food, clean water, and shelter? Where you live it's probably a lot better than for "most people".

If Apple could replace all Mac sales with the iPhone, they probably would; it would mean higher margins for sure.

Yes, the majority of people dictate what the common denominator for personal computing is, even if you and I agree a true computing experience is much, much more than that.
Nah, you're trying to change what they were saying. It's pretty clear their definition of "computer" in that reply was not the lowest common denominator of the entire world. Nobody who has the words "personal computer" in their lingua franca uses it interchangeably with "smartphone" in terms of device capability. Even at the lowest common denominator, people aren't using phones as "computers".

Otherwise, there'd be a billion videos about how to replace your Mac/PC/laptop/whatever with the latest new phone; I don't see them.
 

Doug S

Diamond Member
Feb 8, 2020
3,832
6,767
136
I’m skeptical about that as well. But it is curious that the identifier for the M5 Pro is missing.

The "it's for the Studio" explanation would make sense, especially if Gurman was right that the MacBook Pro comes in March.

Or maybe the M5 Max MacBook Pro comes out now and the M5 Pro follows a month later. If Apple is capacity constrained like Cook said, it makes sense to release the higher-profit Max SKUs first and lag the Pro.

Though if the new chiplet/packaging stuff gives you more configuration freedom, maybe there is no "Pro" any longer (the "MacBook Pro with M5 Pro" was always an unwieldy name anyway). If you go for the lowest-end/default setup, maybe that's comparable to what was a Pro, but if you beef it up much it is more like a Max or maybe even beyond it (i.e. if you wanted to take the minimum/default GPU core count and max out your CPU core count, it might be close to an Ultra in CPU capability).

We'll have to see how much choice there is for MacBook Pro configuration. I think it might have just two levels each for CPU and GPU, but who knows.
 
  • Like
Reactions: smalM

fkoehler

Senior member
Feb 29, 2008
219
189
116
The more interesting question is whether Apple would even consider creating dockable iPhones that support a proper desktop UI.

We've seen early attempts to do so with much less powerful devices, but it never got very far because at the time almost everyone had a desktop or laptop, which made the hassle not worth the time.

The problem with Apple, or anyone else really, going for a dockable phone or tablet with a decent OS is that their primary concern would be cannibalizing their laptop sales.

If phone sales ever really start to stagnate, like for Apple, I would expect this 'new, original feature' to become the current hotness.
 
  • Like
Reactions: Gideon

fkoehler

Senior member
Feb 29, 2008
219
189
116
This article makes zero sense to me. Feels like word salad.

1. Separate GPU and CPU chiplets have various advantages, but lower cost compared to a monolithic SoC is not one of those advantages.

2. M1 and upwards are ALREADY 2.5D packaged by most understandings of the term! (i.e. chips placed side by side, connected by an RDL [a dense network of traces]).
If you have the option of making one giant die at 85% yield, or two smaller dies at 90-95% yield, at a lower cost per die or combined, why would you not?
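
To put rough numbers on that trade-off, here is a toy sketch using the classic Poisson yield model Y = exp(-D·A). The defect density, wafer cost, and die areas are made-up illustrative values chosen so the yields land near the ~85% and ~92% figures above; it deliberately ignores the extra 2.5D packaging/RDL cost name99 mentions, so it illustrates the yield argument rather than Apple's actual costs:

```swift
import Foundation

// Toy cost-per-good-die comparison; every input is an assumed illustrative number.
let defectDensity = 0.0004       // defects per mm^2 (assumed)
let waferCost = 17_000.0         // USD per wafer (assumed)
let usableWaferArea = 70_000.0   // mm^2, rough figure for a 300 mm wafer

func costPerGoodDie(areaMM2: Double) -> Double {
    let dieYield = exp(-defectDensity * areaMM2)    // Poisson yield model
    let diesPerWafer = usableWaferArea / areaMM2    // ignores edge loss for simplicity
    return waferCost / (diesPerWafer * dieYield)
}

let monolithic = costPerGoodDie(areaMM2: 400)        // one big die, ~85% yield
let twoChiplets = 2 * costPerGoodDie(areaMM2: 200)   // two half-size dies, ~92% yield each
print("Monolithic ≈ $\(Int(monolithic)), two chiplets ≈ $\(Int(twoChiplets)) before packaging")
// Roughly $114 vs. $105 with these inputs -- the gap is real but shrinks once packaging cost is added back.
```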