Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
23,583
996
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24576 concurrent threads
2.6 Teraflops
82 Gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from maybe slight clock speed differences occasionally).

EDIT:


M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


M2
Second-generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, HEVC (H.265), and ProRes
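
The 100 GB/s figure falls straight out of the interface width and transfer rate. A quick back-of-the-envelope sketch, assuming a 128-bit LPDDR5-6400 interface (the bus width isn't stated above, so treat it as an assumption):

```swift
// Back-of-the-envelope check of the "100 GB/s" unified-memory figure.
// Assumption (not stated above): 128-bit LPDDR5 interface running at 6400 MT/s.

let busWidthBits = 128.0
let transfersPerSecond = 6.4e9          // LPDDR5-6400: 6.4 billion transfers per pin per second

let bytesPerSecond = (busWidthBits / 8.0) * transfersPerSecond
print(bytesPerSecond / 1e9)             // 102.4 GB/s, quoted above as "100 GB/s"
```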

M3 Family discussion here:

 
Last edited:

name99

Senior member
Sep 11, 2010
404
303
136
M1 using an SLC cache? It's gonna wear out eventually. What then? Does the device go into a low performance mode? Or does it stop working altogether?

Memory bandwidth issues may get solved with four channels of DDR5, though that's likely a year or two into the future.

If Parallels can deliver 70-80% of native performance for x86 code execution, that could conceivably meet or exceed the highest level of performance delivered by contemporary high-end x86 CPUs.

Apple seems to be content with targeting the content creation market. The x86 gaming market is safe for now. Intel and AMD can breathe easy. The only other company with the remotest chance of challenging Apple in performance is Nvidia with its ARM IP, but that could take maybe five years or longer.
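
Rough arithmetic for that translation claim; the scores below are purely hypothetical placeholders, not measurements:

```swift
// If translation retains a fraction r of native speed, translated x86 code matches or beats
// a given x86 CPU whenever the native-performance lead is at least 1/r.
// Both scores are hypothetical placeholders.

let m1NativeScore = 2000.0      // hypothetical native (ARM) benchmark score
let x86Score = 1400.0           // hypothetical high-end x86 benchmark score

for r in [0.70, 0.80] {
    let translated = m1NativeScore * r
    print("retain \(r): translated ≈ \(translated), needs native lead ≥ \(1.0 / r), actual lead \(m1NativeScore / x86Score)")
}
```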

SLC cache = System Level Cache
Something like an L3 cache but optimized for
(a) saving power by allowing the GPU frequently to avoid accessing DRAM
(b) allowing all the different accelerators to exchange data with each other and the CPUs rapidly.
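
A toy sketch of the concept (nothing like Apple's actual implementation, just the idea of a shared last-level cache sitting in front of DRAM):

```swift
// Toy model: one cache shared by CPU, GPU and accelerator clients, so data produced by one
// block can be consumed by another without a second trip to DRAM. Purely illustrative.

final class SystemLevelCache {
    private var lines = Set<Int>()      // cached line addresses
    private(set) var dramAccesses = 0   // misses that had to go out to DRAM

    func access(line: Int) {
        if !lines.contains(line) {
            dramAccesses += 1           // miss: fetch from DRAM (the power-hungry path)
            lines.insert(line)
        }                               // hit: served on-die, DRAM stays idle
    }
}

let slc = SystemLevelCache()
(0..<64).forEach { slc.access(line: $0) }   // "CPU" touches a 64-line buffer
(0..<64).forEach { slc.access(line: $0) }   // "GPU" then reads the same buffer
print(slc.dramAccesses)                     // 64, not 128: the second pass hits in the SLC
```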
 

mikegg

Golden Member
Jan 30, 2010
1,740
406
136
Right from when ARM Macs were announced, iOS apps on Macs never made any sense to me; it seems like it would be the same kind of garbage experience Apple is famous for avoiding.

I wonder who, high up in Apple, thought this was an idea worth pushing.
You have no idea what you're talking about.

With Xcode, you can now design apps that change their layout based on the device and screen size, like responsive designs on the mobile web. So a single app can be updated to look different on iOS, iPadOS, and macOS.

Of course on day 1, apps don't have this support. But app developers will build this into their apps in the future.
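
For example, a single SwiftUI view can branch on platform and size class. A minimal sketch of the idea (the view names here are made up):

```swift
import SwiftUI

// One view that adapts its layout to the platform and available width.
struct AdaptiveRootView: View {
    #if !os(macOS)
    @Environment(\.horizontalSizeClass) private var sizeClass
    #endif

    var body: some View {
        #if os(macOS)
        HStack { SidebarList(); DetailPane() }      // Mac: roomy side-by-side layout
        #else
        if sizeClass == .regular {
            HStack { SidebarList(); DetailPane() }  // iPad / wide layouts
        } else {
            VStack { SidebarList(); DetailPane() }  // iPhone: stacked layout
        }
        #endif
    }
}

struct SidebarList: View { var body: some View { List(0..<3) { Text("Item \($0)") } } }
struct DetailPane: View { var body: some View { Text("Detail") } }
```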
 

name99

Senior member
Sep 11, 2010
404
303
136
I pretty much agree fully there. Very impressed with the GPU. I will also say I am extremely impressed with Apple's x86 emulator; that's a FAST emulator. Clearly a tremendous amount of engineering resources went into that emulator, and clearly a tremendous amount of resources went into the initial software offerings for the Apple ARM platform in general. I do wonder if that same high quality and super-fast code will be the norm, though. Apple obviously had tremendous reason to make sure the debut software was tip-top and super optimized.

Really the only thing that "annoyed" me about the review was the Speedometer benchmark. Which browsers were they using? I feel like the only fair test would have been to use the same browser on every platform.

The context of his post is in reference to the uncore power, not necessarily to the connectivity and I/O. The answer is, the M1 has very little I/O. No PCIe lanes, no SATA. It's got a single display PHY (well, not counting the Thunderbolt monitor muxing), a handful of USB/Thunderbolt PHYs, Ethernet, a single audio port, and that's about it. Desktop SoCs from Intel and AMD have 4x the I/O going on, which contributes heavily to "uncore" power consumption, especially the memory controller needing to drive a fat 128-bit bus across a PCB. These things just cannot be overcome while still retaining the characteristics of a typical PC notebook.

But again that’s not a sensible way to look at the issue. He’s making assumptions about the energy usage of these lanes and IO functionality (especially when idle) that may be valid for PCs but are not necessarily valid for M1 machines.

I just don’t understand the logic here. I write posts about how DRAM on the SoC saves a lot of power and get told this is nonsense, that the only reason Apple is doing this is because they want “control”. Then the same people say sure, Apple can have lower power, because they don’t have to drive that power-hungry memory bus!

Yes, there’s no legacy IO. Yes, this saves power (and helps with reliability and some performance). Why do you think some of us have been grumbling about Intel legacy IO for years?
Intel could have phased out this junk years ago. They did not, and Apple did. That’s why Apple gets to look good and x86 does not.
That’s how “being the company that makes the change” WORKS!
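
One rough way to see why the physical link matters: interface power is roughly bandwidth times energy per bit, and energy per bit grows with how far the signal has to travel. The numbers below are placeholders, not measurements; only the shape of the arithmetic is the point.

```swift
// Interface power ≈ bandwidth × energy per bit. All numbers are illustrative placeholders.

func interfacePowerWatts(gigabytesPerSecond: Double, picojoulesPerBit: Double) -> Double {
    let bitsPerSecond = gigabytesPerSecond * 8e9
    return bitsPerSecond * picojoulesPerBit * 1e-12
}

let traffic = 50.0  // GB/s of sustained memory traffic (placeholder)
print(interfacePowerWatts(gigabytesPerSecond: traffic, picojoulesPerBit: 5))   // short on-package link: ~2 W
print(interfacePowerWatts(gigabytesPerSecond: traffic, picojoulesPerBit: 18))  // long trace to a socketed DIMM: ~7 W
```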
 
  • Like
Reactions: Tlh97 and Qwertilot

Staples

Diamond Member
Oct 28, 2001
4,952
119
106
This open-vs-closed ecosystem debate isn't as important for the end-user as you think it is.
Well kind of on a related note, Apple only supports updates for ~7 years and then you can't update the OS. Then once you are two versions behind on the OS, you can't install anything from the app store.
 

mikegg

Golden Member
Jan 30, 2010
1,740
406
136
Well kind of on a related note, Apple only supports updates for ~7 years and then you can't update the OS.
How many years do Dell/HP/Google/Samsung support phones and laptops in terms of updates?

If I buy a used Samsung Android phone from 2012, do I get the latest OS updates from Google/Samsung?

Then once you are two versions behind on the OS, you can't install anything from the app store.
So basically, you have 9 years to install things from the App Store. 7 years of support + 2 years of new updates. Not bad.
 

Entropyq3

Junior Member
Jan 24, 2005
22
22
81
If you switch to a closed platform, you lose your "nerd" designation.
Yup. You are promoted to "geek". :cool:

Seriously, platform choice is a game of compromises. At this point in time, the largest personal computing platform, by a huge margin, is Android. This "PC vs. Mac" bickering is over 35 years old by now; it’s the stuff old geezers argue about on park benches as they feed the pigeons. :)
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,743
3,075
136
Please quote where I said the M1 beats everything in everything. I'll wait.

And now it's most definitely not true for laptops. The $1,000 Macbook Air's CPU is faster than any Windows laptop, period. And has way longer battery life.

and if you want to argue, I'll point you at
A period at the end of the sentence means that what's said in the sentence is definitive and admits no exceptions.
 

mikegg

Golden Member
Jan 30, 2010
1,740
406
136

senttoschool said:
Please quote where I said the M1 beats everything in everything. I'll wait.


senttoschool said:
And now it's most definitely not true for laptops. The $1,000 Macbook Air's CPU is faster than any Windows laptop, period. And has way longer battery life.

Uh... everything I said here is true. Period. I do think you're confused by the definition of "faster than". Here's my definition which I have stated multiple times already in this thread.


Wait what? Example? Didn't benchmarks prove that the M1 has the fastest overall CPU performance out of any laptop chip?
Ok, yea I'm aware that it doesn't win every multi-core benchmark. That doesn't mean it isn't the fastest laptop chip overall. It clearly *is* the fastest laptop chip overall in the world for the vast majority of people. It only loses in a few multi-core benchmarks.

So I'm still waiting for you to quote me on where I said the M1 is faster than everything in everything, which no reasonable person thinks is true and I'm quite reasonable.
 
Last edited:

Antey

Member
Jul 4, 2019
105
153
116
Apple's graphics architecture is a bit of a mystery, but it seems that each Apple GPU core has the same arrangement as an Intel Xe subslice: each core has 16 execution units with 8 FP32 ops per clock each, 8 TMUs, and 4 ROPs. It's like each core is an Intel subslice. I would like to know how an Intel Xe with 1 slice (8 subslices and 128 EUs), or some other interesting combination, fares against the Apple M1 GPU. That high-bandwidth system cache and L3 cache (I don't know the size) is also a big advantage. Tiger Lake could have had a big pool of L3 cache, but it only has 3.8 MB of it (and I read it also uses the LLC, so I guess bandwidth isn't much of a problem). Even if we don't know much about it, that GPU design is perfectly balanced, 10/10.
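
Those per-core numbers line up with the figures in the first post if you assume a GPU clock of roughly 1.28 GHz (an assumption; the clock isn't stated in this thread):

```swift
// Cross-check of the first-post M1 GPU figures from the per-core layout above.
// Assumption: ~1.28 GHz GPU clock.

let cores = 8.0
let eusPerCore = 16.0, fp32LanesPerEU = 8.0
let tmusPerCore = 8.0, ropsPerCore = 4.0
let clockGHz = 1.278

let fp32Lanes = cores * eusPerCore * fp32LanesPerEU   // 1024 FP32 lanes
print(fp32Lanes * 2 * clockGHz / 1000)                // FMA = 2 FLOPs/lane/clock → ≈ 2.6 TFLOPS
print(cores * tmusPerCore * clockGHz)                 // ≈ 81.8 Gtexels/s ("82")
print(cores * ropsPerCore * clockGHz)                 // ≈ 40.9 Gpixels/s ("41")
```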
 
Last edited:

KompuKare

Golden Member
Jul 28, 2009
1,012
923
136
This "PC vs. Mac" bickering is over 35 years old by now, it’s the stuff old geezers argue about on park benches as they feed the pidgeons. :)
But 35 years ago Apple didn't lock down everything and build hardware with purposeful built-in obsolescence.
Yes, there were trade-offs in '80s and '90s Macs versus PCs, but they weren't the same as today.
Hardware-wise, Mac IIs and early Power Macs were certainly repairable and upgradeable.
And software-wise, System 7 was years ahead of Microsoft.
These days? Far too locked for me.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
Way to cherry-pick a benchmark. 99.99% of people don't ever use Cinema4D.

100% of people don't use GB5 (to do any meaningful work). At least Cinema4D is good for something.

Here's my definition which I have stated multiple times already in this thread.

Oh good, now you get to create your own definitions of common phraseology as well?

The M1 is an impressive chip. Bravo to Apple for rolling this out - it's a shame we can't pick a different operating system for it (Linux) or get different configurations. That's Apple for you. Sadly some users seem intent upon making people hate the M1 by misrepresenting it or simply being obstreperous.
 

xblax

Member
Feb 20, 2017
54
70
61
What are some things nerds like to do on Android that they can't do on iOS? Same question for Windows and MacOS.

Install (open source) apps from third-party app stores (such as F-Droid), have the phone rooted to allow various operations that are "disallowed" by Google / phone vendors, install custom ROMs such as LineageOS. I am personally doing all that, but of course the majority of people are not into that stuff.

iPhones are closed in more common areas too: no microSD card slot for storage expansion, and no NFC host card emulation for apps (Apple reserves that NFC functionality for its own Wallet app).

Instead of Windows and macOS, I tend to use Linux ;-) I think the Apple M1 is absolutely interesting and amazing technology, but the constraints that come with it are an absolute deal-breaker for people like me.
 
Apr 30, 2020
68
170
76
But again that’s not a sensible way to look at the issue. He’s making assumptions about the energy usage of these lanes and IO functionality (especially when idle) that may be valid for PCs but are not necessarily valid for M1 machines.

I just don’t understand the logic here. I write posts about how DRAM on the SoC saves a lot of power and get told this is nonsense, that the only reason Apple is doing this is because they want “control”. Then the same people say sure, Apple can have lower power, because they don’t have to drive that power-hungry memory bus!

Yes, there’s no legacy IO. Yes, this saves power (and helps with reliability and some performance). Why do you think some of us have been grumbling about Intel legacy IO for years?
Intel could have phased out this junk years ago. They did not, and Apple did. That’s why Apple gets to look good and x86 does not.
That’s how “being the company that makes the change” WORKS!
I/O costs power, that's a fact. Powering large DRAM buses over large distances requires a lot of power. That is a fact. It's not "nonsense" just because you don't like hearing it. A wide assortment of I/O isn't "legacy junk" just because you don't use it.

  • - PCIe is not "legacy junk". No PCIe = no NVMe drives, no dGPUs, no ExpressCard, no PCIe-over-Thunderbolt passthrough to external enclosures.
  • - No NVMe or SATA = no user-serviceable internal storage options. No way to expand internal storage. You are stuck with what you got. Thanks to the integrated SSD controller, if something happens to your SoC (say a CPU/GPU/DRAM failure), then all of your data is basically as good as gone unless you find someone capable of desoldering the NAND and somehow extracting the raw data and reassembling it into something coherent.
  • - Integrated RAM = not possible to upgrade RAM at all. Stuck with what you got. Many PC notebooks today do have at least some RAM soldered, but almost all offer at least one slot to expand.
  • - No integrated HD audio (besides the one 3.5 mm port) means I'm forced to use external USB audio devices if I want to do basic line-in recording or use a wired mic.
  • - No integrated display PHYs = I'm forced to use dongles and USB-C/Thunderbolt hubs if I want to connect to multiple monitors.
  • - Limited I/O ports = piles of pricey dongles and adapters to hook up your peripherals.

Sure this stuff works for users who treat their PCs like phones, and many existing Apple users. But all of that is a non-starter for many, many PC users.

You keep acting like this is an attack on your person. I'm pointing out that Apple has made some serious sacrifices and design compromises to get the M1's power usage down and performance up. I'm not saying those compromises are "wrong", but they are compromises. You can't just handwave it all away and say it's a non-issue when you're trying to draw comparisons to Intel and AMD systems whose ecosystems and customers rely on that I/O.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
I/O costs power, that's a fact. Powering large DRAM buses over large distances requires a lot of power. That is a fact. It's not "nonsense" just because you don't like hearing it. A wide assortment of I/O isn't "legacy junk" just because you don't use it.

  • - PCIe is not "legacy junk". No PCIe = no NVMe drives, no dGPUs, no ExpressCard, no PCIe-over-Thunderbolt passthrough to external enclosures.
  • - No NVMe or SATA = no user-serviceable internal storage options. No way to expand internal storage. You are stuck with what you got. Thanks to the integrated SSD controller, if something happens to your SoC (say a CPU/GPU/DRAM failure), then all of your data is basically as good as gone unless you find someone capable of desoldering the NAND and somehow extracting the raw data and reassembling it into something coherent.
  • - Integrated RAM = not possible to upgrade RAM at all. Stuck with what you got. Many PC notebooks today do have at least some RAM soldered, but almost all offer at least one slot to expand.
  • - No integrated HD audio (besides the one 3.5 mm port) means I'm forced to use external USB audio devices if I want to do basic line-in recording or use a wired mic.
  • - No integrated display PHYs = I'm forced to use dongles and USB-C/Thunderbolt hubs if I want to connect to multiple monitors.
  • - Limited I/O ports = piles of pricey dongles and adapters to hook up your peripherals.

Sure this stuff works for users who treat their PCs like phones, and many existing Apple users. But all of that is a non-starter for many, many PC users.

You keep acting like this is an attack on your person. I'm pointing out that Apple has made some serious sacrifices and design compromises to get the M1's power usage down and performance up. I'm not saying those compromises are "wrong", but they are compromises. You can't just handwave it all away and say it's a non-issue when you're trying to draw comparisons to Intel and AMD systems whose ecosystems and customers rely on that I/O.
Dude, you're greatly mistaken.

I'm 100% sure there is / will be an advanced, superior-in-every-way, 3-times-faster proprietary Apple solution for all these problems, for only a couple hundred dollars apiece.
 
  • Like
Reactions: MangoX and Tlh97

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
It's been a while since I used Android and Windows. How much customization do you really get nowadays? Isn't Google trending towards a standard Android experience anyway? And Google is trying its hardest to close any loopholes around its 30% cut of app revenue.

What are some things nerds like to do on Android that they can't do on iOS? Same question for Windows and MacOS.

It's not only about the OS. I can install Linux on my PC if for some reason MS goes completely bonkers.

With Android it depends on whether the phone can be rooted, but even without root you can install different app stores or APKs directly. OK, I assume that's possible on macOS too but not on iOS/iPhones; but with this change you can't install a different OS on the new MacBooks if Apple goes bonkers or you have a reason to trust them even less (see more below).

This open-vs-closed ecosystem debate isn't as important for the end-user as you think it is.

True, and at this point it doesn't yet matter that much. I'm thinking long term, in terms of abandoning free speech and moving to censorship, which is getting worse and worse with the increased "political correctness" movements. I can't even explicitly write here what I mean or else I get at least a temp ban. Big corps will more and more censor the "uncommon opinion". It's already forbidden in apps themselves. Next, developers that voice their uncommon opinion will get their keys revoked (a couple of years max), and then at some point they'll lock users who "voice" an unpopular opinion out of their device(s). Basically more and more like China.
 
  • Like
Reactions: Thunder 57