Discussion: Apple Silicon SoC thread


Eug

Lifer
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24,576 concurrent threads
2.6 Teraflops (see the rough FLOPS check below)
82 Gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).
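
As a rough sanity check on the 2.6-teraflop figure above, peak FP32 throughput is usually estimated as ALUs × 2 operations per FMA × clock. A minimal sketch, assuming 8 FP32 ALUs per execution unit and a ~1.28 GHz GPU clock (both assumptions come from third-party analysis, not Apple's spec sheet):

```python
# Back-of-envelope peak FP32 estimate for the 8-core M1 GPU.
# Assumed (not from Apple's spec sheet): 8 FP32 ALUs per execution
# unit and a ~1.278 GHz GPU clock.

execution_units = 128      # Apple's figure for the 8-core GPU
alus_per_eu = 8            # assumption
clock_ghz = 1.278          # assumption, commonly reported

tflops = execution_units * alus_per_eu * 2 * clock_ghz / 1000  # 2 ops per FMA
print(f"~{tflops:.2f} TFLOPS")  # ≈ 2.62, close to the quoted 2.6 TFLOPS
```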

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as they do with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from occasional slight clock speed differences).

EDIT:


M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


M2
Second-generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, HEVC, and ProRes

M3 Family discussion here:

 

Doug S

Platinum Member
That's really irrelevant for getting native Linux running optimally.

The stumbling block is creating good reverse engineered drivers for all the custom hardware, or you end up leaving a lot of functionality behind.


It depends on what the real goal of this is. I think most of these people are doing it for intellectual curiosity, not trying to produce the most performant possible Linux PC. Is it really necessary to have maximally accelerated GPU drivers, for instance? Or take advantage of every method Apple provides for CPU power saving? They may be able to reverse engineer some of that, while other bits remain a mystery, and end up with something highly usable that falls a bit short of matching the full performance and efficiency of the same Mac running macOS. If it gets 90 or 95% of the way there, I don't think anyone would consider the effort to be less than a rousing success.

It sounds like at least some of the undocumented Apple-exclusive hardware (aside from the CPU & GPU) may not be changing much year to year, so they won't be wasting time reverse engineering, e.g., a new UART every year. Stuff like wifi and Bluetooth comes from Broadcom and is presumably the same as, or similar enough to, Broadcom devices that already have Linux drivers that it isn't any problem. I wouldn't be surprised if Apple licenses IP blocks for stuff like PCIe and USB rather than rolling their own; if so, there will be Linux drivers to be had for those as well. So you probably don't leave much functionality behind, just a certain (hopefully modest) percentage of performance and power efficiency.

It just isn't going to be worth the effort to wring every last bit of performance and efficiency out of the CPU and GPU. The 80/20 or 90/10 rule would probably make that a real slog, and then you'd likely have to do it all over again every year or two. Maybe the people doing this will stay interested long enough, or be able to derive enough information from examining macOS drivers, to reach 100%, but if they don't, that's fine.
 

Eug

Lifer
Is it really necessary to have maximally accelerated GPU drivers, for instance?
I gather that the Asahi Linux distro is doing the graphics entirely in software. That is likely a big problem if you actually want to use the thing. Even just visiting a modern web page would probably be very irritating, esp. if you're using a 4K monitor or something.

Similarly, my 2010 Core i7-870 iMac is limited to OS X 10.13 High Sierra. Technically you can install later OSes, but there is no graphics acceleration for its Radeon HD 5750, which means the machine is basically unusable. It works, but everything on screen is sloowwww.

In contrast, UI speed on my 2008 aluminum unibody 2.0 GHz Core 2 Duo MacBook is just fine even running 10.15 Catalina, because there is proper hardware acceleration utilizing its ancient GeForce 9400M GPU. (That machine is supposed to max out at 10.11 El Capitan.)

Mind you, now I'm curious to see just how bad Linux UI speed is on M1. :D M1 is fast, and Linux has less overhead so maybe it could be somewhat tolerable.

Or take advantage of every method Apple provides for CPU power saving?
I dunno what they've done with fan control, but on some machines, if you don't control the fans explicitly, they either ramp up to maximum and just stay there, or ramp up very quickly and generally run much higher than necessary. The good news, though, is that the M1 runs cool to begin with, and the M1 MacBook Air, for example, is fanless, so fan control doesn't apply.
 

moinmoin

Diamond Member
Even just visiting a modern web page would probably be very irritating, esp. if you're using a 4K monitor or something.
Honestly, it's not that bad. I keep switching between enabling and disabling hardware acceleration in my main Chromium-based browser. Disabling it obviously disables WebGL, so stuff like smooth zooming in Google Maps stops working (and that particular use case really is awkward without it), but most bells and whistles still work fine in software and even appear to use less battery in general use (which was quite a surprise to me).
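
For anyone who wants to try the same comparison, a minimal sketch is below. It assumes a `chromium` binary on your PATH (swap in whatever Chromium-based browser you actually use); `--disable-gpu` is a standard Chromium switch that forces the software rendering path, and chrome://gpu shows which features end up hardware-accelerated.

```python
#!/usr/bin/env python3
"""Rough sketch: launch a Chromium-based browser with GPU acceleration
forced off, to compare software rendering against the default path.
Assumes a `chromium` binary on PATH; adjust BROWSER for your setup."""
import subprocess

BROWSER = "chromium"  # assumption: binary name varies by distro/browser


def launch(software_only: bool) -> subprocess.Popen:
    """Start the browser, optionally with GPU acceleration disabled."""
    args = [BROWSER]
    if software_only:
        # Standard Chromium switch: the GPU process is not used, and
        # rendering/compositing fall back to the software path.
        args.append("--disable-gpu")
    return subprocess.Popen(args)


if __name__ == "__main__":
    launch(software_only=True)
    # Open chrome://gpu afterwards to confirm which features report
    # "Hardware accelerated" vs. "Software only".
```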

In general, lack of hardware acceleration is obviously not optimal, but in my experience the software fallbacks are pretty well optimized under Linux as a result.

The Asahi Linux project is advancing very well considering how young the chip, the platform, and the effort all are, and (thanks to all improvements being contributed directly to the mainline kernel) the turnaround time is incredibly short for completely unsupported hardware. And I'm sure people with a scientific interest in all the special blocks Apple uses will keep the effort going even if full support is never achieved.
 

Ajay

Lifer
That's really irrelevant for getting native Linux running optimally.

The stumbling block is creating good reverse engineered drivers for all the custom hardware, or you end up leaving a lot of functionality behind.
So, apparently I phrased my comments incorrectly. I would be able to experiment with a *NIX environment that can run some Linux programs and commands as well (at least with the right libraries installed). So long as I can bring up a bash shell, I'd be pretty happy. If I can do some macOS admin stuff via the CLI, that would be cool. But this would just be bonus stuff. The idea is to get used to using macOS again, since I have seen more job opps listing macOS, and to learn how to code for iOS devices. This is just an idea rattling around in my head ATM.

Sorry all for the derail.
 

Eug

Lifer
Honestly, it's not that bad. I keep switching between enabling and disabling hardware acceleration in my main Chromium-based browser. Disabling it obviously disables WebGL, so stuff like smooth zooming in Google Maps stops working (and that particular use case really is awkward without it), but most bells and whistles still work fine in software and even appear to use less battery in general use (which was quite a surprise to me).
Well, as you say, it depends on the web page.

But then again, if you aren't actually going to use it as a primary surfing machine, it may not matter. Plus, some of the important stuff, like banking websites, would usually be OK, since their focus is security, not wowing you with multimedia effects.

So, apparently I phrased my comments incorrectly. I would be able to experiment with a *NIX environment that can run some Linux programs and commands as well (at least with the right libraries installed). So long as I can bring up a bash shell, I'd be pretty happy. If I can do some macOS admin stuff via the CLI, that would be cool. But this would just be bonus stuff. The idea is to get used to using macOS again, since I have seen more job opps listing macOS, and to learn how to code for iOS devices. This is just an idea rattling around in my head ATM.
OK, yeah that makes more sense.

Since I'm not a *nix jockey, I was partially looking at it as more of a Linux-on-the-desktop scenario. I can tell you that while Ubuntu worked on my old 2008 white MacBook4,1, I hated it because of weird incompatibilities and various bugs, plus a lot of drivers were sub-par or simply unavailable. I also tried CloudReady ChromeOS and that was a problem too. Both of those are supposedly excellent choices for old MacBooks, but if that is what they call "excellent", I don't know what less than excellent would look like. :oops:
 

Heartbreaker

Diamond Member
So, apparently I phrased my comments incorrectly. I would be able to experiment with a *NIX environment that can run some Linux programs and commands as well (at least with the right libraries installed). So long as I can bring up a bash shell, I'd be pretty happy. If I can do some macOS admin stuff via the CLI, that would be cool. But this would just be bonus stuff. The idea is to get used to using macOS again, since I have seen more job opps listing macOS, and to learn how to code for iOS devices. This is just an idea rattling around in my head ATM.

But you can probably do that just as well, if not better in a Linux VM on the native OS, and of course that gives you access to the native OS without dual booting.
 

Ajay

Lifer
But you can probably do that just as well, if not better in a Linux VM on the native OS, and of course that gives you access to the native OS without dual booting.
I am currently running Pop!_OS in VMware Workstation. I was thinking about dual booting (to force myself to learn more). Anyway, my main point with regard to buying a Mac mini isn't about *NIX stuff, so I'm going to stop talking about Linux in this thread. No offense to anyone.
 

Doug S

Platinum Member
I gather that the Asahi Linux distro is doing the graphics entirely in software. That is likely a big problem if you actually want to use the thing.


The M1 is apparently fast enough to handle that in software and still be faster than GPU acceleration on lesser ARM chipsets like Rockchip. Hardly a "big problem", unless your goal is to run CAD software or play demanding games. For a desktop, web browser, office apps etc. it would be fine.

One of the people involved in the work bringing GPU acceleration to the M1 is already running Debian Linux (plus some patches that haven't been upstreamed yet) as her desktop.
 

Heartbreaker

Diamond Member
The M1 is apparently fast enough to handle that in software and still be faster than GPU acceleration on lesser ARM chipsets like Rockchip. Hardly a "big problem", unless your goal is to run CAD software or play demanding games. For a desktop, web browser, office apps etc. it would be fine.

And again, it would do just as well, or more likely better, just running in a VM.

The main use case for running Linux natively on an M1 seems to be saying that you have run Linux natively on an M1.
 
Jul 27, 2020
It would be sweetly ironic if the moment the Asahi Linux developers get everything working on M1, Apple sends them a cease and desist letter, arguing that buying their M1 hardware binds them into some agreement precluding reverse engineering of said hardware's proprietary trade secrets. Something like what's being discussed here: Reverse Engineering of Trade Secrets: An important issue you should consider when setting up your innovation protection strategy - Trade Secrets Trends (crowelltradesecretstrends.com)

But if it happens, it would be an interesting lawsuit to follow, much like Epic vs. Apple, except in this case, the threat of overwhelming legal fees may dissuade the Asahi Linux contributors from any further M1 related work. I know I must be wildly imagining stuff but if this comes to pass, can Apple force the Linux Foundation to remove all M1 related patches from the kernel?
 

Mopetar

Diamond Member
It would be sweetly ironic if the moment the Asahi Linux developers get everything working on M1, Apple sends them a cease and desist letter, arguing that buying their M1 hardware binds them into some agreement precluding reverse engineering of said hardware's proprietary trade secrets. Something like what's being discussed here: Reverse Engineering of Trade Secrets: An important issue you should consider when setting up your innovation protection strategy - Trade Secrets Trends (crowelltradesecretstrends.com)

But if it happens, it would be an interesting lawsuit to follow, much like Epic vs. Apple, except in this case, the threat of overwhelming legal fees may dissuade the Asahi Linux contributors from any further M1 related work. I know I must be wildly imagining stuff but if this comes to pass, can Apple force the Linux Foundation to remove all M1 related patches from the kernel?

Seems unlikely. I guess I don't know where the team is located, but US case law on reverse engineering is pretty clear.

Apple won't care if someone runs Linux on Mac hardware. They make money either way and the people who want to run Linux on an M1 are a small group that probably includes people who wouldn't otherwise buy Apple hardware. Apple just doesn't want to bear the cost of providing any kind of support to anyone running anything other than an Apple OS on their hardware.
 
Jul 27, 2020
Apple won't care if someone runs Linux on Mac hardware. They make money either way...
One could argue that they stand to lose money they would otherwise have made from the sale of macOS software, since the Linux effort prevents such sales from being realized. I think Sony similarly blocked Linux on the PS3 because it was preventing thousands of their hardware units from being used for the intended purpose of boosting their software sales. Yes, I'm aware that preventing jailbreak was likely a much bigger factor in Sony's decision.
 

Schmide

Diamond Member
The M1 is apparently fast enough to handle that in software and still be faster than GPU acceleration on lesser ARM chipsets like Rockchip.

This sounds extremely dubious to me. Mali GPUs, while not stellar, are hardly slouches and should run circles around any CPU doing the rendering in software. A high-end CPU gets about 5 GFlops per core. Even doubling that for the M1 being a great chip, and ignoring big.LITTLE, that's an upper bound of 80 GFlops. A Mali T-860 gets 11-24 GFlops per core and can come with up to 16 cores. (Edit: Looked it up; the Rockchip RK3399 has 4 GPU cores, so that would be roughly 80 GFlops.) So 176-384 GFlops for a full 16-core part. That is dwarfed by the M1's 2.6 TFlop GPU though.
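
Restating the arithmetic above as a quick script (the per-core numbers are the post's ballpark figures, not measurements):

```python
# Rough peak-FP32 comparison using the ballpark figures from the post.

# CPU: ~5 GFLOPS/core, doubled as a generous allowance for the M1,
# ignoring the performance/efficiency core split.
m1_cpu_upper_bound = 5 * 2 * 8                    # ≈ 80 GFLOPS

# GPU: Mali-T860 at roughly 11-24 GFLOPS per core.
mali_per_core = (11, 24)
rk3399 = tuple(g * 4 for g in mali_per_core)      # MP4 in the RK3399: ≈ 44-96 GFLOPS
mali_mp16 = tuple(g * 16 for g in mali_per_core)  # full 16-core part: ≈ 176-384 GFLOPS

m1_gpu_gflops = 2.6 * 1000                        # Apple's quoted 2.6 TFLOPS

print(m1_cpu_upper_bound, rk3399, mali_mp16, m1_gpu_gflops)
```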
 
Jul 27, 2020
This sounds extremely dubious to me. Mali GPUs, while not stellar, are hardly slouches and should run circles around any CPU doing the rendering in software. A high-end CPU gets about 5 GFlops per core. Even doubling that for the M1 being a great chip, and ignoring big.LITTLE, that's an upper bound of 80 GFlops. A Mali T-860 gets 11-24 GFlops per core and can come with up to 16 cores. So 176-384 GFlops for a full 16-core part. That is dwarfed by the M1's 2.6 TFlop GPU though.
Your argument holds true for 3D acceleration, but Doug is probably referring to 2D acceleration of GUI elements, for which software rendering, especially with the M1's unified memory, would likely be more than enough.
 

Schmide

Diamond Member
Your argument holds true for 3D acceleration, but Doug is probably referring to 2D acceleration of GUI elements, for which software rendering, especially with the M1's unified memory, would likely be more than enough.

The argument holds true for 2D as well. The statement was not "more than enough" it was "faster than."
 
Jul 27, 2020
The argument holds true for 2D as well. The statement was not "more than enough" it was "faster than."
Well, we would need to benchmark a Rockchip GPU to be sure. However, from personal experience (specifically after wasting almost $250 on a French manufactured Android tablet using a Rockchip SoC), I would put Rockchip in the "uselessly slow silicon" category.
 

Schmide

Diamond Member
Well, we would need to benchmark a Rockchip GPU to be sure. However, from personal experience (specifically after wasting almost $250 on a French manufactured Android tablet using a Rockchip SoC), I would put Rockchip in the "uselessly slow silicon" category.

bad programming != silicon

The point still stands for any comparison of a CPU to a GPU, except in extreme edge cases.
 
Jul 27, 2020
bad programming != silicon
My experience was with an Android tablet containing Rockchip silicon. Android's code is written by Google programmers, so the fault was definitely with the silicon. But yes, I do agree that in general a half-decent GPU would definitely whoop a CPU's arse in GPU-intensive tasks.
 

Eug

Lifer
I think Sony similarly blocked Linux on the PS3 because it was preventing thousands of their hardware units from being used for the intended purpose of boosting their software sales. Yes, I'm aware that preventing jailbreak was likely a much bigger factor in Sony's decision.
The PS3 was a loss leader; Sony didn't even break even on hardware sales.

The M1 Mac mini is quite profitable. I don't know specific numbers for the Mac mini in particular, but in general Apple's gross margin on hardware is roughly 40-45%.
 

moinmoin

Diamond Member
It would be sweetly ironic if the moment the Asahi Linux developers get everything working on M1, Apple sends them a cease and desist letter, arguing that buying their M1 hardware binds them into some agreement precluding reverse engineering of said hardware's proprietary trade secrets.
Since everything being reverse engineered is already being integrated into the mainline Linux kernel as we speak, that approach would earn Apple the ill will of pretty much the whole FOSS scene, including all the major companies involved; Microsoft, for example, uses the Linux kernel as part of WSL on Windows. I doubt Apple is reckless enough to wait with legal demands like this until after the fact (aside from the fact that in many jurisdictions courts keep ruling that clean-room reverse engineering is perfectly lawful).
 

Doug S

Platinum Member
One could argue that they stand to lose money they would otherwise have made from the sale of macOS software, since the Linux effort prevents such sales from being realized. I think Sony similarly blocked Linux on the PS3 because it was preventing thousands of their hardware units from being used for the intended purpose of boosting their software sales. Yes, I'm aware that preventing jailbreak was likely a much bigger factor in Sony's decision.

Unless you buy from the Mac App Store (which, like the Microsoft Store, sees little use), Apple doesn't make anything when you buy Mac software, so it is probably a very small difference in percentage terms.

Even if a buyer using it to run Linux is less profitable, why would you stop someone from becoming a "still profitable, just not as profitable" customer? If I can sell widgets that make me either $1 or $2 in profit depending on how the buyer uses them, what possible incentive would I have to stop those who make me only $1 in profit from buying widgets from me? I'd rather sell more widgets and make more money! You'd have to be stupid to do otherwise.
 
Jul 27, 2020
Unless you buy from the Mac App Store (which, like the Microsoft Store, sees little use), Apple doesn't make anything when you buy Mac software, so it is probably a very small difference in percentage terms.

Even if a buyer using it to run Linux is less profitable, why would you stop someone from becoming a "still profitable, just not as profitable" customer? If I can sell widgets that make me either $1 or $2 in profit depending on how the buyer uses them, what possible incentive would I have to stop those who make me only $1 in profit from buying widgets from me? I'd rather sell more widgets and make more money! You'd have to be stupid to do otherwise.
Going by that reasoning, Apple would do well to cater to the niche use cases of its hardware by being as supportive as it can be. Other than Boot Camp, I don't think Apple has ever demonstrated that it cares about the relatively few who love its hardware but not its software. At least, I can't remember otherwise.
 

Doug S

Platinum Member
Going by that reasoning, Apple would do well to cater to the niche use cases of its hardware by being as supportive as it can be. Other than Boot Camp, I don't think Apple has ever demonstrated that it cares about the relatively few who love its hardware but not its software. At least, I can't remember otherwise.

Supporting it costs resources for what would never be anything other than a tiny market. I can easily see the most logical conclusion being "don't waste time offering any help, but don't do anything to prevent people who want to invest their own time from making it happen".

There's also risk involved. Let's say Apple provided very detailed information about their GPU, and Nvidia's lawyers look it over and say "hey, it looks like they've arrived at some of the same answers our engineers did a few years ago, and we filed patents for those!". Then Apple gets slapped with a patent lawsuit and couldn't sell enough Macs to Linux enthusiasts to make that up in a century.
 
Jul 27, 2020
There's also risk involved. Let's say Apple provided very detailed information about their GPU, and Nvidia's lawyers look it over and say "hey, it looks like they've arrived at some of the same answers our engineers did a few years ago, and we filed patents for those!". Then Apple gets slapped with a patent lawsuit and couldn't sell enough Macs to Linux enthusiasts to make that up in a century.
I was under the impression that companies ensure they have licenses for the tech used in their products to prevent patent litigation. The license could be from someone other than a major player and might implement things just a little bit differently so as to be deemed a non-violation of an existing patent. But yeah, Nvidia could do that strategically, just to settle with Apple with one of the stipulations being that Apple has to use Nvidia's GPUs at a price convenient to Nvidia.