Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
23,825
1,396
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24576 concurrent threads
2.6 Teraflops
82 Gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from maybe slight clock speed differences occasionally).
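As a quick sanity check, the GPU numbers above hang together arithmetically. Note the 8 ALUs per execution unit and the ~1.278 GHz clock are assumptions taken from third-party die analyses, not from Apple's spec sheet:

```python
# Sanity-check the M1 GPU spec sheet from first principles.
# ALUs-per-EU and clock are assumed (third-party figures, not official).
EXECUTION_UNITS = 128
ALUS_PER_EU = 8            # assumed
CLOCK_HZ = 1.278e9         # assumed, ~1.278 GHz
THREADS = 24576

# FP32 throughput: each ALU does one FMA per clock = 2 FLOPs.
fp32_tflops = EXECUTION_UNITS * ALUS_PER_EU * 2 * CLOCK_HZ / 1e12
print(f"{fp32_tflops:.2f} TFLOPS")  # ~2.62, matching the 2.6 TFLOPS claim

# Thread occupancy per execution unit.
print(THREADS // EXECUTION_UNITS, "threads resident per EU")  # 192
```

Under those assumptions the 2.6 TFLOPS and 24,576-thread figures both fall out cleanly.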

EDIT:

Screen-Shot-2021-10-18-at-1.20.47-PM.jpg

M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


M2
Second-generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, H.265/HEVC, and ProRes
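For what it's worth, the "up to 100 GB/s" figure above is consistent with LPDDR5-6400 on a 128-bit unified memory bus. The speed grade and bus width are assumptions consistent with the spec, not stated in the post:

```python
# Where the "100 GB/s" figure plausibly comes from.
# Speed grade and bus width are assumed, not official.
TRANSFERS_PER_SEC = 6400e6   # LPDDR5-6400, assumed
BUS_WIDTH_BITS = 128         # assumed

bandwidth_gb_s = TRANSFERS_PER_SEC * (BUS_WIDTH_BITS / 8) / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # 102.4, i.e. the "up to 100 GB/s" claim
```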

M3 Family discussion here:


M4 Family discussion here:

 

Nothingness

Diamond Member
Jul 3, 2013
3,090
2,084
136
I don’t test Apple hardware regularly, so how am I supposed to know when a large company gets off their butts and ports a piece of software over, much less a benchmark for a piece of software I don’t use.
And this didn’t prevent you from writing something without even admitting your mistake later. Now you had the opportunity to make your point that M1 is not good for your needs, but didn’t.

Don’t bother replying. Putting you on ignore.
How handy, to put on ignore someone who asked you to substantiate your claims. The conclusion about your unproven claims is obvious, I guess. Too bad, I was honestly curious to know more.
 
Jul 27, 2020
20,040
13,740
146
If that's the case then eventually some decent numbers should come out. Sounds like a job for Phoronix tbh. Their M1 review(s) had a lot of frustrating examples of benchmarks run with poorly-optimized executables or Rosetta though.
His M1 news piece said that a review will be forthcoming. But the M1 Pro/Max news piece says nothing about a review and I don't think he's willing to shell out $$$ for the expensive hardware. Poor guy is pretty underfunded and asking for donations from time to time.
 

Nothingness
If that's the case then eventually some decent numbers should come out. Sounds like a job for Phoronix tbh. Their M1 review(s) had a lot of frustrating examples of benchmarks run with poorly-optimized executables or Rosetta though.
That’s a mistake they have often made, alas, and not only for M1. OTOH it’s hard to blame them when something runs poorly out of the box. But this also means drawing conclusions from their results about CPU performance is difficult.

BTW I found these HandBrake results:


Given the scaling between M1 and M1 Pro/Max, I guess this is CPU based. But that’s just guessing :)
 

Eug
M1 Max wipes the floor with RTX 3080* in gaming.


*The RTX 3080 is a lower power laptop version, and is thermally constrained... but that's because it's in a form factor similar to the MacBook Pro. M1 Max is not thermally constrained in that form factor.
 
  • Like
Reactions: IntelCeleron

DRC_40

Junior Member
Sep 25, 2012
18
14
81
Hah. This guy claimed Apple made a huge mistake moving to custom silicon instead of sticking with x86, a week before M1 Pro/Max was announced, then poo-poohed all the benchmarks because *his* software didn't run on ARM. He also proclaimed Intel had a great/fantastic chip with Alderlake before a single credible review was out, making wild claims about efficiency gains because of "process scaling", before Intel decided to just say PL1 = PL2 and set Alderlake 8+8 to 241 watts. Just yesterday he said Alderlake will be more power efficient than Zen 3. LOL.

Since we are on an Apple thread, I am going to put a stake in the ground and say that the Alderlake "efficiency" Atom cores (if you can even call it that) consume over 5 watts to produce benchmarking results. That is more power than an Apple Firestorm core is allowed to consume. This guy claimed Alderlake Atoms will lead the way on x86 power efficiency. So... not a good sign for his camp.

There’s a lot of people who hate to admit Apple has made the gains they have. Heck, I’ve hated Apple computers for years myself. Too expensive, lack of upgradability, closed ecosystem, etc. There’s plenty of reasons to dislike their practices. They’ve finally made a suitable product for a bigger market share than they’ve had, and it’s tough to get on their side. I’m still reading innuendo that is focused completely on disproving their gains and looking for the “I got ya” piece of data to say “I told ya so”. I get it, but even begrudgingly one has to admit the obvious. Cheers…
 

DrMrLordX

Lifer
Apr 27, 2000
22,065
11,695
136
His M1 news piece said that a review will be forthcoming. But the M1 Pro/Max news piece says nothing about a review and I don't think he's willing to shell out $$$ for the expensive hardware. Poor guy is pretty underfunded and asking for donations from time to time.

Apple probably didn't send him a review sample either. Definitely not after that M1 article.

That’s a mistake they have often made, alas, and not only for M1. OTOH it’s hard to blame them when something runs poorly out of the box. But this also means drawing conclusions from their results about CPU performance is difficult.

Pretty much. FOSS on MacOS is hit-or-miss.

BTW I found these HandBrake results:

From what I've read (and I need to do more reading), there are specific encoder settings you need to pick to make sure hw acceleration is active. Without access to the sample file, the version of HandBrake used, or anything else, it's hard to tell.
 

dmens

Platinum Member
Mar 18, 2005
2,275
965
136
There’s a lot of people who hate to admit Apple has made the gains they have. Heck, I’ve hated Apple computers for years myself. Too expensive, lack of upgradability, closed ecosystem, etc. There’s plenty of reasons to dislike their practices. They’ve finally made a suitable product for a bigger market share than they’ve had, and it’s tough to get on their side. I’m still reading innuendo that is focused completely on disproving their gains and looking for the “I got ya” piece of data to say “I told ya so”. I get it, but even begrudgingly one has to admit the obvious. Cheers…

Pretty much. Next week the Alderlake reviews will come out and it will win some ST benchmarks and the “I told ya so” will flow. But you can imagine what the results would be if Apple/AMD allowed their big cores to use 35 watts apiece. The key word is allowed, there is no engineering limitation for this, it is purely a decision based on sanity and caring for the environment.

Being impressed by performance unconstrained by power is like being impressed by the ability to go into the BIOS and increasing core voltage. It is the kind of “engineering” done by first time interns after freshman year EE courses.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,886
1,103
126
After using my M1 Max for the last two days, it makes me wonder how Intel/AMD are going to keep up. This thing is dead quiet even under pretty heavy workloads. Even playing games at 4K it takes some time for it to spin up fans and they're still super quiet.

Is x86 becoming more and more marginalized?
 

Eug
So, as expected, the claim that macOS on M1 series chips needs less RAM than macOS on Intel is false. Well, sort of.

Max did some Lightroom / Photoshop tests on two M1 Pros, one with 16 GB RAM and one 32 GB RAM, both configured identically and running the exact same software.

What he did notice is that even under heavy multitasking both felt very responsive, and noticeably quicker than a similarly configured 2019 Intel Mac with 16 GB RAM.

However, despite the fact that both M1 Pro machines felt fast, the 16 GB machine was hitting the swap a lot, and the 32 GB machine was not.

Screen Shot 2021-10-31 at 3.34.04 PM.png

Then if he went beyond his normal amount of multitasking to something crazier, both hit the swap, but the 16 GB model did much worse:

Screen Shot 2021-10-31 at 3.38.40 PM.png

Despite this, the 16 GB still felt fast.

So if you look at this from a performance perspective, yes M1 series Macs hide swapping well, but the bottom line is that they still benefit significantly from that extra RAM, to reduce SSD wear if you're a heavy user. Also, a lot of users have reported that they run into stability issues when short on RAM, and upgrading to a model with more RAM eliminates those issues.

This may seem like common sense, preaching to the choir, but I am posting this anyway just because there appears to be a commonly believed myth out there that M1 series Macs inherently need less RAM, but as demonstrated, that isn't really true. They just hide insufficient RAM better.
 

Schmide

Diamond Member
Mar 7, 2002
5,596
730
126
M1 Max wipes the floor with RTX 3080* in gaming.


*The RTX 3080 is a lower power laptop version, and is thermally constrained... but that's because it's in a form factor similar to the MacBook Pro. M1 Max is not thermally constrained in that form factor.

He said those were the same settings over and over.

Visually, my opinion:
Draw distance was off.
Bloom was off.
HDR was most certainly different, which is the reason the nVidia looks so much better, not the nits.
At least 3 artifacts on the Mac in the 3rd scene.

Edit: distance difference
 

dmens
Note the constantly shifting goalposts.

ADL wins in raw performance!
*But only when the obese core sucks down 7x the power

Apple is not 500% more efficient!
*When the Intel CPU is downvolted, downclocked and performance crippled, and it sure is not Apple’s fault Intel allows their parts to run at that power level

M1 Max is not a 3080!
*Because it is not like a 3080 can be run at different wattages; heck, the Apple graphs state this outright

It’s overpriced, I can show you equivalent PCs for less!
*But only if you take raw performance while plugged in, on some applications, as the sole metric of comparison

I am pretty sure I am already blocked so I don’t expect a response. :D
 

Gideon

Golden Member
Nov 27, 2007
1,774
4,145
136
  1. Overpriced for what you get. (A PC equivalent would cost $1,000 less and run WAY more software, and depending on the software, can run that software faster. I can provide links if needed. Amazon has lots of Ryzen laptops with GeForce graphics)

This has been the mantra about MacBooks for ages, but this time, I'd argue, it's not actually true in many cases.


There actually aren't that many comparable laptops for these once you factor in the form factor, build quality, screen, battery life and the rest of the features. Once I factor in build quality and overall feel, I'd say that there really are just two:
  • The closest match to the 16" Macbook is the XPS 15 9510 with the OLED display, and it's not cheaper at all. And despite its many strong points it still has:
    • a very lackluster GPU in the 3050 Ti
    • PCIe 3.0 SSD only
    • DDR4-3200 only with rather poor bandwidth
  • The closest match to the 14" Macbook is the Razer Blade 14, which is also not really cheaper in equivalent trims. And even it still has noticeable deficiencies:
    • An inferior screen (for anything but gaming)
    • PCIe 3.0 SSD only
    • limited to 16 GB of memory and still only DDR4-3200 (not even LPDDR4X for more bandwidth)
And these are some of the best-built compact laptops with very good performance for the form factor. Still worse in many areas and, most importantly, not really cheaper. You can certainly find "cheap performance" for 1K less, but you sacrifice heavily on other metrics, so that's not a fair comparison.


Besides, there are niche workloads where the M1 Pro will be loads better than anything comparable. For instance, when you need loads of memory bandwidth (for the CPU) and/or PCIe 4.0 level SSDs. When you actually need to develop for Graviton/Ampere, etc...

Yes, these are niche workloads but they do exist. You can't just ignore them if it doesn’t suit you.

As an overall package the new MacBooks are awesome. They certainly have their flaws and I agree that overall they are very expensive. Compatibility can certainly be an issue as well, but IMO you overstate it. I doubt there will be that many straight-from-Windows "first time Mac" conversions - and the usual Mac ecosystem is already quite well supported. If you are a power user ready to shell out 3-4K for a new laptop and too dumb to visit this compatibility site for 10 seconds (to verify your apps work), you deserve what you get.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,340
5,464
136
This may seem like common sense, preaching to the choir, but I am posting this anyway just because there appears to be a commonly believed myth out there that M1 series Macs inherently need less RAM, but as demonstrated, that isn't really true. They just hide insufficient RAM better.

Though, if you really don't notice the swapping, do you really "need" more RAM?
 

Heartbreaker
There have been persistent (if possibly inaccurate) claims that M1 Macs wear out their SSDs quickly from excess swapping.

IIRC, that was shown as a (now fixed) reporting bug.

My view on this is: If this is a money making machine, then spend money boosting RAM to an excess even if the benefit is imperceptibly small.

But if this just a home/hobby computer, and makes no perceptible performance difference while swapping, then you don't "need" more RAM.

OTOH if your Windows machine starts stuttering while swapping then you "need" more RAM.

So I view the needs less RAM "myth" as not that far off the mark.

M1 Macs don't "use" less RAM, but when it runs out, fast swapping makes the consequences largely irrelevant, so you don't really "need" it.
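Rough latency arithmetic is consistent with that experience. The figures below are ballpark assumptions, not measurements from this thread:

```python
# Rough orders of magnitude: why fast NVMe swap can feel invisible.
# Latency figures are ballpark assumptions, not measurements.
DRAM_LATENCY_S = 100e-9      # ~100 ns, assumed
NVME_READ_LATENCY_S = 80e-6  # ~80 us per page for a fast NVMe SSD, assumed
PAGE_SIZE_BYTES = 16384      # Apple Silicon uses 16 KB pages

slowdown = NVME_READ_LATENCY_S / DRAM_LATENCY_S
print(f"swap is ~{slowdown:.0f}x slower than RAM")  # ~800x

# Paging a 100 MB working set back in:
pages = 100e6 / PAGE_SIZE_BYTES
serial_s = pages * NVME_READ_LATENCY_S
# ~488 ms if strictly serial; prefetching and parallel I/O hide most of it.
print(f"~{serial_s * 1e3:.0f} ms worst case")
```

So swap is still orders of magnitude slower than RAM; it just falls below the perception threshold when page-ins are batched and overlapped, which fits the "hides swapping well" observation.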
 
  • Like
Reactions: Etain05 and Viknet

moinmoin

Diamond Member
Jun 1, 2017
5,064
8,032
136
(...) it makes me wonder how Intel/AMD are going to keep up. (...) Is x86 becoming more and more marginalized?
Product quality wise? Looks so. Does Intel need to worry? Unless Apple suddenly goes for the lower budget mass market as well, not really.

So, as expected, the claim that macOS on M1 series chips needs less RAM than macOS on Intel is false. Well, sort of.

Max did some Lightroom / Photoshop tests on two M1 Pros, one with 16 GB RAM and one 32 GB RAM, both configured identically and running the exact same software.

What he did notice is that even under heavy multitasking both felt very responsive, and noticeably quicker than a similarly configured 2019 Intel Mac with 16 GB RAM.

However, despite the fact that both M1 Pro machines felt fast, the 16 GB machine was hitting the swap a lot, and the 32 GB machine was not.

View attachment 52171

Then if he went beyond his normal amount of multitasking to something crazier, both hit the swap, but the 16 GB model did much worse:

View attachment 52172

Despite this, the 16 GB still felt fast.

So if you look at this from a performance perspective, yes M1 series Macs hide swapping well, but the bottom line is that they still benefit significantly from that extra RAM, to reduce SSD wear if you're a heavy user. Also, a lot of users have reported that they run into stability issues when short on RAM, and upgrading to a model with more RAM eliminates those issues.

This may seem like common sense, preaching to the choir, but I am posting this anyway just because there appears to be a commonly believed myth out there that M1 series Macs inherently need less RAM, but as demonstrated, that isn't really true. They just hide insufficient RAM better.
Isn't that the issue that first came up with the original M1 launch, when people first noticed this aggressive swap usage? I thought the topic was essentially resolved by the observation that using the SSD this way makes sense for improving performance, and the wear it introduces is still perfectly well within the SSD's life expectancy?
 

Eug
IIRC, that was shown as a (now fixed) reporting bug.

My view on this is: If this is a money making machine, then spend money boosting RAM to an excess even if the benefit is imperceptibly small.

But if this just a home/hobby computer, and makes no perceptible performance difference while swapping, then you don't "need" more RAM.

OTOH if your Windows machine starts stuttering while swapping then you "need" more RAM.

So I view the needs less RAM "myth" as not that far off the mark.

M1 Macs don't "use" less RAM, but when it runs out, fast swapping makes the consequences largely irrelevant, so you don't really "need" it.
However the other problem reported is that some users had been getting occasional random crashes on 8 GB machines when pushed. These crashes disappeared after they upgraded to 16 GB.

Many of these reports were not made until later on, after the initial reviews had already come out. That makes sense since people got a better feel for their systems after weeks of real world usage, and started noticing occasional random things like this.

I’m guessing the same may prove true for 16 GB users who should be on 32 GB.
 

Eug
Product quality wise? Looks so. Does Intel need to worry? Unless Apple suddenly goes for the lower budget mass market as well, not really.


Isn't that the issue that first came up with the original M1 launch where people first noticed these aggressive swap usages? I thought the topic was essentially resolved by the mention that usage of the SSD makes sense for improving performance and the wear it introduces is still perfectly well within the SSD's life expectancy?
1. See my post above about crashes.

2. It depends on just how many writes you're actually doing. If you're just very occasionally swapping heavily, then I wouldn't worry about it. However, if this is a daily driver and you're constantly on yellow memory pressure, I don't think that's a good idea.

3. Memory requirements will increase with time. This is true for both the OS and applications, so if you're already near capacity now, it's going to be an even bigger problem later, that is, if you plan on keeping your machine a long time.
 
  • Like
Reactions: Gideon

Heartbreaker
Isn't that the issue that first came up with the original M1 launch, when people first noticed this aggressive swap usage? I thought the topic was essentially resolved by the observation that using the SSD this way makes sense for improving performance, and the wear it introduces is still perfectly well within the SSD's life expectancy?

It was a bug, fixed in 11.4 update:
 
  • Like
Reactions: Viknet and moinmoin

Heartbreaker
However the other problem reported is that some users had been getting occasional random crashes on 8 GB machines when pushed. These crashes disappeared after they upgraded to 16 GB.

If you are crashing because you are using swap, that's a bug that needs to be fixed.

I'm on an ancient 6 GB machine. I live in swap, yet my PC doesn't crash.
 
  • Like
Reactions: podspi

jamescox

Senior member
Nov 11, 2009
644
1,105
136
IIRC, that was shown as a (now fixed) reporting bug.

My view on this is: If this is a money making machine, then spend money boosting RAM to an excess even if the benefit is imperceptibly small.

But if this just a home/hobby computer, and makes no perceptible performance difference while swapping, then you don't "need" more RAM.

OTOH if your Windows machine starts stuttering while swapping then you "need" more RAM.

So I view the needs less RAM "myth" as not that far off the mark.

M1 Macs don't "use" less RAM, but when it runs out, fast swapping makes the consequences largely irrelevant, so you don't really "need" it.
If you are swapping a lot then there is almost no way that can happen without causing increased wear on the SSD. I probably would not purchase a laptop now with less than 32 GB expandability. What I would like is a powerful APU with a stack or two of HBM integrated as cache. That should then be backed up by some LPDDR5, preferably as standard DIMMs. Even 8 or 16 GB of HBM cache backed up by some LPDDR5 would probably turn out exceptional performance and power consumption.
 

Heartbreaker
If you are swapping a lot then there is almost no way that can happen without causing increased wear on the SSD.



Obviously, if you are swapping a lot, then you will cause more SSD wear than not swapping.

But that doesn't mean you are on a trajectory to wear out your SSD in a couple of years. There was a reporting bug making it look like excess wear was happening that would lead to early wear-out.

It's very unlikely home users are going to wear out a high quality Apple SSD on a MBP, and I already said for a money making machine you go for the extra RAM.
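A back-of-the-envelope estimate supports that. The endurance rating and daily swap volume below are hypothetical round numbers, since Apple doesn't publish TBW figures for its SSDs:

```python
# Back-of-the-envelope SSD wear from heavy swapping.
# Both inputs are hypothetical, illustrative figures.
TBW_RATING_TB = 600          # endurance typical of a 1 TB class drive, assumed
SWAP_WRITES_GB_PER_DAY = 50  # heavy daily swap traffic, assumed

years = (TBW_RATING_TB * 1000) / (SWAP_WRITES_GB_PER_DAY * 365)
print(f"~{years:.0f} years to reach rated endurance")  # ~33 years
```

Even with aggressively pessimistic inputs, the drive's rated endurance outlives any realistic laptop lifespan; the original scare came from the mis-reported write totals, not real wear.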
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
AMD and Intel make server CPUs and scale them down, with all the downsides. A server CPU doesn't need all these accelerators for encode/decode, DSPs, etc., hence they are missing, and hence there is no real support for them in Windows software. The other thing is legacy compatibility. That's why x86 is so cool, but also why it is being held back. Maybe this can push both AMD and Intel to drop at least the biggest baggage hindering their designs.

It's not just x86. They are kicking ass in GPUs as well.