Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
23,587
1,001
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24576 concurrent threads
2.6 Teraflops (see the back-of-the-envelope check after this list)
82 Gigatexels/s
41 Gigapixels/s

16-core neural engine
Secure Enclave
USB 4
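
As a rough sanity check on the 2.6-teraflop figure above: it lines up with the usual FP32 arithmetic if you assume 8 ALUs per execution unit and a GPU clock around 1.28 GHz. Neither number comes from Apple's spec sheet, so treat this as a back-of-the-envelope sketch, not an official breakdown:

```python
# Back-of-the-envelope check of the quoted 2.6 TFLOPS figure.
# Assumed, not from Apple's spec: 8 FP32 ALUs per execution unit and a
# ~1.278 GHz GPU clock, both commonly reported for the M1.
execution_units = 128
fp32_lanes = execution_units * 8      # 1024 ALUs
flops_per_lane_per_cycle = 2          # one fused multiply-add = 2 FLOPs
gpu_clock_hz = 1.278e9

tflops = fp32_lanes * flops_per_lane_per_cycle * gpu_clock_hz / 1e12
print(f"{tflops:.2f} TFLOPS")         # ~2.62, matching the quoted 2.6
```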

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18-hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20-hour video playback battery life

Memory options: 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from the GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants) that is the same across all iDevices, aside from occasional slight clock-speed differences.

EDIT:


M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


Second Generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, HEVC (H.265), and ProRes

M3 Family discussion here:

 

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
Any contributions to that thread would be appreciated. HandBrake may already support some of the hardware acceleration available on the M1 Max (and if it doesn't now, it probably will in the future), but for the sake of gathering data, I'm pretty sure that can be disabled or bypassed.

HandBrake benchmarks would point to the current state of the x265 port, not the overall strength of the CPU cores.

The x265 code has significant performance enhancements written in hand-tuned x86 assembly, and those have no equivalent for ARM processors.
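
That said, if someone does want to gather pure-CPU numbers, the software and hardware paths are easy to separate by picking the encoder explicitly. A minimal sketch, assuming HandBrakeCLI is installed and using your own test clip; the encoder names are the ones recent HandBrake builds list, and the bitrate is just a placeholder:

```python
# Rough benchmarking sketch, not a definitive methodology: time a pure-CPU
# x265 encode against the VideoToolbox hardware encoder.
# Assumes HandBrakeCLI is on the PATH and "sample.mkv" is your own test clip.
import subprocess
import time

def encode_seconds(encoder: str, output: str) -> float:
    start = time.time()
    subprocess.run(
        ["HandBrakeCLI", "-i", "sample.mkv", "-o", output,
         "--encoder", encoder, "--vb", "6000"],
        check=True,
        capture_output=True,  # keep HandBrake's progress output off the terminal
    )
    return time.time() - start

print(f"software x265 (CPU only): {encode_seconds('x265', 'out_cpu.mkv'):.1f} s")
print(f"VideoToolbox (hardware) : {encode_seconds('vt_h265', 'out_hw.mkv'):.1f} s")
```

The x265 run is the one that actually exercises the CPU cores, which on ARM today still mostly means the untuned C paths.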
 

naukkis

Senior member
Jun 5, 2002
706
578
136
HandBrake benchmarks would point to the current state of the x265 port, not the overall strength of the CPU cores.

The x265 code has significant performance enhancements written in hand-tuned x86 assembly, and those have no equivalent for ARM processors.

There's an Apple-provided NEON patch which apparently increases speed by about 3x. It isn't merged into the main line yet, but development builds are usable.
 

The Hardcard

Member
Oct 19, 2021
46
38
51
That's possible. Anything beyond that observation is speculation, but the M1 Max in particular has a lot of GPU resources available . . .

@The Hardcard

Having problems picking data out of those vids. The first one may be using hardware acceleration, and the second one just has an M1 Max running solo without any comparative data I could find. Doesn't anyone just have some HandBrake benchmark runs on their M1 Max or similar?

For anyone who has or is getting one of the new MacBook Pros, we have a HandBrake thread in this subforum:


Any contributions to that thread would be appreciated. HandBrake may already support some of the hardware acceleration available on the M1 Max (and if it doesn't now, it probably will in the future), but for the sake of gathering data, I'm pretty sure that can be disabled or bypassed.

I also want to see the data; however, those videos show editors having an improved experience working with timelines that include multiple codecs that are not hardware accelerated. That would sell the laptops regardless of accelerators. Editors who don’t ever use ProRes are buying these.
 

eek2121

Platinum Member
Aug 2, 2005
2,930
4,026
136
Since it contains code from your company, what makes you so confident it’s not your fault?


You mean Cinebench? If you want to be trusted, you should provide some more data, because your track record doesn’t inspire confidence. So what is that open-source software stack that doesn’t run well on Arm? What is that software that doesn’t run at all on an Arm Mac? Obviously these things exist, but claims with no evidence can just be discarded and are useless.

My track record? I wasn’t the guy claiming Apple’s hardware was faster than a 3080 or 3090. I wasn’t the guy claiming the M1 Pro/Max was 500% more efficient than AMD/Intel.

I don’t test Apple hardware regularly, so how am I supposed to know when a large company gets off their butts and ports a piece of software over, much less a benchmark for a piece of software I don’t use?

My track record on this forum regarding performance estimates, predictions, etc. is pretty good because I dig through the hype and look at the data. I am not going to comment on your track record, however.

Don’t bother replying. Putting you on ignore.
 

Nothingness

Platinum Member
Jul 3, 2013
2,423
754
136
I don’t test Apple hardware regularly, so how am I supposed to know when a large company gets off their butts and ports a piece of software over, much less a benchmark for a piece of software I don’t use?
And that didn’t prevent you from writing something without ever admitting your mistake later. Now you had the opportunity to make your point that the M1 is not good for your needs, but you didn’t.

Don’t bother replying. Putting you on ignore.
How handy to put on ignore someone who asked you to substantiate your claims. I guess the conclusion about your unproven claims is obvious. Too bad, I was honestly curious to know more.
 
Jul 27, 2020
16,342
10,354
106
If that's the case then eventually some decent numbers should come out. Sounds like a job for Phoronix tbh. Their M1 review(s) had a lot of frustrating examples of benchmarks run with poorly-optimized executables or Rosetta though.
His M1 news piece said a review would be forthcoming. But the M1 Pro/Max news piece says nothing about a review, and I don't think he's willing to shell out $$$ for the expensive hardware. The poor guy is pretty underfunded and asks for donations from time to time.
 

Nothingness

Platinum Member
Jul 3, 2013
2,423
754
136
If that's the case then eventually some decent numbers should come out. Sounds like a job for Phoronix tbh. Their M1 review(s) had a lot of frustrating examples of benchmarks run with poorly-optimized executables or Rosetta though.
That’s a mistake they have often made, alas, and not only for the M1. OTOH it’s hard to blame them when something runs poorly out of the box. But it also means that drawing conclusions about CPU performance from their results is difficult.

BTW I found these HandBrake results:


Given the scaling between M1 and M1 Pro/Max, I guess this is CPU based. But that’s just guessing :)
 

dmens

Platinum Member
Mar 18, 2005
2,271
917
136
How handy to put on ignore someone who asked you to substantiate your claims. I guess the conclusion about your unproven claims is obvious. Too bad, I was honestly curious to know more.

Hah. This guy claimed Apple made a huge mistake moving to custom silicon instead of sticking with x86, a week before the M1 Pro/Max was announced, then poo-poohed all the benchmarks because *his* software didn't run on ARM. He also proclaimed Intel had a great/fantastic chip with Alderlake before a single credible review was out, making wild claims about efficiency gains because of "process scaling", before Intel decided to just say PL1 = PL2 and set Alderlake 8+8 to 241 watts. Just yesterday he said Alderlake will be more power efficient than Zen 3. LOL.

Since we are on an Apple thread, I am going to put a stake in the ground and say that the Alderlake "efficiency" Atom cores (if you can even call them that) consume over 5 watts to produce benchmarking results. That is more power than an Apple Firestorm core is allowed to consume. This guy claimed Alderlake Atoms will lead the way on x86 power efficiency. So... not a good sign for his camp.
 

Eug

Lifer
Mar 11, 2000
23,587
1,001
126
M1 Max wipes the floor with RTX 3080* in gaming.


*The RTX 3080 is a lower power laptop version, and is thermally constrained... but that's because it's in a form factor similar to the MacBook Pro. M1 Max is not thermally constrained in that form factor.
 

DRC_40

Junior Member
Sep 25, 2012
18
14
81
Hah. This guy claimed Apple made a huge mistake moving to custom silicon instead of sticking with x86, a week before the M1 Pro/Max was announced, then poo-poohed all the benchmarks because *his* software didn't run on ARM. He also proclaimed Intel had a great/fantastic chip with Alderlake before a single credible review was out, making wild claims about efficiency gains because of "process scaling", before Intel decided to just say PL1 = PL2 and set Alderlake 8+8 to 241 watts. Just yesterday he said Alderlake will be more power efficient than Zen 3. LOL.

Since we are on an Apple thread, I am going to put a stake in the ground and say that the Alderlake "efficiency" Atom cores (if you can even call them that) consume over 5 watts to produce benchmarking results. That is more power than an Apple Firestorm core is allowed to consume. This guy claimed Alderlake Atoms will lead the way on x86 power efficiency. So... not a good sign for his camp.

There are a lot of people who hate to admit Apple has made the gains they have. Heck, I’ve hated Apple computers for years myself. Too expensive, lack of upgradability, closed ecosystem, etc. There are plenty of reasons to dislike their practices. They’ve finally made a product suited to a bigger market share than they’ve had, and it’s tough to get on their side. I’m still reading innuendo that is focused completely on disproving their gains and looking for the “I got ya” piece of data to say “I told ya so”. I get it, but even begrudgingly one has to admit the obvious. Cheers…
 

DrMrLordX

Lifer
Apr 27, 2000
21,643
10,862
136
His M1 news piece said a review would be forthcoming. But the M1 Pro/Max news piece says nothing about a review, and I don't think he's willing to shell out $$$ for the expensive hardware. The poor guy is pretty underfunded and asks for donations from time to time.

Apple probably didn't send him a review sample either. Definitely not after that M1 article.

That’s a mistake they have often made, alas, and not only for the M1. OTOH it’s hard to blame them when something runs poorly out of the box. But it also means that drawing conclusions about CPU performance from their results is difficult.

Pretty much. FOSS on macOS is hit-or-miss.

BTW I found these HandBrake results:

From what I've read (and I need to do more reading), there are specific encoder settings you need to pick to make sure hardware acceleration is active. Without access to the sample file, the version of HandBrake used, or anything else, it's hard to tell.
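
One quick sanity check, for what it's worth: you can at least confirm whether a given HandBrakeCLI build exposes the VideoToolbox encoders at all, which tells you whether hardware encoding was even on the table. A small sketch, assuming HandBrakeCLI is on the PATH; builds differ in where they print the help text, so both output streams are searched:

```python
# Check whether this HandBrakeCLI build lists the VideoToolbox (hardware)
# encoders. Assumes HandBrakeCLI is on the PATH.
import subprocess

proc = subprocess.run(["HandBrakeCLI", "--help"], capture_output=True, text=True)
help_text = proc.stdout + proc.stderr  # help location varies between builds

for name in ("vt_h264", "vt_h265"):
    print(f"{name}: {'listed' if name in help_text else 'not listed'}")
```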
 

dmens

Platinum Member
Mar 18, 2005
2,271
917
136
There are a lot of people who hate to admit Apple has made the gains they have. Heck, I’ve hated Apple computers for years myself. Too expensive, lack of upgradability, closed ecosystem, etc. There are plenty of reasons to dislike their practices. They’ve finally made a product suited to a bigger market share than they’ve had, and it’s tough to get on their side. I’m still reading innuendo that is focused completely on disproving their gains and looking for the “I got ya” piece of data to say “I told ya so”. I get it, but even begrudgingly one has to admit the obvious. Cheers…

Pretty much. Next week the Alderlake reviews will come out, it will win some ST benchmarks, and the “I told ya so” will flow. But you can imagine what the results would be if Apple/AMD allowed their big cores to use 35 watts apiece. The key word is allowed; there is no engineering limitation here, it is purely a decision based on sanity and caring for the environment.

Being impressed by performance unconstrained by power is like being impressed by the ability to go into the BIOS and increase the core voltage. It is the kind of “engineering” done by first-time interns after freshman-year EE courses.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,766
784
126
After using my M1 Max for the last two days, I wonder how Intel/AMD are going to keep up. This thing is dead quiet even under pretty heavy workloads. Even playing games at 4K, it takes some time for it to spin up the fans, and they're still super quiet.

Is x86 becoming more and more marginalized?
 

Eug

Lifer
Mar 11, 2000
23,587
1,001
126
So, as expected, the claim that macOS on M1 series chips needs less RAM than macOS on Intel is false. Well, sort of.

Max did some Lightroom/Photoshop tests on two M1 Pros, one with 16 GB of RAM and one with 32 GB, both configured identically and running the exact same software.

What he did notice is that even under heavy multitasking both felt very responsive, and noticeably quicker than a similarly configured 2019 Intel Mac with 16 GB RAM.

However, despite the fact that both M1 Pro machines felt fast, the 16 GB machine was hitting the swap a lot, and the 32 GB machine was not.

[Screenshot: swap usage under normal multitasking, 16 GB vs. 32 GB]

Then if he went beyond his normal amount of multitasking to something crazier, both hit the swap, but the 16 GB model did much worse:

[Screenshot: swap usage under heavier multitasking, 16 GB vs. 32 GB]

Despite this, the 16 GB still felt fast.

So if you look at this from a performance perspective, yes M1 series Macs hide swapping well, but the bottom line is that they still benefit significantly from that extra RAM, to reduce SSD wear if you're a heavy user. Also, a lot of users have reported that they run into stability issues when short on RAM, and upgrading to a model with more RAM eliminates those issues.

This may seem like common sense, preaching to the choir, but I am posting it anyway because there appears to be a commonly believed myth out there that M1 series Macs inherently need less RAM. As demonstrated, that isn't really true; they just hide insufficient RAM better.
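
For anyone who wants to watch this on their own machine instead of eyeballing Activity Monitor, the swap numbers are exposed through sysctl. A minimal logging sketch; I'm assuming the vm.swapusage output format of current macOS releases, with values reported in megabytes:

```python
# Log macOS swap usage once a minute to see how hard a workload hits the swap.
# Parses `sysctl vm.swapusage`, whose output looks like:
#   vm.swapusage: total = 2048.00M  used = 1320.75M  free = 727.25M  (encrypted)
import re
import subprocess
import time

def swap_used_mb() -> float:
    out = subprocess.run(["sysctl", "vm.swapusage"],
                         capture_output=True, text=True, check=True).stdout
    match = re.search(r"used = ([\d.]+)M", out)
    return float(match.group(1)) if match else 0.0

while True:
    print(f"{time.strftime('%H:%M:%S')}  swap used: {swap_used_mb():.0f} MB")
    time.sleep(60)
```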
 

Schmide

Diamond Member
Mar 7, 2002
5,587
719
126
M1 Max wipes the floor with RTX 3080* in gaming.


*The RTX 3080 is a lower power laptop version, and is thermally constrained... but that's because it's in a form factor similar to the MacBook Pro. M1 Max is not thermally constrained in that form factor.

He said those were the same settings over and over.

Visually, my opinion:
Draw distance was off.
Bloom was off.
HDR was most certainly different, which is the reason the nVidia looks so much better, not the nits.
At least 3 artifacts on the Mac in the 3rd scene.

Edit: distance difference
 

eek2121

Platinum Member
Aug 2, 2005
2,930
4,026
136
There are a lot of people who hate to admit Apple has made the gains they have. Heck, I’ve hated Apple computers for years myself. Too expensive, lack of upgradability, closed ecosystem, etc. There are plenty of reasons to dislike their practices. They’ve finally made a product suited to a bigger market share than they’ve had, and it’s tough to get on their side. I’m still reading innuendo that is focused completely on disproving their gains and looking for the “I got ya” piece of data to say “I told ya so”. I get it, but even begrudgingly one has to admit the obvious. Cheers…

I hope you aren't referring to me. Quite frankly, I'm glad they were able to pull off what they did, but too many people here overhyped it and pretend it's god's answer to everything that is "wrong" with the PC market. There are users getting sucked into this who realize, far too late, that their shiny new MacBook can't do what their old PC did. I'm one of the most objective and unbiased people you will meet when it comes to hardware. I've said both good and bad things about every major company.

If Apple allowed an ARM port of Windows on the Mac and provided GPU drivers for Windows, they'd get a lot more support from many of us.

The concerns I've attempted to state previously are valid:
  1. Limited software support. (macOS only; some proprietary macOS software doesn't work well under Rosetta either; no native Linux dual-boot support, only third-party efforts without driver support.)
  2. Overpriced for what you get. (A PC equivalent would cost $1,000 less, run WAY more software, and, depending on the software, run it faster. I can provide links if needed. Amazon has lots of Ryzen laptops with GeForce graphics.)
  3. Performance gains drastically and repeatedly overstated by certain members of this forum. (The GPU is "around" 3060 level, NOT 3080/3090; the CPU is NOT 500% more efficient than AMD/Intel, or even 3x-4x like some people here are claiming. Given that Apple is on a newer node and just released a brand-new product on that node, being ahead is expected. Zen 4 will easily close the gap. Alder Lake Mobile appears to already beat the M1 Pro/Max in terms of raw performance in early benchmarks, and it's on an older node. Translation: I would hope Apple can be competitive with year-old technology.)
 

dmens

Platinum Member
Mar 18, 2005
2,271
917
136
Note the constantly shifting goalposts.

ADL wins in raw performance!
*But only when the obese core sucks down 7x the power

Apple is not 500% more efficient!
*When the Intel CPU is downvolted, downclocked and performance crippled, and it sure is not Apple’s fault Intel allows their parts to run at that power level

M1 Max is not a 3080!
*Because it is not like a 3080 can be run at different wattages; heck, the Apple graphs state this outright

It’s overpriced, I can show you equivalent PC’s for less!
*But only if raw performance while plugged in, on some applications, is taken as the sole metric of comparison

I am pretty sure I am already blocked so I don’t expect a response. :D
 

Gideon

Golden Member
Nov 27, 2007
1,646
3,712
136
  1. Overpriced for what you get. (A PC equivalent would cost $1,000 less, run WAY more software, and, depending on the software, run it faster. I can provide links if needed. Amazon has lots of Ryzen laptops with GeForce graphics.)

This has been the mantra about MacBooks for ages, but this time, I'd argue, it's not actually true in many cases.


There actually aren't that many comparable laptops once you factor in the form factor, build quality, screen, battery life, and the rest of the features. Considering build quality and overall feel, I'd say there really are just two:
  • The closest match to the 16" MacBook is the XPS 15 9510 with the OLED display, and it's not cheaper at all. And despite its many strong points it still has:
    • a very lackluster GPU in the 3050 Ti
    • PCIe 3.0 SSD only
    • DDR4-3200 only with rather poor bandwidth
  • The closest match to the 14" MacBook is the Razer 14, which is also not really cheaper in equivalent trims. And even it still has noticeable deficiencies:
    • An inferior screen (for anything but gaming)
    • PCIe 3.0 SSD only
    • limited to 16 GB of memory and still only DDR4-3200 (not even LPDDR4X for more bandwidth)
And these are some of the best-built compact laptops with very good performance for the form factor. They are still worse in many areas and, most importantly, not really cheaper. You can certainly find "cheap performance" for $1K less, but you sacrifice heavily on other metrics, so that's not a fair comparison.


Besides, there are niche workloads where the M1 Pro will be loads better than anything comparable. For instance, when you need lots of memory bandwidth (for the CPU) and/or PCIe 4.0-class SSDs, or when you actually need to develop for Graviton/Ampere, etc.

Yes, these are niche workloads, but they do exist. You can't just ignore them when that doesn't suit you.

As an overall package the new MacBooks are awesome. They certainly have their flaws, and I agree that overall they are very expensive. Compatibility can certainly be an issue as well, but IMO you overstate it. I doubt there will be that many straight-from-Windows, first-time-Mac conversions, and the usual Mac ecosystem is already quite well supported. If you are a power user ready to shell out $3-4K for a new laptop and too dumb to visit this compatibility site for 10 seconds (to verify your apps work), you deserve what you get.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
This may seem like common sense, preaching to the choir, but I am posting it anyway because there appears to be a commonly believed myth out there that M1 series Macs inherently need less RAM. As demonstrated, that isn't really true; they just hide insufficient RAM better.

Though, if you really don't notice the swapping, do you really "need" more RAM?
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
There have been persistent (if possibly inaccurate) claims that M1 Macs wear out their SSDs quickly from excess swapping.

IIRC, that was shown as a (now fixed) reporting bug.

My view on this is: if this is a money-making machine, then spend money boosting RAM to excess even if the benefit is imperceptibly small.

But if this is just a home/hobby computer, and swapping makes no perceptible performance difference, then you don't "need" more RAM.

OTOH if your Windows machine starts stuttering while swapping then you "need" more RAM.

So I view the needs-less-RAM "myth" as not that far off the mark.

M1 Macs don't "use" less RAM, but when it runs out, fast swapping makes the consequences largely irrelevant, so you don't really "need" it.
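
And if SSD wear from swapping is the actual worry, it can be measured rather than argued about. A rough sketch, assuming smartmontools is installed (e.g. via Homebrew) and that the internal SSD shows up as /dev/disk0; run it with sudo if the device can't be opened:

```python
# Read total host writes from the internal SSD's NVMe SMART log.
# Assumes smartmontools is installed (`brew install smartmontools`) and the
# internal drive is /dev/disk0. Per the NVMe spec, one "data unit" is
# 1000 * 512 bytes.
import re
import subprocess

out = subprocess.run(["smartctl", "-a", "/dev/disk0"],
                     capture_output=True, text=True).stdout

match = re.search(r"Data Units Written:\s+([\d,]+)", out)
if match:
    units = int(match.group(1).replace(",", ""))
    print(f"Total host writes: {units * 512_000 / 1e12:.2f} TB")
else:
    print("Data Units Written not reported for this device.")
```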
 

moinmoin

Diamond Member
Jun 1, 2017
4,956
7,676
136
(...) I wonder how Intel/AMD are going to keep up. (...) Is x86 becoming more and more marginalized?
Product-quality-wise? It looks that way. Does Intel need to worry? Unless Apple suddenly goes for the lower-budget mass market as well, not really.

So, as expected, the claim that macOS on M1 series chips needs less RAM than macOS on Intel is false. Well, sort of.

Max did some Lightroom/Photoshop tests on two M1 Pros, one with 16 GB of RAM and one with 32 GB, both configured identically and running the exact same software.

What he did notice is that even under heavy multitasking both felt very responsive, and noticeably quicker than a similarly configured 2019 Intel Mac with 16 GB RAM.

However, despite the fact that both M1 Pro machines felt fast, the 16 GB machine was hitting the swap a lot, and the 32 GB machine was not.

[Screenshot: swap usage under normal multitasking, 16 GB vs. 32 GB]

Then if he went beyond his normal amount of multitasking to something crazier, both hit the swap, but the 16 GB model did much worse:

[Screenshot: swap usage under heavier multitasking, 16 GB vs. 32 GB]

Despite this, the 16 GB still felt fast.

So if you look at this from a performance perspective, yes M1 series Macs hide swapping well, but the bottom line is that they still benefit significantly from that extra RAM, to reduce SSD wear if you're a heavy user. Also, a lot of users have reported that they run into stability issues when short on RAM, and upgrading to a model with more RAM eliminates those issues.

This may seem like common sense, preaching to the choir, but I am posting it anyway because there appears to be a commonly believed myth out there that M1 series Macs inherently need less RAM. As demonstrated, that isn't really true; they just hide insufficient RAM better.
Isn't that the issue that first came up at the original M1 launch, when people noticed this aggressive swap usage? I thought the topic was essentially resolved by pointing out that using the SSD this way makes sense for performance, and that the wear it introduces is still well within the SSD's life expectancy?