Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
23,587
1,001
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24576 concurrent threads
2.6 Teraflops
82 Gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from occasional slight clock-speed differences).
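The headline GPU figures in the spec summary above are internally consistent, assuming each of the 128 "execution units" is 8-wide (1024 FP32 ALUs total) and the commonly reported ~1.278 GHz clock, which is an external estimate rather than an Apple-published number. A quick sanity check:

```python
# Sanity-checking M1 GPU headline numbers from the published configuration.
# The ~1.278 GHz clock is an estimate from third-party reporting, not Apple.
ALUS = 1024        # 8 cores x 128 ALUs per core (128 EUs, each 8-wide)
TMUS = 64          # assumed texture units
ROPS = 32          # assumed render output units
CLOCK_GHZ = 1.278  # estimated

tflops = ALUS * 2 * CLOCK_GHZ / 1000  # 2 FLOPs per ALU per clock (FMA)
gigatexels = TMUS * CLOCK_GHZ
gigapixels = ROPS * CLOCK_GHZ

print(f"{tflops:.1f} TFLOPS, {gigatexels:.0f} GTexel/s, {gigapixels:.0f} GPixel/s")
# → 2.6 TFLOPS, 82 GTexel/s, 41 GPixel/s
```

All three outputs match the figures in the list above, which suggests the quoted numbers are simply the theoretical peaks at that clock.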

EDIT:


M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


M2
Second-generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, H.265 (HEVC), and ProRes

M3 Family discussion here:

 
Last edited:

Doug S

Platinum Member
Feb 8, 2020
2,267
3,519
136
Or it could be the HDMI 2.0 ports:


(okay okay, it's not really related to M1X but still)

Only thing that's making me arch an eyebrow here is the ST performance. It hasn't moved much since M1. A15 didn't move the bar much either (versus A14). It's like Apple is giving Qualcomm and AMD time to play catchup.

Yep that's another example of people focusing on meaningless stuff. You have Thunderbolt ports capable of outputting 4Kp120 if you actually need it. They only brought back HDMI as a nod towards stuff like using your Mac for presentations without carrying an adapter around, that's what people mostly complained about when Ive removed the HDMI port.

Based on the name and everything we know so far this is just an extended M1, and didn't use anything from A15 like the improved little core. Apple may have devoted most of their engineering resources toward the design of M1 and M1 Pro / M1 Max and that's why A15's big core didn't get an update.

We'll have to see if this is a one off or a change in direction where they don't update everything every year like we've become used to. Back when they could get generational gains of 20-40% that was totally worth it. But the higher IPC climbs the more difficult further gains become (and you get less and less from process each generation too) so maybe rather than give us smaller gains every year they will devote engineering effort towards every other year updates that are more meaningful.

If they went to an every other year cadence it wouldn't necessarily mean half the time nothing changes. Maybe one year you get a new big core and improved IPU, the next you get a new little core and new GPU core, etc. Based on Apple's release cadence for the Mac line in the past, I think we probably see a new "M" every other year, not yearly, even if they continue to do generational updates for the "A" and the A15 recycling A14's big core is just an aberration caused by all the effort around making sure the first gen of ARM Macs went off without a hitch.
 

insertcarehere

Senior member
Jan 17, 2013
639
607
136
Just keep in mind that the M1 Max would not have been designed and manufactured on 7N instead of 5N.
TSMC's charts show a density increase of up to 1.8x for 5N (much, much less for the large amount of SRAM in the M1 lineup). The M1 Max, in particular, would have been unreasonably large on 7N.

Counterpoint: The primary reason the M1 Max may not work on 7N has nothing to do with the CPU side, rather, it has to do with making an iGPU that (at least in theory) matches the performance of the very best mobile GPUs...on less than half the power budget. In terms of iGPUs for PCs, this is in a completely different galaxy to anything that has come before in terms of design.

The actual 'CPU' parts of the M1 don't take up much area at all. From https://semianalysis.com/apple-a14-...sis-terrifying-implications-for-the-industry/, 2x Firestorm + 8MB shared L2 (Architecture has no L3) takes up ~7.5mm^2 in the A14, given that the M1 Pro is most likely 8x Firestorm + 24MB shared L2, that's ballpark around ~30mm^2 for the performance cluster + ~3-4mm^2 for the efficiency cluster.

Even if we arbitrarily "assign" the 32MB SLC (~15mm^2) to the CPU as cache (dubious reasoning, given that other parts of the SoC, e.g. the GPU, can directly access and benefit from that cache in ways a normal CPU L3 cannot), that'd still be around 50mm^2 combined for the whole thing on 5nm. The majority is SRAM that (in your own words) doesn't shrink well on smaller nodes, and conversely doesn't grow much on bigger nodes either.

In comparison:

~53mm^2 for Cezanne 8 core complex w/ 16MB L3 at TSMC 7nm.

~43mm^2 for TGL 4 cores w/ L3 at Intel 10nm.

Even accounting for the node advantage, which probably isn't that large density-wise given all the cache, the Apple architectures seem to do just fine in terms of area efficiency. They'd almost certainly look diminutive next to Golden Cove, the only architecture that may come anywhere close in IPC.
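For what it's worth, the ballpark arithmetic above can be written out explicitly. This is only a linear scaling of the cited A14 measurement, not a die-shot figure; the efficiency-cluster and SLC areas are the rough estimates from the post:

```python
# Rough area scaling from the cited A14 measurement (~7.5 mm^2 for
# 2 Firestorm cores + 8 MB shared L2 on N5). Treating area as linear in
# core count is a simplifying assumption, not a measurement.
a14_cluster_mm2 = 7.5                       # 2 P-cores + 8 MB L2
perf_cluster = a14_cluster_mm2 * (8 / 2)    # ~30 mm^2 for 8 P-cores (L2 grows 3x, not 4x, so slightly generous)
eff_cluster = 3.5                           # ~3-4 mm^2 estimate for the E-core cluster
slc = 15.0                                  # ~32 MB SLC, if charged entirely to the CPU

total = perf_cluster + eff_cluster + slc
print(f"~{total:.0f} mm^2")  # → ~48 mm^2, i.e. "around 50 mm^2 combined"
```

Which lands right at the ~50 mm^2 figure quoted against Cezanne's ~53 mm^2 and TGL's ~43 mm^2.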
 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
7,842
5,994
136
But yeah, that's what I was thinking: gaming is not Apple's priority; in fact, they don't care at all. But they now have some very interesting SoCs that would allow them to enter that market, if they wish.

Apple has had such an SoC for years now in their A-series chips that they put in their iDevices. I'm fairly certain that whatever they had 3 or 4 years ago could stand up to the Switch which is just using some Tegra chip.

So, I've been looking for someone to say this, and I haven't seen it yet...

Will it mine? With 400GB/sec of memory bandwidth, and with it just sipping power, these devices have the makings of VERY efficient coin miners.

400 GB/s is only on par with a 3060, so the initial investment for a MacBook Pro is significantly higher than it would be for a discrete card. It probably holds resale value a lot better than a GPU and wouldn't see the same kind of price collapse when used supply floods the market, but I doubt many miners would look at it as a viable alternative.

I know that there were miners buying laptops with gaming cards in them to mine, but how many of those were expensive high-end premium products with Apple levels of markup?

Best case scenario is that you can mine with your MacBook Pro while you're not using it and help recoup some of the cost. I don't think the ROI is good enough to otherwise buy them for the express purpose of mining, especially if some of the Chinese miners are in the process of getting out of the game and selling their own cards in bulk.
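A rough way to see why 400 GB/s "only" matches a 3060 for mining: Ethash is memory-bound, with each hash performing 64 random 128-byte DAG reads (8 KiB of traffic), so bandwidth divided by 8 KiB gives a hard upper bound on hashrate. A sketch (the bound is theoretical; real hashrates land somewhat lower):

```python
# Back-of-the-envelope Ethash ceiling from memory bandwidth alone.
# Each Ethash hash does 64 random 128-byte DAG reads = 8 KiB of traffic,
# so bandwidth / 8 KiB is a hard upper bound on hashrate.
BYTES_PER_HASH = 64 * 128            # 8192 bytes per hash
bandwidth = 400e9                    # M1 Max: 400 GB/s

ceiling_mhs = bandwidth / BYTES_PER_HASH / 1e6
print(f"~{ceiling_mhs:.0f} MH/s upper bound")  # → ~49 MH/s
```

That ~49 MH/s ceiling is in the same neighborhood as what a 3060-class card actually achieves, which is why the bandwidth parity doesn't translate into a mining advantage at MacBook Pro prices.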

The best news is that a MacBook Pro is effectively no more expensive than a similarly configured high-end gaming PC. I looked at what Alienware had the other day and their 17" model comes with a 3070, 32 GB of RAM, and a 1 TB SSD. It's only about $200 less expensive and in terms of raw performance the M1 Max should beat a mobile 3070.
 

Eug

Lifer
Mar 11, 2000
23,587
1,001
126

Ajay

Lifer
Jan 8, 2001
15,458
7,862
136
Counterpoint: The primary reason the M1 Max may not work on 7N has nothing to do with the CPU side, rather, it has to do with making an iGPU that (at least in theory) matches the performance of the very best mobile GPUs...on less than half the power budget. In terms of iGPUs for PCs, this is in a completely different galaxy to anything that has come before in terms of design.
I should have been clearer. My point is that Apple likely wouldn't have implemented the M1 Max SoC on 7N - it would have been quite a bit larger and had very different characteristics which are impossible to know from a few simple data points. Hence the argument over 7N vs 5N is moot. In reality, it's moot anyway since the M1 class SoCs are only made on 5N.

Even if we arbitrarily "assign" the 32MB SLC (~15mm^2) to the CPU as cache (dubious reasoning, given that other parts of the SoC, e.g. the GPU, can directly access and benefit from that cache in ways a normal CPU L3 cannot), that'd still be around 50mm^2 combined for the whole thing on 5nm. The majority is SRAM that (in your own words) doesn't shrink well on smaller nodes, and conversely doesn't grow much on bigger nodes either.

Well, especially with the Max, there is also a lot of logic; NPUs, GPU (especially) and other fixed function units. So, yes, it would have been a mixed bag. I was a bit hyperbolic. Thanks for catching me on that. This is what I deserve for popping into a thread that I said I'd stay out of till we had reviews in hand and more info on the new M1 SoCs :p.
 

defferoo

Member
Sep 28, 2015
47
45
91
Very strange that the scaling from Pro to Max is not 2x given the increase in memory bandwidth and cores. We need to see more benchmarks to figure out what's going on.
 

Mopetar

Diamond Member
Jan 31, 2011
7,842
5,994
136
I'll be so happy if mining is outlawed at some point. Not only is it a pyramid scheme (IMO) but it is terrible for the environment. If you're a miner you should be ashamed.

I've never found these arguments compelling. There's nothing that inherently makes mining inefficient, but any proof of work algorithm is going to create duplicate efforts. Most have realized that if they want something that's useful as a currency then alternatives like proof of stake are required in order to achieve a respectable transaction volume.

Cryptocurrency is also no more of a pyramid scheme than anything else. That it's a type of commodity only makes it susceptible to the same kinds of manipulations as the rest of the stock market. The phrase pump and dump existed long before BitCoin ever came around.

Further the biggest complaints are coming from gamers who want to use GPUs to crunch numbers to make pretty pixel arrangements for their own amusement. The computation resources going towards mining weren't being used for some more noble purpose prior to that point, so it's seriously difficult to make that argument either.

Your argument ultimately boils down to "things I don't like ought to be illegal because I don't like them." Never mind that the same reasoning would make other things that you do enjoy illegal. I don't even mine or hold any cryptocurrency, but the sentiment against it is rather ridiculous.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
I've never found these arguments compelling. There's nothing that inherently makes mining inefficient, but any proof of work algorithm is going to create duplicate efforts. Most have realized that if they want something that's useful as a currency then alternatives like proof of stake are required in order to achieve a respectable transaction volume.

Cryptocurrency is also no more of a pyramid scheme than anything else. That it's a type of commodity only makes it susceptible to the same kinds of manipulations as the rest of the stock market. The phrase pump and dump existed long before BitCoin ever came around.

Further the biggest complaints are coming from gamers who want to use GPUs to crunch numbers to make pretty pixel arrangements for their own amusement. The computation resources going towards mining weren't being used for some more noble purpose prior to that point, so it's seriously difficult to make that argument either.

Your argument ultimately boils down to "things I don't like ought to be illegal because I don't like them." Never mind that the same reasoning would make other things that you do enjoy illegal. I don't even mine or hold any cryptocurrency, but the sentiment against it is rather ridiculous.

The arguments against crypto coins are very compelling to me.

Coin mining is unnecessary busy work that now consumes more power and creates more emissions than a small-to-medium-sized country, at a time when the future environment is at grave risk from excess consumption and emissions.

Crypto coins are not an actual commodity. They have no real utility or baseline value, so their value could easily evaporate to the nothing they are based on.

It's an expensive virtual frenzy, chasing the tail of something with no inherent value but causing real harm.
 

jpiniero

Lifer
Oct 1, 2010
14,605
5,225
136
Crypto coins are not an actual commodity. They have no real utility, or baseline value. So their value could easily evaporate to the nothing which they are based on.

It's an expensive virtual frenzy, tail chasing something with no inherent value, but causing real harm.

The endgame is crypto replacing the US dollar as the reserve currency due to US financial policy (i.e. see the money pumping, runaway inflation that's going on, etc.).
 

Mopetar

Diamond Member
Jan 31, 2011
7,842
5,994
136
The arguments against crypto coins are very compelling to me.

Coin mining is unnecessary busy work that now consumes more power and creates more emissions than a small-to-medium-sized country, at a time when the future environment is at grave risk from excess consumption and emissions.

Crypto coins are not an actual commodity. They have no real utility or baseline value, so their value could easily evaporate to the nothing they are based on.

It's an expensive virtual frenzy, chasing the tail of something with no inherent value but causing real harm.

Using GPUs for gaming is just unnecessary busy work though since the utility is entirely subjective to the person doing it. Should we outlaw high-end gaming because it's wasteful?

I'm not sure what you mean by cryptocurrency not being an actual commodity. It's a limited good that's bought and sold; by definition that's a commodity, with the only difference from traditional commodities being that it's digital. Non-tangible property has been around for a long time, and people buy and sell rights that have no more real existence than a BitCoin.

Similarly, its value arises as a secure means of exchange, which is the inherent value of all currency. A US dollar isn't worth any particular value by decree; it's valuable because it enables exchanges of goods and services, can be used to pay taxes, and has a massive economy behind it. Stop the economic activity and the dollar becomes worthless because there's nothing to buy with it. Plenty of countries have discovered that the value of a dollar isn't its mere existence.

Cryptocurrency is doing nothing new outside of offering a decentralized system for exchanges that replace what we use banks for. Almost all of the fraud or examples of Ponzi schemes come from people that intentionally move transactions off of the blockchain and into something akin to a centralized bank which has absolutely no regulations unlike traditional banks.

It's neither a great evil that must be destroyed nor some kind of savior of humanity that will usher in a new era. A lot of BitCoin's value comes simply from having a limited supply that grows more slowly than other currencies/commodities, and from the fact that if you live under a totalitarian government you can use a cryptocurrency to get your wealth out from under that government's control.

Frankly it's not much different than TOR in a certain way. It's just a tool and people can use it for good or ill. Treating it like it's some kind of never before seen dark sorcery is just a fundamental misunderstanding of how it's not conceptually different from what we've already been doing or have already had for decades or even centuries.
 

Mopetar

Diamond Member
Jan 31, 2011
7,842
5,994
136
Do the M1 Max and M1 Pro share the same TDPs? If so, it could be a result of hitting power limits with the larger chip.

Since you can put any of the configurations in the 14" model of the new MacBook Pro, the cooling system must be built for whatever the 32-core GPU variant of the Max can put out.

I doubt Apple has configured the Pro or cut-down Max chips to take advantage of the extra headroom and boost clocks above the other configurations. At least not as far as the CPU is concerned, since it would mean the less expensive product could outperform the more expensive one in some way, and Apple doesn't want that. With the GPU this could occur to some extent, because they're selling different numbers of cores and it's less efficient to simply double the clock speed of a 16-core part to match the performance of a 32-core part.

I imagine Apple has the chips locked down. If we do get any desktops using these expect higher clocks, just like we saw with the M1 that went into the Mac Mini.
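One way to see why clocking a 16-core part up to match a 32-core part is inefficient: dynamic power scales roughly with f·V², and voltage must rise roughly with frequency, so power per core goes roughly as f³. A purely illustrative sketch (these are not measured M1 figures):

```python
# Why doubling clocks on half the cores is inefficient: dynamic power
# scales roughly with f * V^2, and V must rise roughly with f, so power
# per core goes roughly as f^3. Illustrative only; not measured figures.
def relative_power(cores: int, clock: float) -> float:
    """Relative power for `cores` cores at relative clock `clock`."""
    return cores * clock**3

p32 = relative_power(32, 1.0)  # 32 GPU cores at base clock
p16 = relative_power(16, 2.0)  # 16 cores at double clock, same nominal throughput

print(p16 / p32)  # → 4.0 (four times the power for the same work)
```

Even with a more forgiving exponent than 3, the clocked-up part loses badly, which is why wide-and-slow is the standard choice for GPUs.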
 

Eug

Lifer
Mar 11, 2000
23,587
1,001
126
I imagine Apple has the chips locked down. If we do get any desktops using these expect higher clocks, just like we saw with the M1 that went into the Mac Mini.
???

The M1 Mac mini has the same clock speed as the M1 MacBook Pro and even M1 MacBook Air. Same goes for the 24" iMac.

All are 3.2 GHz. In fact, even the iPad Pro M1 is 3.2 GHz.
 

DrMrLordX

Lifer
Apr 27, 2000
21,634
10,851
136
I'll be so happy if mining is outlawed at some point. Not only is it a pyramid scheme (IMO) but it is terrible for the environment. If you're a miner you should be ashamed.


Here you go! Prepare to bow to your crypto-mining overlords.

(don't worry, ETH mining stops in June at the latest, so lighten up a little. Nobody in their right mind will be buying up M1X MacBook Pros when they could never achieve ROI)
 

LightningZ71

Golden Member
Mar 10, 2017
1,628
1,898
136

Here you go! Prepare to bow to your crypto-mining overlords.

(don't worry, ETH mining stops in June at the latest, so lighten up a little. Nobody in their right mind will be buying up M1X MacBook Pros when they could never achieve ROI)

Eth went PoS years ago... right? They swore it was coming "real soon now!" Any second...

There's no need to worry about recovering the original purchase price, as used Macs still go for near-new prices. If someone actually builds optimized libraries and hashers, it's over.
 

DrMrLordX

Lifer
Apr 27, 2000
21,634
10,851
136
Eth went PoS years ago... right? They swore it was coming "real soon now!" Any second...

It's finally here. May 2022 is the latest "go live" date for the merge, with mining supposedly ending in June 2022. Not sure that Macbook Pros used for mining would hold all of their value, but you never know!
 

Red_m

Junior Member
Aug 29, 2021
6
23
36
I'm still slightly disappointed at 0% ST gains compared to M1 though.
Apple likely did not want to push clocks, as that means more heat and more power draw for a minimal speed increase.
The 16-core score corresponds to what's expected from M1 results. The 32-core score seems too low.

The M1 Pro score is twice the M1, which is more expected.
Keep in mind that's OpenCL. OpenCL has been deprecated in macOS and is rotting.

Metal benchmarks are the ones we need to look out for.