Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
23,583
996
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24576 concurrent threads
2.6 Teraflops (see the sanity check below)
82 Gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as it does with the iPhones and iPads: just one SKU (excluding the X variants), shared across all iDevices (aside from occasional slight clock speed differences).
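A quick sanity check on the GPU figures above: the quoted 2.6 Teraflops follows directly from the EU count if you assume 8 FP32 ALUs per EU and the roughly 1278 MHz GPU clock that reviewers have measured (neither of those two numbers is published by Apple). A minimal sketch in Python:

```python
# Hedged sanity check of the M1 GPU's quoted 2.6 TFLOPS.
# Assumptions (not Apple-published): 8 FP32 ALUs per EU, ~1278 MHz GPU clock.
eus = 128             # execution units, from the spec list above
alus_per_eu = 8       # assumed FP32 lanes per EU
flops_per_clock = 2   # one fused multiply-add counts as 2 FLOPs
clock_ghz = 1.278     # reported M1 GPU clock, not an official figure

tflops = eus * alus_per_eu * flops_per_clock * clock_ghz / 1000
print(f"{tflops:.2f} TFLOPS")  # ~2.62, matching the quoted 2.6
```

(The same arithmetic with the M2's 160 EUs and its slightly higher reported clock lands near the 3.6 Teraflops quoted further down.)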

EDIT:


M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


Second Generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K h.264, h.265 (HEVC), ProRes

M3 Family discussion here:

 

nxre

Member
Nov 19, 2020
60
103
66
For those who wished for a cheaper Mac from Apple: they just tore your dreams apart in this new interview
(And FYI: we also asked if Apple plans to introduce cheaper Macs, on the assumption that using its own silicon is more economical. "We don't do cheap—you know that," Joswiak admitted. "Cheap is for other people, because we try to build a better product.")
 

Eug

Lifer
Mar 11, 2000
23,583
996
126
Nah. An $899 MBA doesn't do it. It barely makes a difference. No one will bat an eye. It's like Apple pricing the iPhone SE $100 cheaper than the iPhone 12 mini ($700). No one will care about the SE if it's priced at $600.

Now a $700 or $750 MBA? That's something.
You really should make a prediction and stick to it.

You said $700. I said not a chance.

Then you revised your statement to say $800 is what you actually meant, but you said $700 because $700 is catchier.

Geez man, seriously?

If you have to revise your statement, then just revise it instead of making up some lame excuse thinking it's gonna give you a free credibility pass. Cuz it doesn't.
 

mikegg

Golden Member
Jan 30, 2010
1,740
406
136
You really should make a prediction and stick to it.

You said $700. I said not a chance.

Then you revised your statement to say $800 is what you actually meant, but you said $700 because $700 is catchier.

Geez man, seriously?
$700. $750. $800. $700 is what I think makes the most sense if I'm running Apple. $750 is probably what they will do. $800 is borderline meh.

For those who wished for a cheaper Mac from Apple: they just tore your dreams apart in this new interview

Apple will never make something that is "cheap": that's what he was saying. They're never going to the low end; they'll always sit above it.
 

Eug

Lifer
Mar 11, 2000
23,583
996
126
Weird. I just saw on MacRumors that Apple has logic boards listed for M1 Mac minis with 10 GigE, for repair shops.


I wonder if these will eventually prove to be unavailable, given that you can't actually order M1 Mac minis with 10 GigE.

Anyhow, I had thought this was going to be a differentiating feature. They were going to have the M1 Mac mini with Gigabit Ethernet, and then later sell a higher-end model with the 10 GigE option and more ports.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
Here's the thing. Hardware-accelerated encoding is very nice for real-time or performance-sensitive encoding. If I were going to stream a game on YouTube, I would certainly choose NVENC over anything else. But for movies I would choose a software solution: if you want maximum quality and flexibility, offline software encoding is the best.

It's similar to why offline rendering is still usually done on CPUs, even to this day.



They said this in that article:
One thing to keep in mind is that our testing was done with "VBR, 1 pass" since hardware encoding currently doesn't support 2 pass encoding. If you were to use 2 pass encoding with software encoding, the quality difference would be a bit more pronounced (although it would also take significantly longer).

Comparing 2-pass to CBR is a nonsense comparison. These days the only reason to use CBR is streaming, and obviously you can't use 2-pass for streaming. It sounds like he doesn't really understand encoding.

For offline encoding you do VBR. VBR is superior to CBR for what should be obvious reasons. There are two ways to do VBR:

a) Constant quality. The most-used VBR method, usually labeled CRF or CQ. It sets a quality target and varies the data rate as it goes to hit that target. HW encoders typically support constant quality.
b) 2-pass is also VBR; it just runs the whole file through once first to decide where to spend a fixed bit budget. This was a big thing years ago when people insisted on cramming DVD rips onto CD-ROMs, but it has little use now beyond hitting a fixed size for optical discs.

If you do a constant-quality run first, then a 2-pass run targeting the same file size, the quality will be the same. So constant quality achieves the same quality as 2-pass but does it faster, which is why people who understand what's going on almost never use 2-pass anymore.
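To make that concrete, here's a minimal sketch of the three approaches using Python to drive ffmpeg (assumes ffmpeg with libx264 is on the PATH; the file names and the 4000 kb/s target are placeholders, not recommendations):

```python
# Minimal sketch: constant quality vs. 2-pass vs. hardware encoding in ffmpeg.
import subprocess

SRC = "input.mov"  # placeholder clip

def ffmpeg(*args):
    subprocess.run(["ffmpeg", "-y", "-i", SRC, *args], check=True)

# a) Constant quality (CRF): one pass, data rate floats to hit the quality target.
ffmpeg("-c:v", "libx264", "-preset", "slow", "-crf", "20", "cq.mp4")

# b) 2-pass VBR: pass 1 analyzes the whole file, pass 2 spends a fixed bit
#    budget. Only worth it when you must hit an exact output size.
ffmpeg("-c:v", "libx264", "-preset", "slow", "-b:v", "4000k",
       "-pass", "1", "-an", "-f", "null", "-")
ffmpeg("-c:v", "libx264", "-preset", "slow", "-b:v", "4000k",
       "-pass", "2", "twopass.mp4")

# c) Hardware encode via Apple's VideoToolbox. Rate-control options are far
#    more limited than libx264's: no 2-pass and, at least historically,
#    no true CRF.
ffmpeg("-c:v", "h264_videotoolbox", "-b:v", "4000k", "hw.mp4")
```

If the CRF run happens to land near 4000 kb/s, run the 2-pass commands at that target and compare: per the argument above, the quality should be essentially identical, with 2-pass simply taking roughly twice as long.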

On the streaming side, I have seen a lot of comparisons since Turing's NVENC upgrade. The consensus for streaming: use NVENC if you have Nvidia, x264 if you don't. HW encoders are not created equal, but right now NVENC is so close to x264 that if you have an Nvidia card, it's the streaming method of choice. Intel is next in line but considered inferior to NVENC/x264, and AMD is considered dead last for HW H.264 streaming.

Unfortunately, you don't see much coverage of HW encoders' offline quality, and very little about Apple's HW encoding.

Hopefully as M1 Macs roll out we will see some more detailed analysis.
 

gdansk

Golden Member
Feb 8, 2011
1,988
2,359
136
Honestly, the battery life is really impressive. If I hadn't bought a MBP16 last year, I'd probably pick up an Air.
Anyone want to buy a heavily discounted Intel flamethrower? :p
 

name99

Senior member
Sep 11, 2010
404
303
136
For those who wished for a cheaper Mac from Apple: they just tore your dreams apart in this new interview


Maybe you are not a native English speaker? Or even from a different English speaking community within the US?
Cheap is a somewhat ambiguous English word. People think it means "low price" but it really means something like "low quality"; since low price and low quality usually go together you can see how the ambiguity arises...

Apple doesn't have a problem with low-priced items. But they aren't willing to sacrifice quality (make it "cheap") to hit that lower price with things like a lesser battery or a nasty screen, the way $400 laptops hit their price points.
 

Doug S

Platinum Member
Feb 8, 2020
2,203
3,405
136
Weird. I just saw on MacRumors that Apple has logic boards listed for M1 Mac minis with 10 GigE, for repair shops.


I wonder if these will eventually prove to be unavailable, given that you can't actually order M1 Mac minis with 10 GigE.

Anyhow, I had thought this was going to be a differentiating feature. They were going to have the M1 Mac mini with Gigabit Ethernet, and then later sell a higher-end model with the 10 GigE option and more ports.

The M1 Mac Mini is the "low end" Mini. They still have to sell the high end Mini, which will presumably come next year with the 8+4 chip. Maybe they plan to offer 10GbE as an option for that? I mean, connect a disk array to it via TB and give it 10 GbE and you've got yourself a departmental NAS.
 

Eug

Lifer
Mar 11, 2000
23,583
996
126
The M1 Mac Mini is the "low end" Mini. They still have to sell the high end Mini, which will presumably come next year with the 8+4 chip. Maybe they plan to offer 10GbE as an option for that? I mean, connect a disk array to it via TB and give it 10 GbE and you've got yourself a departmental NAS.
Yes, the 8+4 Mac mini next year will likely get the 10 GigE option, but the 10 GigE mobos in question are specifically 8-core (aka 4+4) models, and are available for order right now.

[Image: M1 Mac mini 10Gb Ethernet parts list]


---

Chrome native for M1 is much, much faster than the Intel version.

 

IntelCeleron

Member
Dec 10, 2009
41
0
66

Eug

Lifer
Mar 11, 2000
23,583
996
126
Judging by the various reviews out there, M1 is not really built to handle 8K video footage or some other types of >4K footage. It works, but the jerkiness in the timeline plaguing other Macs makes an appearance quite often. According to SoC monitoring software, it often maxes out the "GPU" although I'm not entirely sure what that means. Don't ask me what GPU monitor applet it is cuz I don't know, but I do note that Activity Monitor includes a GPU usage monitor as well. In contrast, the CPU cores aren't breaking a sweat.

n00b question:

Would it really be just the GPU, or does that include the image signal processor outside the GPU and CPU? If it truly is the GPU, then increasing the number of GPU cores would likely solve the problem. However, if it's the dedicated ISP outside the GPU, then they'd have to augment that somehow in order to deal with this issue.

IOW, should M1X be more CPU cores + more GPU cores with the same existing ISP, or should the ISP also get a boost?
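For what it's worth, one way to peek at this without a third-party applet is Apple's bundled powermetrics tool. A rough sketch (requires root; the sampler names come from `man powermetrics` and may vary across macOS versions, and as far as I know the ISP is not reported separately, which is exactly why attribution is tricky):

```python
# Rough sketch: sample Apple Silicon GPU activity while scrubbing a timeline.
# powermetrics ships with macOS but requires root to run.
import subprocess

result = subprocess.run(
    ["sudo", "powermetrics",
     "--samplers", "gpu_power",  # GPU frequency/residency sampler
     "-i", "1000",               # sample every 1000 ms
     "-n", "5"],                 # take five samples, then exit
    capture_output=True, text=True, check=True)

# Look for the GPU frequency / active-residency lines in the output;
# if they're pegged while the CPU clusters idle, the GPU really is the wall.
print(result.stdout)
```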
 

moinmoin

Diamond Member
Jun 1, 2017
4,934
7,619
136
AMD CPUs are in a fair number of laptops, but IMO most of them are junk. They seem to lean towards the market with bad battery life and annoying international keyboards, at least around here. All the nicer Windows laptops with good battery life seem to be Intel.

Maybe that will change with the new cores, but in the past AMD CPU in a laptop = red flag. I wonder how long it will take for AMD to shake that reputation.
While you are spot on about how laptop OEMs treated AMD chips in the past, this actually took a 180° turn this past spring with Ryzen Mobile 4000/Renoir, which reviewed unexpectedly well and has since seen far more demand from OEMs and consumers than there is supply. Everything you wrote is already in the past.

You got videos like this one at the launch (the first 1:20 min is sufficient for a first impression):

As I wrote before, it's a good thing for competition that Apple joins and surpasses the laptop competition with M1, since otherwise AMD could now sit back on the progress it has been making.

(The real story in there is how much Intel has been dropping the ball in the last couple of years.)
 

name99

Senior member
Sep 11, 2010
404
303
136
Judging by the various reviews out there, M1 is not really built to handle 8K video footage or some other types of >4K footage. It works, but the jerkiness in the timeline plaguing other Macs makes an appearance quite often. According to SoC monitoring software, it often maxes out the "GPU" although I'm not entirely sure what that means. Don't ask me what GPU monitor applet it is cuz I don't know, but I do note that Activity Monitor includes a GPU usage monitor as well. In contrast, the CPU cores aren't breaking a sweat.

n00b question:

Would it really be just the GPU, or does that include the image signal processor outside the GPU and CPU? If it truly is the GPU, then increasing the number of GPU cores would likely solve the problem. However, if it's the dedicated ISP outside the GPU, then they'd have to augment that somehow in order to deal with this issue.

IOW, should M1X be more CPU cores + more GPU cores with the same existing ISP, or should the ISP also get a boost?

I think we're going to see, over time, some fascinating changes here.
NPU, GPU, ISP, even media encoder, all, from the thousand foot level, perform the same sort of thing --- many many MANY small engines all operating in parallel on (hopefully) independent pieces of data.

How valuable will it remain to keep these separated (and, of course, slightly more optimized for each particular task) rather than creating a single super-throughput engine that has the specialist capabilities of an NPU (small integer multiplication mainly?), of a GPU (texture lookup and things like that), of a media encoder (I assume mainly the ability to compare slightly shifted pixel blocks against each other to find best matches), of an ISP (no idea what those are!) available to each unit; but all based on a common framework of registers, instruction scheduling, synchronization, cache, and so on?

There's an obvious win to this (the system can dynamically expand performance to whatever is the current task -- your GPU is 2x as fast if it can recruit all the computation latent in the NPU, ISP, and h.265 encoder!). And it's designing ONE thing rather than many, and writing more common code.
And there's a cost. More generality means more power usage, and spreading usage this way may require slightly more area.

Is it an overall win? My guess is it could actually be, once someone has had time to figure out the optimal set of common primitives. But will it happen soon (soon means, say, before 2025)? Not a clue! I have no idea what the degree of commonality between these blocks is once you start drilling into the details.
 

Eug

Lifer
Mar 11, 2000
23,583
996
126
While you are spot on about how laptop OEMs treated AMD chips in the past, this actually took a 180° turn this past spring with Ryzen Mobile 4000/Renoir, which reviewed unexpectedly well and has since seen far more demand from OEMs and consumers than there is supply. Everything you wrote is already in the past.

You got videos like this one at the launch (the first 1:20 min is sufficient for a first impression):

As I wrote before, it's a good thing for competition that Apple joins and surpasses the laptop competition with M1, since otherwise AMD could now sit back on the progress it has been making.

(The real story in there is how much Intel has been dropping the ball in the last couple of years.)
First off, the guy throwing the laptop for Linus to catch actually made me physically cringe. It put fear into me that he would drop it. :oops:

Good to know about the recent AMD laptops, but my bias may be showing here: all my colleagues are in the business crowd and are heavy travellers (Covid notwithstanding), so I've been steering them toward ultrabooks, which they always love, more in the 15-watt TDP tier. How well has AMD been doing in that market? I know the machines exist, but I haven't seen a ton of them locally. Then again, since the beginning of Covid I haven't been doing a lot of in-person laptop shopping.
 

moinmoin

Diamond Member
Jun 1, 2017
4,934
7,619
136
First off, the guy throwing the laptop for Linus to catch actually made me physically cringe. It put fear into me that he would drop it. :oops:

Good to know about the recent AMD laptops, but my bias may be showing here: all my colleagues are in the business crowd and are heavy travellers (Covid notwithstanding), so I've been steering them toward ultrabooks, which they always love, more in the 15-watt TDP tier. How well has AMD been doing in that market? I know the machines exist, but I haven't seen a ton of them locally. Then again, since the beginning of Covid I haven't been doing a lot of in-person laptop shopping.
Renoir is eating Intel for breakfast even in the ultrabook segment.


Availability is a major issue since apparently nobody (neither AMD nor the OEMs) expected the demand it'd cause.
 

Eug

Lifer
Mar 11, 2000
23,583
996
126
Do note that AMD reported the "highest ever quarterly mobile processor unit shipments and revenue" in its last financial results. And according to Mercury Research, AMD reached an all-time high in mobile (i.e. laptop) market share of 19.9% earlier this year.
Wouldn't that suggest the laptops are ramping up more this quarter, as opposed to last quarter?

EDIT:

Yes it does:

AMD's mobile processor sales doubled YoY, and another 30+ models will join the 50+ notebooks on the market by the end of the year. That means AMD has a solid runway for more growth.
 

moinmoin

Diamond Member
Jun 1, 2017
4,934
7,619
136
Wouldn't that suggest the laptops are ramping up more this quarter, as opposed to last quarter?

EDIT:

Yes it does:

AMD's mobile processor sales doubled YoY, and another 30+ models will join the 50+ notebooks on the market by the end of the year. That means AMD has a solid runway for more growth.
It would have grown even faster if it weren't for shortages. It was a common sight for OEMs to announce laptop models they were then unable to ship, with shipment dates pushed further and further out because they couldn't get the necessary processors. In some cases models were canceled altogether.

Anyway, all this is pretty off-topic in this thread. The point is that AMD in laptops is on a huge upswing right now. Apple won't have any trouble selling its M1 Macs either. The big loser in both cases is Intel.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,670
1,250
136
Anyway, all this is pretty off-topic in this thread. The point is that AMD in laptops is on a huge upswing right now. Apple won't have any trouble selling its M1 Macs either. The big loser in both cases is Intel.

I think M1 is way worse for Intel than just lost Mac revenue, because it brings performance and efficiency to the front of everyone's mind. AMD's products just compare better against the M1 than Intel's do, and by a huge margin, and everyone in Windows-land is going to be looking for products they can mentally justify and be proud of versus the new Macs. If Cezanne doesn't take too long to show up, and AMD can keep up with demand, then they can make huge strides in laptops.
 

Eug

Lifer
Mar 11, 2000
23,583
996
126

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
Mediocre showing overall ...

Even with a process technology advantage, the M1 only manages to be somewhat competitive against the latest x86 options, and only in situational cases. Rosetta 2 is proving to be a disaster for higher-performance workloads, since Apple can barely keep up with either AMD or Intel ...

The fact that Apple had to capitalize on the latest 5nm process and still can't resoundingly beat its x86 competitors should raise tons of red flags about the future of the ARM architecture, because its window of opportunity through process technology keeps shrinking as Moore's Law potentially grinds to a halt for extended periods of time. TSMC can only promise that it will be able to scale down to 3nm and eventually 2nm; it stops making any hard guarantees after that. We might see maybe one more shrink after 2nm before the quantum tunneling effect starts dominating circuit behaviour ...

All AMD (or Intel) has to do is wait until they can transition to the latest process as well, and they'll automatically undo any of the gains that Apple or any other ARM vendor achieved in their designs ...