Discussion Apple Silicon SoC thread


Eug

Lifer
Mar 11, 2000
23,807
1,385
126
M1
5 nm
Unified memory architecture - LPDDR4X
16 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 12 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache
(Apple claims the 4 high-efficiency cores alone perform like a dual-core Intel MacBook Air)

8-core iGPU (but there is a 7-core variant, likely with one inactive core)
128 execution units
Up to 24576 concurrent threads
2.6 Teraflops
82 Gigatexels/s
41 gigapixels/s

16-core neural engine
Secure Enclave
USB 4
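The 2.6 TFLOPS figure above is consistent with a quick back-of-envelope check. Note the ALU count per EU and the GPU clock below are widely reported estimates, not Apple-published numbers:

```python
# Back-of-envelope check of the M1 GPU's quoted 2.6 TFLOPS.
# Assumptions (estimates, not official): 8 FP32 ALUs per execution
# unit, 2 FLOPs per ALU per cycle (fused multiply-add), ~1.278 GHz clock.
EUS = 128
ALUS_PER_EU = 8
FLOPS_PER_ALU_CYCLE = 2      # one FMA counts as two FLOPs
CLOCK_GHZ = 1.278            # estimated GPU clock

tflops = EUS * ALUS_PER_EU * FLOPS_PER_ALU_CYCLE * CLOCK_GHZ / 1000
print(f"{tflops:.2f} TFLOPS")  # ~2.62, matching the quoted 2.6 TFLOPS
```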

Products:
$999 ($899 edu) 13" MacBook Air (fanless) - 18 hour video playback battery life
$699 Mac mini (with fan)
$1299 ($1199 edu) 13" MacBook Pro (with fan) - 20 hour video playback battery life

Memory options 8 GB and 16 GB. No 32 GB option (unless you go Intel).

It should be noted that the M1 chip in these three Macs is the same (aside from GPU core count). Basically, Apple is taking the same approach with these chips as they do with the iPhones and iPads: just one SKU (excluding the X variants), which is the same across all iDevices (aside from maybe slight clock speed differences occasionally).

EDIT:


M1 Pro 8-core CPU (6+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 14-core GPU
M1 Pro 10-core CPU (8+2), 16-core GPU
M1 Max 10-core CPU (8+2), 24-core GPU
M1 Max 10-core CPU (8+2), 32-core GPU

M1 Pro and M1 Max discussion here:


M1 Ultra discussion here:


M2 discussion here:


Second Generation 5 nm
Unified memory architecture - LPDDR5, up to 24 GB and 100 GB/s
20 billion transistors

8-core CPU

4 high-performance cores
192 KB instruction cache
128 KB data cache
Shared 16 MB L2 cache

4 high-efficiency cores
128 KB instruction cache
64 KB data cache
Shared 4 MB L2 cache

10-core iGPU (but there is an 8-core variant)
3.6 Teraflops

16-core neural engine
Secure Enclave
USB 4

Hardware acceleration for 8K H.264, HEVC (H.265), and ProRes
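The "up to 100 GB/s" figure follows directly from the memory configuration widely reported for the M2 (LPDDR5-6400 on a 128-bit bus; Apple itself only states the headline bandwidth):

```python
# Peak memory bandwidth = bus width (bytes) x transfer rate (GT/s).
# Assumes LPDDR5-6400 on a 128-bit bus, the widely reported M2 config.
bus_width_bytes = 128 // 8   # 128-bit unified memory bus
transfers_per_sec = 6.4e9    # LPDDR5-6400: 6400 MT/s

bandwidth_gbs = bus_width_bytes * transfers_per_sec / 1e9
print(f"{bandwidth_gbs:.1f} GB/s")  # 102.4, marketed as "up to 100 GB/s"
```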

M3 Family discussion here:


M4 Family discussion here:

 

SpudLobby

Senior member
May 18, 2022
976
670
106
Read those two paragraphs side by side...

The Apple modem will be "comparable" to the QC modem. It will probably be superior along the dimensions Apple cares about (most importantly energy) and possibly slightly inferior in terms of peak bps.

Either way IT WILL NOT MATTER for anyone except the dick-measuring brigade, for precisely the reasons encapsulated in Eug's post. People are happy to insist that some network metric is the most important thing in the universe -- but revealed preference (when it's your own money you have to spend) shows that most people are perfectly happy with what they have right now...
I honestly won’t be surprised if it’s significantly worse in a technical sense, see: band compatibility, power, reception in poor coverage


IOW: would not expect it to be better but don’t expect it to be as bad as Intel modems in say 2017-2019, which were really inferior to Qualcomm’s stuff in iPhones, notably so. Or similarly the X55 in the iPhone 12 was much worse than the X60 in the 13, both on feature sets with 5G and power or reception (all due to both the node and the new bands or aggregation stuff).

Even then, it didn’t break the iPhone, and where we are in 5G I strongly suspect it won’t be any kind of clusterfuck like the Intel modems or the X55.

Nonetheless I think comparing Qualcomm RF to Intel’s CPUs in Apple’s vertical integration attempts is very funny and will age poorly even if Apple modems end up fine.
 

SpudLobby

Senior member
May 18, 2022
976
670
106
So despite the rumors it is the same A18 die. There would be more differences than just one GPU core if they were doing two dies.
Yeah, what’s also interesting then is that it means the A18 Pro features they listed off besides the GPU core are true for the A18 too, but they didn’t bother mentioning that.

That said, they claimed the A18 Pro has larger caches when mentioning the CPU specifically. I wonder if this again is true for the A18 as well and it’s just bad communication?
 

jdubs03

Senior member
Oct 1, 2013
803
408
136
I honestly won’t be surprised if it’s significantly worse in a technical sense, see: band compatibility, power, reception in poor coverage


IOW: would not expect it to be better but don’t expect it to be as bad as Intel modems in say 2017-2019, which were really inferior to Qualcomm’s stuff in iPhones, notably so. Or similarly the X55 in the iPhone 12 was much worse than the X60 in the 13, both on feature sets with 5G and power or reception (all due to both the node and the new bands or aggregation stuff).

Even then, it didn’t break the iPhone, and where we are in 5G I strongly suspect it won’t be any kind of clusterfuck like the Intel modems or the X55.

Nonetheless I think comparing Qualcomm RF to Intel’s CPUs in Apple’s vertical integration attempts is very funny and will age poorly even if Apple modems end up fine.
They know what the X75 can do. The X80 comes out in a few months and will be in next year's devices (rumor is just the Pros). So that really is their target. I doubt they'd want to release something that is over a generation behind the latest component they put in their flagship phone (and that would be two generations behind in only a few months).
Ideally, they wouldn’t release something until it could be on the same level.
 

The Hardcard

Senior member
Oct 19, 2021
207
302
106
They know what the X75 can do. The X80 comes out in a few months and will be in next year's devices (rumor is just the Pros). So that really is their target. I doubt they'd want to release something that is over a generation behind the latest component they put in their flagship phone (and that would be two generations behind in only a few months).
Ideally, they wouldn’t release something until it could be on the same level.
They can only do what they can. The issue is patents. When Qualcomm started working on CDMA, most of the rest of the industry thought it was physically impossible, or at least not worth the effort. Qualcomm scored big time on that. It is almost impossible to have a modem targeting the US market that both avoids Qualcomm patents and is not inferior in capability and/or power consumption.

You have to really want to avoid Qualcomm's products to try to find another path, since a company needs tremendous capital to dump on an expensive project that at best won't pay off for many years.
 

FlameTail

Diamond Member
Dec 15, 2021
3,855
2,297
106
They can only do what they can. The issue is patents. When Qualcomm started working on CDMA, most of the rest of the industry thought it was physically impossible, or at least not worth the effort. Qualcomm scored big time on that. It is almost impossible to have a modem targeting the US market that both avoids Qualcomm patents and is not inferior in capability and/or power consumption.

You have to really want to avoid Qualcomm's products to try to find another path, since a company needs tremendous capital to dump on an expensive project that at best won't pay off for many years.
There are many phones in the US (especially low-end ones) that use non-Qualcomm modems, such as Samsung or MediaTek.
 

Doug S

Platinum Member
Feb 8, 2020
2,742
4,664
136
Yeah, what’s also interesting then is that it means the A18 Pro features they listed off besides the GPU core are true for the A18 too, but they didn’t bother mentioning that.

That said, they claimed the A18 Pro has larger caches when mentioning the CPU specifically. I wonder if this again is true for the A18 as well and it’s just bad communication?

Hard to believe they'd do a custom die just to mess with the cache size. How could it be worth a second die just to save a couple mm^2? Whatever they may have done, it doesn't appear to show up in GB 6.3 results.

Hypothetically, if they wanted to bin on P-core shared L2 or on SLC, they could add some fuses to blow to chop it down by 25% and bin on that as well as on the GPU core. I mean, that's more chip area to bin on, but they're still likely to be cutting out working cache and a working GPU core on 90%+ of the non-Pros, whatever they're binning on.

I've always wondered if they will bin on power between the Pro and non Pro lines. Even at Apple's fixed frequency there will be dies that run at a lower voltage. They seem to use a fixed voltage along with the fixed frequency - at least I think there's good evidence they have done this so far - but if they wanted to make the Pro line "better" to help justify the additional cost they could give it the SoCs that operate at lower power. Makes sense given that it has to power an extra GPU core (and maybe a little extra cache?)
 

poke01

Platinum Member
Mar 8, 2022
2,094
2,626
106
Enjoying a bit of drama unfolding in a Teams meeting with participants (including directors and CEOs) from all over the country and this one dude who was supposed to do a presentation about his properties company is using a frickin' iPad and he doesn't know how to use it or its limitations while using Teams. So after wasting about 10 minutes for him to get his act together, he is now going to share the presentation to someone with an actual computing device so he can voiceover. Such amazing and "free" publicity for the iPad.

I bet the presentation would've gone better if he had let his 5 year old kid help :D
The iPad is a tablet running a tablet OS; that guy should have got a Surface Pro or a MacBook. There's a reason why Apple still sells MacBooks.

Must have been funny, lol
 

poke01

Platinum Member
Mar 8, 2022
2,094
2,626
106
Anyway…

I’m expecting big increases in battery life for these iPhone 16 Pros. They finally mentioned better battery life in the keynote, and the last time they did that was for the 13 Pros.

The iPhone 13 Pro Max had better battery life than the iPhone 14 Pro Max, and even the 15 Pro Max was mediocre.
 

DZero

Member
Jun 20, 2024
126
56
61
Anyway…

I’m expecting big increases in battery life for these iPhone 16 Pros. They finally mentioned better battery life in the keynote, and the last time they did that was for the 13 Pros.

The iPhone 13 Pro Max had better battery life than the iPhone 14 Pro Max, and even the 15 Pro Max was mediocre.
How about the 12 Pro Max?
 

The Hardcard

Senior member
Oct 19, 2021
207
302
106
CDMA 3G is dead, no reason to be limited to Qualcomm modems in the US anymore or using them only for AT&T/T-mobile like Apple did with their Intel modems.
I was fuzzy on the situation. All 5G uses Qualcomm tech. Samsung and MediaTek have to pay Qualcomm, and they do.

Apple was trying to get around at least some of Qualcomm's patents. There are a couple of Qualcomm patents that they went to court to have invalidated, but they lost. Some articles claim they would have had to pay Qualcomm regardless to have a working modem. Maybe they are trying to minimize the amount.
 

The Hardcard

Senior member
Oct 19, 2021
207
302
106
This goes to show that TOPS is a meaningless metric. Benchmarks confirm this as well; you need good APIs to take advantage.
TOPS is a meaningful metric, but as Jensen says in his presentations, doing the software work is a much larger task than building the hardware. Nvidia, which is so much further along than everyone else, continues to release software frameworks that massively boost the capabilities of their chips. Just last month they released new software that boosted the Llama 3.1 throughput of the H100 by 30 percent.

If Nvidia can make huge boosts like that from software optimizations in late 2024, that probably means none of these chips are anywhere near fully optimized yet. Just like Apple getting a 25% boost with the upcoming iOS/macOS releases, there are probably several more big boosts from optimizations to come for every player.

But you must have the TOPS to optimize the TOPS. It does mean that straight hardware comparisons are impossible: AI benchmarks are always going to be mixtures of hardware and software frameworks.
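The argument above can be sketched as a toy utilization model: peak TOPS only bounds what software can extract, and a software-side gain (like the 30% H100 example) raises delivered throughput without touching the hardware. All numbers here are illustrative, not measured:

```python
# Toy model: achieved throughput = peak TOPS x software utilization.
# Peak TOPS and utilization figures below are hypothetical examples.
def achieved_tops(peak_tops: float, utilization: float) -> float:
    """Delivered throughput given how well software uses the hardware."""
    return peak_tops * utilization

peak = 38.0                                # hypothetical NPU peak TOPS
before = achieved_tops(peak, 0.40)         # 40% utilization -> 15.2
after = achieved_tops(peak, 0.40 * 1.30)   # +30% from software -> 19.76
print(before, after)
```

The same peak TOPS yields very different delivered performance depending on the software stack, which is why benchmarks always measure the hardware and framework together.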
 

Doug S

Platinum Member
Feb 8, 2020
2,742
4,664
136
I was fuzzy on the situation. All 5G uses Qualcomm tech. Samsung and MediaTek have to pay Qualcomm, and they do.

Apple was trying to get around at least some of Qualcomm's patents. There are a couple of Qualcomm patents that they went to court to have invalidated, but they lost. Some articles claim they would have had to pay Qualcomm regardless to have a working modem. Maybe they are trying to minimize the amount.

There are a ton of LTE and 5G patents owned by many companies, including some owned by Apple. They're all covered by FRAND, so they are generally not that expensive (though Qualcomm controversially charges based on the value of the device the chip goes into, which many believe is against the spirit of FRAND). Qualcomm has more cellular patents than almost anyone (I think Huawei may hold more 5G patents), so they are paid a lot more than they have to pay out. They do have to pay Apple for the LTE/5G patents Apple owns; it's just that Qualcomm owns a lot more, so Qualcomm is obviously the net winner there.

There's no point in trying to work around patents that are part of the 5G standard - they are FRAND because they are part of the standard that must be implemented. The patents Apple tries to work around / invalidate are ones that Qualcomm claims on smartphones themselves or otherwise incidental to cellular standards. That's where Qualcomm really gets you when you buy chips from them, because they force you to license those patents in order to get them to sell you their modems. When Apple is able to get their own modems going they'll free themselves from that. Because if Qualcomm tried to enforce their smartphone related patents against Apple absent the chip supply contract, Apple has a crapton of CPU related patents they would be able to enforce on Qualcomm. So basically that's not a battle Qualcomm would be willing to fight.
 

Mopetar

Diamond Member
Jan 31, 2011
8,104
6,730
136
They can only do what they can. The issue is patents. When Qualcomm started working on CDMA, most of the rest of the industry thought it was physically impossible, or at least not worth the effort. Qualcomm scored big time on that. It is almost impossible to have a modem targeting the US market that both avoids Qualcomm patents and is not inferior in capability and/or power consumption.

It doesn't matter, since Qualcomm (or anyone with patents that are part of the standard used) has to license them under FRAND (fair, reasonable, and non-discriminatory) terms. Apple also bought a bunch of patents years ago, so every company is paying Apple for any of those that are standard-essential, just as Apple has to license the standard-essential patents it needs.

Qualcomm may have other patents which aren't part of any standard, but Apple wouldn't need those in order to build a functional cellular modem. If it seems like it's hard for other companies to match what Qualcomm can do it's because that's Qualcomm's core business that they've been involved in for nearly four decades.
 

Doug S

Platinum Member
Feb 8, 2020
2,742
4,664
136
Qualcomm may have other patents which aren't part of any standard, but Apple wouldn't need those in order to build a functional cellular modem. If it seems like it's hard for other companies to match what Qualcomm can do it's because that's Qualcomm's core business that they've been involved in for nearly four decades.

Qualcomm being in phones all over the world since the beginning is their biggest advantage. The RF frontend, ADCs, demodulators, etc. are all fairly standard stuff (OK, RF is a bit of a black art, but there are many capable RF engineers out there). That is to say, the hardware part of the modem is comparatively easy.

The hard (really hard) part is the baseband software. That's where Qualcomm being in phones all over the world is a massive advantage. Like most standards, the LTE and 5G specs that detail interfacing client devices like smartphones with towers don't specify every little detail. A lot is "implementation defined", and in many cases that is defined as "how Qualcomm does it" because they've been there since the beginning. Then there is all the stuff related to handoffs between towers, roaming, and so forth, where there are even fewer defined standards.

If all Apple wanted to do was make a modem to handle the US, they could have been there years ago. There are few true carriers; most of the multiplicity you see, like Mint, are MVNOs of true carriers like AT&T or Verizon. But if you want a modem that handles the entire world, which is what they need to avoid having different SKUs like Samsung has had for ages despite having their own modem, you need to handle all those carriers all over the world, with all their idiosyncrasies for tower handoff, roaming, etc.

The good news is that when you screw up that latter stuff usually the worst you get is a dropped call. That's less of a problem now that everything is data so you aren't circuit switching - Apple can do a modem that does VoLTE/Vo5G only for calls, and handles LTE/5G only. A lot of the ugliest stuff is found in 2G/3G standards but Apple can simply ignore all that and not implement it. But it is still a really really big effort that's going to require testing all over the world. They can't do this stuff in a lab in Cupertino.
 

SpudLobby

Senior member
May 18, 2022
976
670
106
And there it is. Larger CPU caches than the A18. Could just be SLC though, but it might be L2. It is interesting they didn’t mention these power figures for the regular A18 vs the A17 Pro.
 

The Hardcard

Senior member
Oct 19, 2021
207
302
106
It doesn't matter since Qualcomm (or anyone with patents that are part of the standard used) has to license them under FRAND (fair reasonable and non-discriminatory) terms. Apple also bought a bunch a patents years ago so every company is paying them for the use of any that are part of any standard where they're necessary and Apple has to license those patents.

Qualcomm may have other patents which aren't part of any standard, but Apple wouldn't need those in order to build a functional cellular modem. If it seems like it's hard for other companies to match what Qualcomm can do it's because that's Qualcomm's core business that they've been involved in for nearly four decades.
I don’t know and can’t speak factually, but I got the impression that the issue wasn’t meeting the standards, but how to operate a modem efficiently. There were claims that at one point Apple had an oversized modem that was half the size of the iPhone logic board and required far more power than practical to run. Nothing about not handling the standards.

There is also the issue of Apple trying to legally force relief from two Qualcomm patents that are in effect until 2029. Without knowing for sure, it would seem these are outside of FRAND yet still affecting their modem aspirations. So probably not standards, but performance and efficiency.

Though according to some industry analysts the “delay” is due mostly to the fact that it is taking the natural amount of time to produce the modem, but Apple’s early targets were unrealistic.