
Question: New Apple SoC - M1 - For lower-end Macs - Geekbench 5 single-core >1700


UNCjigga

Lifer
Dec 12, 2000
22,797
5,138
136
Well, we already knew they'd have to have something better for higher-end Macs, so it is hardly a surprise. But this doesn't guarantee a one-year cadence. They could have a beefier "M" SoC for midrange Macs in 2021 but not update the M1 until 2022, and not update the beefier one coming out this year until 2023. Ditto for the high-end one going into the Mac Pro / iMac Pro in 2022 (whether it is discrete or chiplet based).

They might be able to offer some minor bumps in performance; e.g., an M1 fabbed this fall could be made on N5P and get a few hundred MHz more, but be otherwise the same.

Not saying they will do this, or that they will keep revving the iPad 'X' chip every other year once everything the M1 currently goes into is adding to the wafer total for that design. Just that we can't assume it will be yearly simply because they are taking the very obvious step of making a different chip for the midrange Macs coming later this year.
Don't forget there's an M-derived SoC going into their VR/AR headset (I'm assuming that'll need a beefier GPU vs. AnX)--I think we'll see that in 2021/2022 but might be generation 1 or 1.5 vs. true second-generation M-class chip.
 

Doug S

Senior member
Feb 8, 2020
504
702
96
Don't forget there's an M-derived SoC going into their VR/AR headset (I'm assuming that'll need a beefier GPU vs. AnX)--I think we'll see that in 2021/2022 but might be generation 1 or 1.5 vs. true second-generation M-class chip.
You're assuming it will run standalone, rather than use an iPhone to do the heavy lifting. I think it will be similar to the Apple Watch, and the early versions will only work with an iPhone. As with the Watch, they can eventually free it of its tether and operate standalone but that will be several years / product generations away.

Running a couple 8K displays isn't particularly taxing, if someone else is doing all the expensive rendering type stuff so it only has to shift bits around in a framebuffer.
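For scale, here's a rough back-of-envelope on what "shifting bits around in a framebuffer" amounts to (the 32-bit color and 60 Hz figures are my assumptions, not anything Apple has stated):

```python
# Raw scan-out bandwidth for driving two 8K displays.
# Assumed: 32-bit color, 60 Hz refresh.
width, height = 7680, 4320          # 8K UHD resolution
bytes_per_pixel = 4                 # 32-bit color
refresh_hz = 60
displays = 2

bytes_per_frame = width * height * bytes_per_pixel
total_gbps = displays * bytes_per_frame * refresh_hz * 8 / 1e9
print(f"raw scan-out: ~{total_gbps:.0f} Gbit/s")  # ~127 Gbit/s
```

That's a lot of bits, but it's the kind of fixed-function work a display controller does, not GPU compute.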
 

lobz

Golden Member
Feb 10, 2017
1,565
1,981
136
When this was announced (and I expected Apple Silicon to perform about as well as it ended up doing) I thought that if Apple was able to double their market share to ~15% in five years that would exceed their wildest projections. Getting comfortably into double digits is probably their goal for the next three.

They are "comfortably in the double digits" in the mobile market (a bit higher in the smartphone market), and that's about as good as they can ever do given that they sell premium devices. The market for premium PCs is larger as a percentage (the majority of the world's population who own very low-end smartphones or feature phones do not, and never will, own PCs), but Windows has an entrenched advantage they didn't face when they entered the mobile market.
Windows' entrenched advantage will remain intact as long as Apple continues this smug, arrogant attitude towards its own customers. I know Microsoft does that too, but when people get fed up with it, they usually move towards something better, not something even more condescending. The company's 'you're holding it wrong' attitude has not changed a bit in the past decade.
 
Mar 11, 2004
20,846
3,009
126
You're assuming it will run standalone, rather than use an iPhone to do the heavy lifting. I think it will be similar to the Apple Watch, and the early versions will only work with an iPhone. As with the Watch, they can eventually free it of its tether and operate standalone but that will be several years / product generations away.

Running a couple 8K displays isn't particularly taxing, if someone else is doing all the expensive rendering type stuff so it only has to shift bits around in a framebuffer.
Do you really think an iPhone would be capable of that type of processing, or that pushing dual 8K over wireless would be feasible like that? If they're gonna tether it, it'll probably be to something beefier than an iPhone, and it'll require an entirely new wireless protocol or a cable (imagine the stand they could come up with for that!). If that were the case, I'd guess it might actually be for the updated Mac Pro (kinda like their display was shown off alongside the new cheese grater), and I think this rumored high-end headset would target a somewhat similar market: visual artists, or pros working on AR/VR live events like sports and concerts. I'm thinking it'll be targeted more at corporations; think car companies setting up small kiosks where you could put on the headset and walk around high-quality digital versions of their cars, and similar.

I think there are rumors of an update to the current Mac Pro, but also of a new Mx-based Mac Pro that will be quite compact (curious if they'd bring the trash can back or if it'll be something else). Maybe they'd come up with a backpack frame you could mount it to, along with some batteries, for use with the headset.
 

Mopetar

Diamond Member
Jan 31, 2011
5,400
1,982
136
Maybe it seems impossible right now, but in 10 years it might be ordinary. Take something like the current Apple watch. If you go back far enough in time, it effectively becomes more powerful than any super computer at that time. The modern iPhone seems to be around the level of where a high-end desktop was about a decade ago.
 

guidryp

Golden Member
Apr 3, 2006
1,065
1,135
136
Maybe it seems impossible right now, but in 10 years it might be ordinary. Take something like the current Apple watch. If you go back far enough in time, it effectively becomes more powerful than any super computer at that time. The modern iPhone seems to be around the level of where a high-end desktop was about a decade ago.
IIRC, when I went to university, our campus was boasting about its supercomputer, with 16 MIPS and 128 MB of RAM.

Mind you I just missed having classes with punch cards. :D
 

Doug S

Senior member
Feb 8, 2020
504
702
96
Do you really think an iphone would be capable of that type of processing, or that you think pushing dual 8K over wireless would be feasible like that?
It is already capable of that level of processing; what makes you think an A14 is not up to the task when there are multiple examples of "similar" products that are less powerful?

Solutions for wireless HDMI have existed for a long time; dual 8K would require more power (a wider frequency range), but it's theoretically doable. However, I don't think that's necessary: there are a lot of steps in a GPU pipeline from start to HDMI output. Apple fully controls the GPU hardware and the OS/driver software, so they can design it so that part of the rendering pipeline lives on the phone (doing the hard stuff) and part lives in the wearable, making the data traffic orders of magnitude less than a dual 8K HDMI bit rate.

The iPhone's existing UWB implementation should be able to handle that (a 500 MHz band at 1024-QAM is ~5 Gbps). I mean, the wearable would be caching textures, so it just needs primitives telling it what to put on the screen and where, not data addressing every single pixel in 32-bit color.
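The ~5 Gbps figure falls out of treating the symbol rate as roughly equal to the channel bandwidth, which is a simplification that ignores coding and protocol overhead:

```python
import math

# Back-of-envelope behind "500 MHz at 1024-QAM is ~5 Gbps".
# Assumes ~1 symbol per hertz of bandwidth; a real link loses
# some of this to error-correction coding and protocol overhead.
channel_bw_hz = 500e6                 # one UWB channel
bits_per_symbol = math.log2(1024)     # 1024-QAM -> 10 bits/symbol
raw_gbps = channel_bw_hz * bits_per_symbol / 1e9
print(f"~{raw_gbps:.0f} Gbit/s raw")  # ~5 Gbit/s
```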
 


coercitiv

Diamond Member
Jan 24, 2014
4,162
5,054
136
The raw numbers do seem unusually high, but if it only eats up ~.5% of rated lifetime per month, it will still last until ~2037 assuming that the issue remains unfixed.
So far, people documenting it more seriously have found a few nasty cases, some of which have already gone past 100 TBW. The worst known case is 10% in 2 months. (see thread here)

Unfortunately, information is limited; people can't even trust the reporting tools.
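For anyone checking, the arithmetic behind both figures (the ~2037 estimate assumes wear starting around 2021):

```python
# Months until rated endurance is exhausted at a constant burn rate.
def months_to_exhaust(pct_per_month):
    return 100 / pct_per_month

typical_months = months_to_exhaust(0.5)   # ~0.5% of lifetime per month
worst_months = months_to_exhaust(10 / 2)  # 10% in 2 months -> 5%/month
print(typical_months / 12, worst_months)  # ~16.7 years vs ~20 months
```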
 

jeanlain

Member
Oct 26, 2020
82
50
51
It is already capable of that level of processing, what makes you think an A14 is not up to the task when there are multiple examples of "similar" products that are less powerful?
What phone is capable of dual 8K rendering? :oops: The A14 struggles to render any demanding game even below native resolution. Also, consider that you need at least 90 fps for VR.
Solutions for wireless HDMI have existed for a long time - dual 8K would require more power (wider frequency range) but its theoretically doable. However, I don't think that's necessary - there are a lot of steps in a GPU from the start of the pipeline to the HDMI output. Apple fully controls the GPU hardware and the OS/driver software, so they can design it to have part of the rendering pipeline living on the phone (doing the hard stuff) and part living in the wearable so the data traffic would be orders of magnitude less than dual 8K HDMI bit rate.

The iPhone's existing UWB implementation should be able to handle that (500 MHz band at 1024 QAM is ~ 5 Gbps) I mean, the wearable would be caching textures so it just needs primitives to tell it what to put down on the screen where, not to address every single pixel in 32 bit color.
That may work if that's all you want to achieve. But in the real world, texture streaming over a connection that is orders of magnitude slower than PCIe might be a problem. And shading is done on a per-pixel basis.
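The per-pixel point in numbers, taking the dual-8K, 90 fps figures from upthread:

```python
# Pixels needing shading per second for dual 8K at 90 fps.
width, height = 7680, 4320   # 8K UHD per eye/display
fps = 90                     # minimum refresh cited for VR
displays = 2

gpixels_per_sec = width * height * fps * displays / 1e9
print(f"~{gpixels_per_sec:.1f} Gpixel/s shaded")  # ~6.0 Gpixel/s
```

Even one shader operation per pixel at that rate is a heavy load for a phone-class GPU, and real scenes shade each pixel many times.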
 

Doug S

Senior member
Feb 8, 2020
504
702
96
What phone is capable of dual 8K rendering? :oops: The A14 struggles to render any demanding game even below native resolution. Also, consider that you need at least 90 fps for VR.

That may work if that's all you want to achieve. But in the real world, texture streaming over a connection that is orders of magnitude slower than PCIe might be a problem. And shading is done on a per-pixel basis.
Who says Apple is doing full VR? I think that's very unlikely; they will instead do some type of AR that overlays the 'real world' you see through the glasses. That doesn't require rendering two full 8K scenes, just some elements of the overlay.

You need the resolution (so it doesn't look blocky amid the real world) and you need a high frame rate (so it doesn't look jumpy when you move your eyes/head), but creating a full scene is orders of magnitude more computationally intensive than doing overlays. Anyway, just because the iPhone doesn't output 8K doesn't mean its GPU couldn't handle rendering one or two 8K scenes; it just couldn't come close to that resolution AND maintain the kind of scene complexity a modern video game wants. If you wanted something simple like Angry Birds in 8K, I'm sure the GPU could do it with some minor changes (a bigger frame buffer and HDMI 2.1 to actually output in 8K).

Apple designs stuff for the mass market, not the niche geek crowd that wants fully immersive HFR 8K VR and wouldn't care if they had to strap on a five-pound headset to get it. Not that I'm sure there's a mass market for AR, but I know there isn't one for VR, at least not with anything remotely like today's technology.
 
