
Question New Apple SoC - M1 - For lower end Macs - Geekbench 5 single-core >1700


Eug

Lifer
Mar 11, 2000
23,002
521
126
$699, $750, $799. One of these. I just said $700 because it's catchy. Anything from $700-$800 is what I consider an attractive price to most would-be Windows laptop buyers.
I will go on record again to say I expect we may eventually get an $899 MacBook Air, and that we won't get a $699 MacBook Air in the next few years.

I also think $799 is a remote possibility, but that's a whopping 14% more than your original $700 statement. That's quite the revised goalpost if now, all of a sudden, you're saying you actually meant $800.

Don't need to. It's a $100 HomePod.
My point was I don't know if people were guessing $100 HomePods or not, because I don't follow them.

But people guessing $399 SEs in 2020? Check.
People guessing $329 A12 in 2020? Check.
 

Mopetar

Diamond Member
Jan 31, 2011
5,049
1,554
136
I'm always cautious when benchmarks involve exporting to H.264 and H.265, as the results heavily depend on the dedicated encoding hardware, which the Mac Pro may not have (at least, it doesn't have Quick Sync). Sure, in the end the M1 can be faster than the Mac Pro, but this doesn't necessarily reflect the performance of the M1 CPU cores.
I use my current MBP for a lot of video work, so honestly encoding performance is really important for what I use it for, and the casual web browsing and other stuff it handles isn't nearly as taxing. I could see myself upgrading once they release one with a larger screen. It'll probably cost a lot less than the old one as well, which makes it an easier pill to swallow.
 

senttoschool

Golden Member
Jan 30, 2010
1,501
195
106
I will go on record to say I expect an $899 MacBook Air eventually, and we won't get a $699 MacBook Air in the next few years.

I think $799 is a remote possibility, but that's a whopping 14% more than your original $700 statement. That's quite the revised goalpost there.
Nah. An $899 MBA doesn't do it. It barely makes a difference. No one will bat an eye. It's like if Apple priced the iPhone SE $100 cheaper than the iPhone 12 Mini ($700). No one will care about the SE if it's priced at $600.

Now a $700 or $750 MBA? That's something.
 

Denly

Golden Member
May 14, 2011
1,143
108
106
I think the key part of their success is that they don't actually own any factories. Manufacturing specialized goods such as OLED panels, semiconductors, batteries, etc. requires extreme capital investment and depends highly on volume to be profitable. It's really much cheaper to let third parties do the R&D themselves and set up the production lines, and to always keep at least two suppliers so you can negotiate the best price. While designing a chip is expensive, it comes nowhere close to the cost of having a dedicated foundry; they'd be fools to try that. Even if it ever comes to a point where TSMC is a monopoly and charges absurd prices for its nodes, it would still be cheaper to fund competitors than to develop a foundry by itself. Kind of like they did with LG Display, but ideally with more success than that.
I meant that no tech company in the last 20 years - except maybe Samsung, though I'm not sure that counts since Samsung does a lot more than just tech - has managed to do software, hardware design, and silicon under one roof without fumbling it. I can think of companies that did two of those things well, but not all three, and I'm just using general terms like HW/SW.
 

nxre

Junior Member
Nov 19, 2020
15
16
36
For the people who wished for a cheaper Mac from Apple: they just tore your dreams apart in this new interview
(And FYI: we also asked if Apple plans to introduce cheaper Macs, on the assumption that using its own silicon is more economical. "We don't do cheap—you know that," Joswiak admitted. "Cheap is for other people, because we try to build a better product.")
 

Eug

Lifer
Mar 11, 2000
23,002
521
126
Nah. An $899 MBA doesn't do it. It barely makes a difference. No one will bat an eye. It's like if Apple priced the iPhone SE $100 cheaper than the iPhone 12 Mini ($700). No one will care about the SE if it's priced at $600.

Now a $700 or $750 MBA? That's something.
You really should make a prediction and stick to it.

You said $700. I said not a chance.

Then you revised your statement to say $800 is what you actually meant but you said $700 because $700 is more catchy.

Geez man, seriously?

If you have to revise your statement then just revise your statement instead of making up some sort of lame excuse thinking it's gonna give you a free credibility pass. Cuz it doesn't.
 

senttoschool

Golden Member
Jan 30, 2010
1,501
195
106
You really should make one prediction and stick to it.

You said $700. I said not a chance.

Then you revised your statement to say $800 is what you actually meant but you said $700 because $700 is more catchy.

Geez man, seriously?
$700. $750. $800. $700 is what I think makes the most sense if I'm running Apple. $750 is probably what they will do. $800 is borderline meh.

For the people who wished for a cheaper Mac from Apple: they just tore your dreams apart in this new interview

Apple will never make something that is "cheap", which is what he was saying. They're never going to go to the low end; they will always be above it.
 

Eug

Lifer
Mar 11, 2000
23,002
521
126
Weird. I just saw on MacRumors that Apple has logic boards listed for M1 Mac minis with 10 GigE, for repair shops.


I wonder if these will eventually prove to be unavailable, given that you can't actually order M1 Mac minis with 10 GigE.

Anyhow, I had thought this was going to be a differentiating feature. They were going to have the M1 Mac mini with Gigabit Ethernet, and then later sell a higher end model with the 10 GigE option and more ports.
 

guidryp

Senior member
Apr 3, 2006
590
465
136
Here's the thing. Hardware accelerated encoding is very nice for real time or performance sensitive encoding. If I were going to stream a game on YouTube, I would certainly choose NVENC over anything else. But for movies, I would choose a software solution because if you want maximum quality and flexibility, offline encoding is the best.

It's similar to why offline rendering is still usually done on CPUs, even to this day.



They said this in that article:
One thing to keep in mind is that our testing was done with "VBR, 1 pass" since hardware encoding currently doesn't support 2 pass encoding. If you were to use 2 pass encoding with software encoding, the quality difference would be a bit more pronounced (although it would also take significantly longer).
Comparing 2-pass to CBR is a nonsense comparison. These days the only reason to use CBR is for streaming, and obviously you can't use 2-pass for streaming. It sounds like he doesn't really understand encoding.

For offline encoding you do VBR. VBR is superior to CBR for what should be obvious reasons. There are two ways to do VBR:

a) Constant quality. The most used VBR method, usually indicated as CR/CRF/CQ. This sets a quality target and varies the data rate to achieve that quality as it goes. The HW encoders typically support constant quality.
b) 2-pass is also VBR; it just runs the whole file through once to determine where to spend a fixed bit budget. This was a big thing years ago when people insisted on cramming DVD rips onto CD-ROMs, but it has little use beyond hitting a fixed size for optical discs.

If you do a constant-quality run first, then a 2-pass encode targeting the same file size, the quality will be the same. So constant quality achieves the same quality as 2-pass but does it faster, which is why 2-pass is almost never used anymore by people who understand what is going on.
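The fixed bit budget that a 2-pass encode distributes is simple arithmetic: target size, minus audio and container overhead, divided by duration. A minimal sketch of that calculation (the 2% overhead figure and the 700 MB CD-R example are illustrative assumptions, not measured values):

```python
def two_pass_video_bitrate(target_mb, duration_s, audio_kbps=128, overhead=0.02):
    """Video bitrate (kbit/s) that makes an encode hit a target file size.

    This is the budget a 2-pass encoder spreads across the file:
    total bits, minus audio and container overhead, divided by duration.
    """
    total_kbit = target_mb * 8 * 1000 * (1 - overhead)  # MB -> kbit, minus muxing overhead
    video_kbit = total_kbit - audio_kbps * duration_s   # subtract the audio track
    return video_kbit / duration_s

# The classic use case: fitting a 2-hour movie onto a 700 MB CD-R.
print(round(two_pass_video_bitrate(700, 2 * 3600)))  # → 634 (kbit/s)
```

That budget is the only thing the first pass buys you: knowledge of where to spend it. A constant-quality encode reaches the same quality without knowing the final size in advance, which is the point being made above.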

On the streaming side, I have seen a lot of comparisons since Turing's upgrade to NVENC. The consensus for streaming is: use NVENC if you have Nvidia, x264 if you don't. HW encoders are not created equal, but right now NVENC is so close to x264 for streaming that if you have an Nvidia card, it's the streaming method of choice. Intel is next in line, but considered inferior to NVENC/x264. AMD is considered dead last for HW H.264 streaming.

Unfortunately, you don't see much coverage of the offline quality of HW encoders, and little about Apple's HW encoding.

Hopefully as M1 Macs roll out we will see some more detailed analysis.
 
  • Like
Reactions: Carfax83

gdansk

Senior member
Feb 8, 2011
523
211
116
Honestly the battery life is really impressive. If I didn't buy a MBP16 last year I'd probably pick up an Air.
Anyone want to buy a heavily discounted Intel flamethrower? :p
 

name99

Senior member
Sep 11, 2010
385
290
136
For the people who wished for a cheaper Mac from Apple: they just tore your dreams apart in this new interview

Maybe you are not a native English speaker? Or even from a different English-speaking community within the US?
Cheap is a somewhat ambiguous English word. People think it means "low price" but it really means something like "low quality"; since low price and low quality usually go together, you can see how the ambiguity arises...

Apple don't have a problem with low-priced items. But they aren't willing to sacrifice quality (make it "cheap") to hit a lower price with things like a lesser battery or a nasty screen, the way $400 laptops hit their price points.
 

Doug S

Senior member
Feb 8, 2020
404
579
96
Weird. I just saw on MacRumors that Apple has logic boards listed for M1 Mac minis with 10 GigE, for repair shops.


I wonder if these will eventually prove to be unavailable, given that you can't actually order M1 Mac minis with 10 GigE.

Anyhow, I had thought this was going to be a differentiating feature. They were going to have the M1 Mac mini with Gigabit Ethernet, and then later sell a higher end model with the 10 GigE option and more ports.
The M1 Mac Mini is the "low end" Mini. They still have to sell the high end Mini, which will presumably come next year with the 8+4 chip. Maybe they plan to offer 10GbE as an option for that? I mean, connect a disk array to it via TB and give it 10 GbE and you've got yourself a departmental NAS.
 

Eug

Lifer
Mar 11, 2000
23,002
521
126
The M1 Mac Mini is the "low end" Mini. They still have to sell the high end Mini, which will presumably come next year with the 8+4 chip. Maybe they plan to offer 10GbE as an option for that? I mean, connect a disk array to it via TB and give it 10 GbE and you've got yourself a departmental NAS.
Yes, the 8+4 Mac mini next year will likely get the 10 GigE option, but the 10 GigE mobos in question are specifically 8-core (aka 4+4) models, and are available for order right now.

[Attached image: m1-mac-mini-10gb-ethernet-parts-list.jpg]


---

Chrome native for M1 is much, much faster than the Intel version.

 

amrnuke

Senior member
Apr 24, 2019
999
1,506
96
TDP ≠ Power Draw, we've gone over this discussion numerous times.
Where in my post did I say that TDP = Power Draw?
My point was that it is absurd to compare a peak performance desktop core to what's been optimized for laptops.
 

Eug

Lifer
Mar 11, 2000
23,002
521
126
Judging by the various reviews out there, M1 is not really built to handle 8K video footage or some other types of >4K footage. It works, but the jerkiness in the timeline plaguing other Macs makes an appearance quite often. According to SoC monitoring software, it often maxes out the "GPU" although I'm not entirely sure what that means. Don't ask me what GPU monitor applet it is cuz I don't know, but I do note that Activity Monitor includes a GPU usage monitor as well. In contrast, the CPU cores aren't breaking a sweat.

n00b question:

Would it really be just the GPU, or does that include that image signal processor outside the GPU and CPU? If it truly is GPU, then increasing the number of GPU cores would likely solve the problem. However, if it's the dedicated ISP outside the GPU, then they'd have to augment that somehow in order to deal with this issue.

IOW, should M1X be more CPU cores + more GPU cores with the same existing ISP, or should the ISP also get a boost?
 

moinmoin

Platinum Member
Jun 1, 2017
2,074
2,485
106
AMD CPUs are in a fair number of laptops, but IMO most of them are junk. They seem to lean towards the market with bad battery life and annoying international keyboards, at least around here. All the nicer WIndows laptops with good battery life seem to be Intel.

Maybe that will change with the new cores, but in the past AMD CPU in a laptop = red flag. I wonder how long it will take for AMD to shake that reputation.
While you are spot on with how laptop OEMs treated AMD chips in the past, this actually took a 180° turn this past spring with Ryzen Mobile 4000/Renoir, which reviewed unexpectedly well and has since had way more demand from OEMs and consumers than available supply. Everything you wrote is already in the past.

You got videos like this one at the launch (first 1:20min are sufficient for a first impression):

As I wrote before, it's a good thing for competition that Apple joins and surpasses the laptop competition with the M1, since otherwise AMD could now sit back on the progress it has been making.

(The real story in there is how much Intel has been dropping balls in the last couple years.)
 
Last edited:
  • Like
Reactions: teejee and Tlh97

name99

Senior member
Sep 11, 2010
385
290
136
Judging by the various reviews out there, M1 is not really built to handle 8K video footage or some other types of >4K footage. It works, but the jerkiness in the timeline plaguing other Macs makes an appearance quite often. According to SoC monitoring software, it often maxes out the "GPU" although I'm not entirely sure what that means. Don't ask me what GPU monitor applet it is cuz I don't know, but I do note that Activity Monitor includes a GPU usage monitor as well. In contrast, the CPU cores aren't breaking a sweat.

n00b question:

Would it really be just the GPU, or does that include that image signal processor outside the GPU and CPU? If it truly is GPU, then increasing the number of GPU cores would likely solve the problem. However, if it's the dedicated ISP outside the GPU, then they'd have to augment that somehow in order to deal with this issue.

IOW, should M1X be more CPU cores + more GPU cores with the same existing ISP, or should the ISP also get a boost?
I think we're going to see, over time, some fascinating changes here.
NPU, GPU, ISP, even the media encoder all, from the thousand-foot level, perform the same sort of thing --- many, many, MANY small engines all operating in parallel on (hopefully) independent pieces of data.

How valuable will it remain to keep these separated (and, of course, slightly more optimized for each particular task) rather than creating a single super-throughput engine that has the specialist capabilities of an NPU (small integer multiplication mainly?), of a GPU (texture lookup and things like that), of a media encoder (I assume mainly the ability to compare slightly shifted pixel blocks against each other to find best matches), of an ISP (no idea what those are!) available to each unit; but all based on a common framework of registers, instruction scheduling, synchronization, cache, and so on?

There's an obvious win to this (the system can dynamically expand performance to whatever is the current task -- your GPU is 2x as fast if it can recruit all the computation latent in the NPU, ISP, and H.265 encoder!). And it's designing ONE thing rather than many, and writing more common code.
And there's a cost. More generality means more power usage, and spreading usage this way may require slightly more area.

Is it an overall win? My guess is it could actually be, once someone has had time to figure out the optimal set of common primitives. But will it happen soon (soon means, say, before 2025)? Not a clue! I have no idea what the degree of commonality between these blocks is once you start drilling into the details.
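The media encoder's "compare slightly shifted pixel blocks" primitive mentioned above is, concretely, a motion search: score each candidate shift by sum of absolute differences (SAD) and keep the best. A toy sketch in plain Python (the frame data and search radius are made up for illustration; real encoders run this in hardware over enormous block counts):

```python
def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized pixel blocks."""
    return sum(abs(a - b) for row_a, row_b in zip(block_a, block_b)
                          for a, b in zip(row_a, row_b))

def best_match(ref, block, top, left, radius=1):
    """Search a small window of shifts in `ref` for the one that best matches `block`.

    Returns (cost, dy, dx): the SAD and the winning shift relative to (top, left).
    """
    size = len(block)
    best = None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            candidate = [row[x:x + size] for row in ref[y:y + size]]
            cost = sad(block, candidate)
            if best is None or cost < best[0]:
                best = (cost, dy, dx)
    return best

# A 4x4 previous frame whose single bright pixel moved one step right
# between frames; the 2x2 block is taken from the current frame at (1, 1).
prev = [[0, 0, 0, 0],
        [0, 9, 0, 0],
        [0, 0, 0, 0],
        [0, 0, 0, 0]]
curr_block = [[0, 9],
              [0, 0]]
print(best_match(prev, curr_block, 1, 1))  # → (0, 0, -1): a perfect match one column to the left
```

The structural point: this inner loop (shift, compare, reduce) looks a lot like an NPU's multiply-accumulate or a GPU shader over independent pixels, which is what makes the unified throughput engine speculated about above conceivable at all.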
 

Eug

Lifer
Mar 11, 2000
23,002
521
126
While you are spot on with how laptop OEMs treated AMD chips in the past, this actually took a 180° turn this past spring with Ryzen Mobile 4000/Renoir, which reviewed unexpectedly well and has since had way more demand from OEMs and consumers than available supply. Everything you wrote is already in the past.

You got videos like this one at the launch (first 1:20min are sufficient for a first impression):

As I wrote before, it's a good thing for competition that Apple joins and surpasses the laptop competition with the M1, since otherwise AMD could now sit back on the progress it has been making.

(The real story in there is how much Intel has been dropping balls in the last couple years.)
First off, the guy throwing the laptop for Linus to catch actually made me physically cringe. They put fear into me that he would drop it. :oops:

Good to know about the recent AMD laptops, but my bias may be showing here, because all my colleagues are in the business crowd and are heavy travellers (Covid notwithstanding), so I've been steering them toward ultrabooks, which they always love, so more in the 15-watt TDP tier. How well has AMD been doing in that market? I know the machines exist, but I haven't seen a ton of them locally. Then again, since the beginning of Covid I haven't been doing a lot of in-person laptop shopping.
 

moinmoin

Platinum Member
Jun 1, 2017
2,074
2,485
106
First off, the guy throwing the laptop for Linus to catch actually made me physically cringe. They put fear into me that he would drop it. :oops:

Good to know about the recent AMD laptops, but my bias may be showing here, because all my colleagues are in the business crowd and are heavy travellers (Covid notwithstanding), so I've been steering them toward ultrabooks, which they always love, so more in the 15-watt TDP tier. How well has AMD been doing in that market? I know the machines exist, but I haven't seen a ton of them locally. Then again, since the beginning of Covid I haven't been doing a lot of in-person laptop shopping.
Renoir is eating Intel for breakfast even in the ultrabook segment.


Availability is a major issue since apparently nobody (neither AMD nor the OEMs) expected the demand it'd cause.
 

Eug

Lifer
Mar 11, 2000
23,002
521
126
Do note that AMD reported the "highest ever quarterly mobile processor unit shipments and revenue" in its last financial results. And according to Mercury Research AMD reached an all time high in mobile (i.e. laptop) market share of 19.9% earlier this year.
Wouldn't that suggest the laptops are ramping up more this quarter, as opposed to last quarter?

EDIT:

Yes it does:

AMD's mobile processor sales doubled YoY, and another 30+ models will join the 50+ notebooks on the market by the end of the year. That means AMD has a solid runway for more growth.
 

moinmoin

Platinum Member
Jun 1, 2017
2,074
2,485
106
Wouldn't that suggest the laptops are ramping up more this quarter, as opposed to last quarter?

EDIT:

Yes it does:

AMD's mobile processor sales doubled YoY, and another 30+ models will join the 50+ notebooks on the market by the end of the year. That means AMD has a solid runway for more growth.
It would have grown even faster earlier if it weren't for shortages. It was a common sight that OEMs announced laptop models which they were then unable to ship, with shipment dates delayed further and further out as they didn't get the necessary processors. There were even some cases of models getting canceled altogether.

Anyway, all this is pretty off topic in this thread. The point is that AMD in laptops is in a huge upswing right now. Apple won't have any trouble selling its M1 Macs either. The big loser in both cases is Intel.
 
