macOS Mojave due in Fall 2018

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
https://www.apple.com/macos/mojave-preview/

* Developer preview now available
* New dark mode
* Apple is allowing iOS apps to be ported to macOS (through their respective developers)
* iOS & macOS will NOT be merging
* No mention of ARM

Dark Mode looks awesome!
 

Eug

Lifer
Mar 11, 2000
I was hoping for 4K HDR video streaming but nothing was mentioned for that. :(
 

Kaido

Elite Member & Kitchen Overlord
Feb 14, 2004
I was hoping for 4K HDR video streaming but nothing was mentioned for that. :(

Although interestingly enough, they're adding Dolby Atmos support to the Apple TV...
 

Commodus

Diamond Member
Oct 9, 2004
I was hoping for 4K HDR video streaming but nothing was mentioned for that. :(

That would require a compatible screen, and I suspect Apple won't talk about that kind of support until it's making Macs with HDR-ready displays.
 

Eug

Lifer
Mar 11, 2000
That would require a compatible screen, and I suspect Apple won't talk about that kind of support until it's making Macs with HDR-ready displays.
Nah. The iMacs and MacBook Pros already have wide colour gamut 8-bit+FRC screens (which some advertise as “10-bit”). That’s about as close as we are going to get to true 10-bit screens in mainstream Macs in the foreseeable future. Furthermore, Apple already advertises these as supporting a billion colours.
 

Commodus

Diamond Member
Oct 9, 2004
Nah. The iMacs and MacBook Pros already have wide colour gamut 8-bit+FRC screens (which some advertise as “10-bit”). That’s about as close as we are going to get to true 10-bit screens in mainstream Macs in the foreseeable future. Furthermore, Apple already advertises these as supporting a billion colours.

HDR actually isn't so much about color gamut as it is peak brightness, since you need that to offer the full range of imagery. Right now, the iMac tops out at 500 nits of brightness. It wouldn't be completely incapable of HDR, but it'd be pretty weak (the lowest DisplayHDR spec is 400). You'd really want 600 or 1,000 nits of brightness to do HDR properly, especially on machines that are frequently used by pro creators.
 

Eug

Lifer
Mar 11, 2000
HDR actually isn't so much about color gamut as it is peak brightness, since you need that to offer the full range of imagery. Right now, the iMac tops out at 500 nits of brightness. It wouldn't be completely incapable of HDR, but it'd be pretty weak (the lowest DisplayHDR spec is 400). You'd really want 600 or 1,000 nits of brightness to do HDR properly, especially on machines that are frequently used by pro creators.
HDR is about both, but overall it doesn't really matter that much when we're talking about mainstream machines. The OS already supports 10-bit 4K HEVC decode in hardware, and Apple is already pushing 10-bit 4K in its existing machines. Whether or not they call it HDR is beside the point. So what if it's not true HDR? 4K 10-bit support is already a big step forward.

IOW, the issue here is not about whether it's true 1,000-nit brightness. It's about Apple's willingness to implement 4K DRM in the OS. IMO, they should, since the hardware already supports it, and they already have a big library of 4K HDR content. Now they just need the software support, and that support already exists on the Windows side, again, without any need to specify true HDR brightness, etc. Furthermore, nowhere in the Apple TV documentation does it say you must use it with a 1,000-nit TV.

To put it another way, if you think we must have 1000 nit iMacs in order to get 4K streaming support, you are basically saying macOS will not get 4K streaming support in the next 5 years.
 

Triloby

Senior member
Mar 18, 2016
Looks like support for both OpenGL and OpenCL is deprecated in macOS starting with version 10.14 (bottom of the page):

https://developer.apple.com/macos/whats-new/

This could cause problems for those who use Adobe apps on macOS. This is also going to cause issues with future games being released on macOS. Then again, the Mac was never a serious gaming platform to begin with.
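Worth noting: deprecated isn't the same as removed. A rough sketch of what this looks like in practice on a Mac with the command line tools installed ("main.c" is just a placeholder for your own source file):

```shell
# OpenGL code still builds and runs on 10.14, but building against the 10.14 SDK
# now emits a deprecation warning for every GL call. Defining GL_SILENCE_DEPRECATION
# (and CL_SILENCE_DEPRECATION for OpenCL) suppresses those warnings.
clang -DGL_SILENCE_DEPRECATION -framework OpenGL -o demo main.c
```

So existing apps keep working for now; the deprecation is Apple signalling that the APIs can be dropped in a future release.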
 

Commodus

Diamond Member
Oct 9, 2004
HDR is about both, but overall it doesn't really matter that much when we're talking about mainstream machines. The OS already supports 10-bit 4K HEVC decode in hardware, and Apple is already pushing 10-bit 4K in its existing machines. Whether or not they call it HDR is beside the point. So what if it's not true HDR? 4K 10-bit support is already a big step forward.

IOW, the issue here is not about whether it's true 1,000-nit brightness. It's about Apple's willingness to implement 4K DRM in the OS. IMO, they should, since the hardware already supports it, and they already have a big library of 4K HDR content. Now they just need the software support, and that support already exists on the Windows side, again, without any need to specify true HDR brightness, etc. Furthermore, nowhere in the Apple TV documentation does it say you must use it with a 1,000-nit TV.

To put it another way, if you think we must have 1000 nit iMacs in order to get 4K streaming support, you are basically saying macOS will not get 4K streaming support in the next 5 years.

I don't think we need 1,000-nit iMacs, but we'd ideally have iMacs (and possibly MacBook Pros) that are more representative of what you'll actually get when you're watching HDR on a TV. You need to know if that highlight might be too blinding or if that shadowed area might be underexposed.

And I didn't say we needed this for 4K streaming support, just that an iMac as you know it today isn't really an HDR machine. Apple could add HDR support to Mojave today, but it wouldn't matter as you would have trouble noticing the difference.
 

Eug

Lifer
Mar 11, 2000
I don't think we need 1,000-nit iMacs, but we'd ideally have iMacs (and possibly MacBook Pros) that are more representative of what you'll actually get when you're watching HDR on a TV. You need to know if that highlight might be too blinding or if that shadowed area might be underexposed.

And I didn't say we needed this for 4K streaming support, just that an iMac as you know it today isn't really an HDR machine. Apple could add HDR support to Mojave today, but it wouldn't matter as you would have trouble noticing the difference.
It wouldn't be that hard to notice the difference, due to the wide colour gamut support. As mentioned, HDR isn't just about brightness.
 

sweenish

Diamond Member
May 21, 2013
BUT THEY GO HAND IN HAND. One without the other doesn't get the full effect. Without the full effect, you can't know that your edits and adjustments are actually correct. This is not complicated.
 

Eug

Lifer
Mar 11, 2000
BUT THEY GO HAND IN HAND. One without the other doesn't get the full effect. Without the full effect, you can't know that your edits and adjustments are actually correct. This is not complicated.
What are you talking about? The Macs already ship with these monitors, and the OS already supports HDR.

If you are doing serious editing and want a true HDR monitor, sure, you can just get a true 10-bit external monitor, but in reality, most people editing 4K on these iMacs don't. They're not generally using stock iMacs to edit Hollywood movies, after all.

However, I'm talking about DRM'd 4K movie streaming, so your post doesn't even make sense. It really doesn't matter what monitor you have. It can even be SDR as long as it's dithered properly. That said, the iMacs have greater-than-usual brightness and wide colour gamut, so watching a 4K HDR stream on a 2017 5K iMac would be a significant improvement over older technology.

If the TV companies were as purist as you two guys seem to be, about 90% of 4K HDR TVs out there would have to be thrown in the garbage.

I just want to be able to watch 4K streams, some of which I've already paid for, on my iMac, but I can't, because Apple hasn't yet incorporated the DRM support in the OS. The ironic part is you CAN already do this on an iMac or MacBook Pro, but you have to install Windows 10 to do it.

The other thing I will point out again is that I can ALREADY decode 10-bit HDR material on my iMac, because Apple has already built the support into High Sierra. 10-bit HDR HEVC is fully hardware accelerated in both High Sierra and Mojave, and it even works on my 12" SDR MacBook. The problem is the DRM, which so far Apple does not support.
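If anyone wants to check this on their own machine, a rough way to exercise the decoder from Terminal (this assumes ffmpeg is installed, e.g. via Homebrew, and "sample.mp4" is a placeholder for a 10-bit HEVC file you have on hand):

```shell
# List the hardware acceleration methods this ffmpeg build knows about;
# on a supported Mac the list should include "videotoolbox".
ffmpeg -hwaccels

# Decode the file through VideoToolbox and discard the output.
# If this runs well above real-time speed, hardware decode is doing the work.
ffmpeg -hwaccel videotoolbox -i sample.mp4 -f null -
```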

---

tl;dr:

Those who say Apple won't be supporting 4K HDR video streaming because the monitors are not true HDR are barking up the wrong tree, because Apple ALREADY supports 4K HDR decode on Macs, and that support is built right into the OS.

The main thing that is preventing 4K streaming right now is 4K DRM support. The hardware is all there, but macOS doesn't support it, even though Windows 10 on the exact same Macs works just fine for this.

I can only hope Apple will release this on some version of Mojave in the future, say after the 2018 iMacs and MacBook Pros come out.
 

Ichinisan

Lifer
Oct 9, 2002
I thought it was hilarious when they announced the name. Does anyone remember Windows Mojave?
 

Eug

Lifer
Mar 11, 2000
macOS 10.14 supports:
  • MacBook (Early 2015 or newer)
  • MacBook Air (Mid 2012 or newer)
  • MacBook Pro (Mid 2012 or newer)
  • Mac mini (Late 2012 or newer)
  • iMac (Late 2012 or newer)
  • iMac Pro (2017)
  • Mac Pro (Late 2013, plus mid 2010 and mid 2012 models with a recommended Metal-capable GPU)
https://9to5mac.com/2018/06/04/macos-10-14-mojave-supported-macs/
Looks like support for both OpenGL and OpenCL is deprecated in macOS starting with version 10.14 (bottom of the page):

https://developer.apple.com/macos/whats-new/

This could cause problems for those who use Adobe apps on macOS. This is also going to cause issues with future games being released on macOS. Then again, the Mac was never a serious gaming platform to begin with.
This also causes problems for owners of older Macs who want to hack Mojave onto them. In Mojave, my understanding is that the UI is now Metal-accelerated, so those who have managed to get Mojave onto their 2011 Macs are not getting any OS UI acceleration. This effectively makes it useless on those old Macs.
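If you're not sure whether a given Mac's GPU counts as Metal-capable, a quick check with the stock system_profiler tool:

```shell
# On a Metal-capable GPU this typically prints a line such as
# "Metal: Supported, feature set macOS GPUFamily1 v3".
# No "Metal" line in the output suggests the GPU isn't supported.
system_profiler SPDisplaysDataType | grep -i metal
```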

There is a little bit of good news though: The guys who write the hacks say the OpenGL renderer is still actually there in Mojave, but it's been turned off permanently. They are trying to turn it back on, and if they can, that means that 2009 iMacs and 2011 MacBook Pros might be able to run Mojave.

I'm not going to bother. I'm going to keep my 2008 MacBook, 2009 MacBook Pro, and 2010 iMac all on High Sierra. The two laptops are not officially supported but work fine on High Sierra, and the iMac is fully supported on High Sierra. I'm happy enough, since High Sierra gives me HEIF/HEVC and APFS compatibility, and they'll get security updates probably until 2020.
 

sweenish

Diamond Member
May 21, 2013
What are you talking about? The Macs already ship with these monitors, and the OS already supports HDR.

If you are doing serious editing and want a true HDR monitor, sure, you can just get a true 10-bit external monitor, but in reality, most people editing 4K on these iMacs don't. They're not generally using stock iMacs to edit Hollywood movies, after all.

However, I'm talking about DRM'd 4K movie streaming, so your post doesn't even make sense. It really doesn't matter what monitor you have. It can even be SDR as long as it's dithered properly. That said, the iMacs have greater-than-usual brightness and wide colour gamut, so watching a 4K HDR stream on a 2017 5K iMac would be a significant improvement over older technology.

If the TV companies were as purist as you two guys seem to be, about 90% of 4K HDR TVs out there would have to be thrown in the garbage.

I just want to be able to watch 4K streams, some of which I've already paid for, on my iMac, but I can't, because Apple hasn't yet incorporated the DRM support in the OS. The ironic part is you CAN already do this on an iMac or MacBook Pro, but you have to install Windows 10 to do it.

The other thing I will point out again is that I can ALREADY decode 10-bit HDR material on my iMac, because Apple has already built the support into High Sierra. 10-bit HDR HEVC is fully hardware accelerated in both High Sierra and Mojave, and it even works on my 12" SDR MacBook. The problem is the DRM, which so far Apple does not support.

---

tl;dr:

Those who say Apple won't be supporting 4K HDR video streaming because the monitors are not true HDR are barking up the wrong tree, because Apple ALREADY supports 4K HDR decode on Macs, and that support is built right into the OS.

The main thing that is preventing 4K streaming right now is 4K DRM support. The hardware is all there, but macOS doesn't support it, even though Windows 10 on the exact same Macs works just fine for this.

I can only hope Apple will release this on some version of Mojave in the future, say after the 2018 iMacs and MacBook Pros come out.

Yeah, how dare I want to satisfy all the requirements of a spec.
 

PeterScott

Platinum Member
Jul 7, 2017
Looks like support for both OpenGL and OpenCL is deprecated in macOS starting with version 10.14 (bottom of the page):

https://developer.apple.com/macos/whats-new/

This could cause problems for those who use Adobe apps on macOS. This is also going to cause issues with future games being released on macOS. Then again, the Mac was never a serious gaming platform to begin with.

Does this mean all legacy OpenGL games for the Mac are effectively dead on Mojave?
 

secretanchitman

Diamond Member
Apr 11, 2001
What hardware are they dropping support for? No more C2D Macs right?

Pretty much. 2012 and newer machines only at this point, with the exception of the 2010/2012 Mac Pros with Metal-capable GPUs.

The day Apple puts 32 GB of RAM and an Nvidia GPU in the 15" MBP, I will buy it. Otherwise I am more than happy with my Late 2013 15" MBP.
 

sportage

Lifer
Feb 1, 2008
Any comment on the macOS Mojave update, for those that have it?
I'm considering.
Next after Mojave will be Brick. Or was it Block?
 

Eug

Lifer
Mar 11, 2000
Any comment on the macOS Mojave update, for those that have it?
I'm considering.
No Photos update. :( I used to upgrade sooner rather than later to take advantage of updates to Photos.

Mojave looks bad on non-Retina screens, since they dropped sub-pixel rendering. However, you can turn sub-pixel rendering back on with a Terminal command.
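For reference, the command that's been circulating for this (strictly speaking it re-enables global font smoothing rather than true subpixel antialiasing; it changes a system-wide preference, so verify before running, and log out and back in afterwards):

```shell
# Widely circulated Mojave workaround: turn global font smoothing back on
# after Apple dropped subpixel rendering in 10.14.
defaults write -g CGFontRenderingFontSmoothingDisabled -bool NO
```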

For some reason, my 2017 iMac with Mojave isn't automatically sleeping consistently. So, there are bugs to be worked out.

I like some of the new UI features, for example, where files of the same type can be bundled together on the desktop to save space.
 

sportage

Lifer
Feb 1, 2008
It's not working well.
I have a powerful i7 Mac with 12 GB, and apps often hang or freeze, requiring a force-quit.
When scrolling through drives with a lot of data, 500 files or so, the scrolling hangs and stops.
Now the question is: revert back to High Sierra, or hope/wait for a fix update to Mojave?
I'd expect this from MS Windows, not from Apple.
 

Eug

Lifer
Mar 11, 2000
It's not working well.
I have a powerful i7 Mac with 12 GB, and apps often hang or freeze, requiring a force-quit.
When scrolling through drives with a lot of data, 500 files or so, the scrolling hangs and stops.
Now the question is: revert back to High Sierra, or hope/wait for a fix update to Mojave?
I'd expect this from MS Windows, not from Apple.
I don't have this problem. Core i5-7600 iMac with 1 TB SSD and 24 GB RAM.
 

ultimatebob

Lifer
Jul 1, 2001
Upgrading to Mojave pretty much broke iCloud synchronization on my Mac. I no longer see new photos that were taken by my phone in Photos.