Originally posted by: Nothinman
Just like a car without brakes is still a car. Hell, IIRC years ago NT4 qualified for some level of Unix certification because it had a working POSIX subsystem.
Sadly the "UNIX" trademark that you get from the Open Group is just something that you buy and isn't really worth anything. They're about as relevant as the XFree86 people these days.
As far as I know, NT4 was never UNIX-certified. It was POSIX-certified, however (albeit to an old version of the standard, and without access to the Win32 APIs).
I don't think the UNIX trademark is as useless as you make it out to be, though. If it were just a matter of money, there would be more than a handful of UNIX 03 certified systems out there. Meanwhile, everyone but Microsoft chases UNIX, even when they can't get certified.
And along those same lines what was the point of Quartz? X11 worked and still works. Back in the late 90s when they started this there wasn't direct rendering for OpenGL but it's not like Macs are a big gaming platform.
Didn't we just go over this? The point of Quartz was to be a better windowing system than X11. And bear in mind that OpenGL was a big part of that: Mac OS X 10.2, released a little more than a year after 10.0, added Quartz Extreme for GPU compositing, for which OpenGL was essential. Quartz Extreme had been in the plan for a long time, even if it wasn't ready until 2002.
You just named two fairly major things wrong with it (ACLs less so, since Macs are usually single-user). And according to Wikipedia, ext3 was started in 1998 and actually merged into the mainline kernel in 2001. And MS was shipping NT 3.1 with NTFS back in 1993.
This of course was all back in March of 2001; ext3 didn't land until November of that year, according to ye' olde Wikipedia, so they couldn't have launched with ext3 even if they had wanted to go that direction. And NTFS didn't gain its change journal until Windows 2000. Instead they added journaling in 10.2 (though I suppose they could have copied MS and done it for 10.0 if need be). I'm not immediately sure when ACLs were added to Mac OS X, though (or Linux, for that matter).
The point is just that in so many ways they used what was "good enough" from the previous system and added on as they went, but for some reason decided X wasn't worth their time.
It strikes me that if they take the bits that are "good enough" and don't take X, then X probably isn't "good enough". If they just wanted to break compatibility, as you claim, they could easily have made an X knock-off with enough changes to break compatibility without creating so much work for themselves.
We can go over this again and again, but ultimately X was in bad shape in 1999, when they had NeXTSTEP's Display PostScript to build a successor from. The only other thing I can do is point to this excellent Slashdot post from 2003: it's Mike Paquette, the designer of Quartz, explaining why they didn't use X.