
Microsoft gets the Linux jitters

VinDSL

SOURCE: http://www.crn.com/software/219100164 (ChannelWeb - The Channel Wire)

Microsoft Admits That Linux Desktop Elephant Exists
It's no secret that Linux has had a major impact on Microsoft's server business, but Microsoft hasn't said much about the potential effects of desktop Linux. That changed recently when Microsoft admitted that Linux on desktops and notebooks poses competitive threats to its Windows client business [...]

EXTRA CREDIT: http://www.sec.gov/Archives/ed...9312509158735/d10k.htm (Microsoft's 10-K filing)
 
There are a couple of differences between servers and desktops that make it tougher for Linux desktops to compete with Windows desktops:

1) The prices of Windows Server licenses and server applications are considerably higher than those of the desktop versions.

2) Most computer buyers are paying very little for Windows, since they purchase pre-built PCs with factory OEM installations. In XP, that cost has been said to be in the $25 range. With the Win7 pre-orders and the Family Pack, MS is hinting at some long-term reductions in prices for the desktop editions sold to consumers. If MS continues to offer $50 Win7 licenses, it's going to greatly reduce the perception that folks are paying a lot for Windows.

3) Server administrators are much more likely to be able to manage Linux systems than consumers.

With that said, with all of Microsoft's legal battles, I'm sure they don't want another one by neglecting to mention all competitive factors in their SEC filings.
 
Biggest threat imo is in the netbook market. These are small, modest devices tailored to basic functionality like browsing the internet. In emerging markets, people with limited wealth may purchase a Linux-based netbook.
 
1) The prices of Windows Server licenses and server applications are considerably higher than those of the desktop versions.

And in many cases RHEL licenses cost about the same as, or more than, a Windows license. The only real difference is the CALs.

3) Server administrators are much more likely to be able to manage Linux systems than consumers.

I would say that "much more likely" is an exaggeration. I can count the number of people I know that have even minimal Linux knowledge on one hand.

Biggest threat imo is in the netbook market. These are small, modest devices tailored to basic functionality like browsing the internet. In emerging markets, people with limited wealth may purchase a Linux-based netbook.

End user wealth has nothing to do with it. On a device that costs $100-$200 paying even a $50 licensing fee for Windows is huge.
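The arithmetic behind that point is worth making explicit. A minimal sketch, using the rough prices quoted in this thread (not official OEM pricing):

```python
# License fee as a share of device price. The fee and price figures are
# the thread's rough estimates, not official OEM pricing.

def license_share(license_fee: float, device_price: float) -> float:
    """Return the license fee as a percentage of the device price."""
    return 100.0 * license_fee / device_price

# A ~$50 Windows fee on a cheap netbook vs. the rumored ~$25 XP OEM
# fee on a typical desktop:
print(license_share(50, 100))    # 50.0 -> half the cost of a $100 netbook
print(license_share(50, 200))    # 25.0 -> a quarter of a $200 netbook
print(license_share(25, 1000))   # 2.5  -> barely visible on a $1000 desktop
```

Whatever the exact figures, a fixed license fee dominates at netbook price points, which is the point being made.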

And in the end the OS will become a commodity. There's no reason users should have to worry about the OS their device runs; they just care about the capabilities of the device and the apps they want to use. If Linux could run Windows and OS X apps as well as Windows and OS X themselves do (which is partially true, since WINE works well in a lot of cases and tons of the apps people run on OS X are ported from Linux), what would be the point of paying for Windows or OS X?

If Java had actually done its job we might be at that point already, but we're not, and it's not looking like desktop Java is going to do anything major. How ironic would it be if it was .NET/Mono that edged out Windows on the desktop?
 
This is a dumb story. Microsoft had something similar in their 2003 10-K, never mind that they clearly acknowledged this internally as early as 1998.

Yea, for them it's mainly a way to say "See? We don't have complete control over everything!".
 
The "cancer" isn't a laughing matter any more - or is it? :laugh:

SOURCE: http://www.itwire.com/content/view/26746/1231/ (iTWire - Microsoft blames open source for revenue fall)

Microsoft saw revenue fall to $US13.10 billion. This is the second straight quarter for which sales graphs have shown a downward trend.

While profits fell across its business, the biggest drop was in its client division - 33 percent.

The software giant made mention of GNU/Linux - what CEO Steve Ballmer once referred to as a cancer - as one of the factors in its 10-K filing to the US Securities and Exchange Commission [...]
 
I'd like to see Apple and Microsoft pick up Linux for the core of their operating systems. That way software compatibility is no longer an issue, and the companies compete on value.

Of course, the way both operate, they'd develop proprietary frameworks that only function on their versions of Linux anyway and require a complete redesign to be ported.
 
Of course, the way both operate, they'd develop proprietary frameworks that only function on their versions of Linux anyway and require a complete redesign to be ported.

Apple already did just that. They took Mach and FreeBSD, removed X, and put their own display system on there so that X apps wouldn't work out of the box. They did end up shipping an X server, but the look, feel, and integration aren't there, seemingly to keep you from wanting to use X apps.
 
Originally posted by: Nothinman
Of course, the way both operate, they'd develop proprietary frameworks that only function on their versions of Linux anyway and require a complete redesign to be ported.

Apple already did just that. They took Mach and FreeBSD, removed X, and put their own display system on there so that X apps wouldn't work out of the box. They did end up shipping an X server, but the look, feel, and integration aren't there, seemingly to keep you from wanting to use X apps.
You make it sound like they used a windowing system besides X11 just to spite a bunch of people.
 
Originally posted by: Nothinman
You make it sound like they used a windowing system besides X11 just to spite a bunch of people.

Besides lock-in, what other reason did they have?
Is this some kind of joke? NextStep didn't even use X11; it had its own engine based on Display PostScript. The foundation of all this stuff was laid down nearly a decade before development of Mac OS X proper even began. What you see with Quartz is a heavily overhauled, albeit direct, descendant of NextStep's windowing system.

I suppose Apple could have dropped what would become Quartz for X11, but doing so wouldn't have made any sense. Quartz was far ahead of where X11 was in 1999. Here's a good Ars Technica article talking about Quartz as it was back in 2000, when Apple was first showing it off, just to give you an idea of where it stood compared to X11.

But in short, X11 sucked. If you had something better, why would you want to use it if you didn't need to? A separate X11 application allowed Mac OS X to increase its compatibility with traditional *nix applications without locking Mac OS X into an inferior windowing system.
 
But in short, X11 sucked. If you had something better, why would you want to use it if you didn't need to? A separate X11 application allowed Mac OS X to increase its compatibility with traditional *nix applications without locking Mac OS X into an inferior windowing system.

Because it's the standard. X11 has its problems, but it's what all Unix systems except OS X use. How about, instead of creating intentional incompatibility, working with X.Org to fix the problems that affected them? That's what many other vendors have decided to do, and it actually benefits everyone, not just one company.

X11 is an afterthought for Apple, IIRC it wasn't installed by default until a release or two ago and I think it wasn't even on the disc until a release or two before that.

As much as Apple talks about their "most advanced operating system" that's "built on rock-solid UNIX," they sure as hell give the Unix parts the back seat.
 
Originally posted by: Nothinman
But in short, X11 sucked. If you had something better, why would you want to use it if you didn't need to? A separate X11 application allowed Mac OS X to increase its compatibility with traditional *nix applications without locking Mac OS X into an inferior windowing system.

Because it's the standard. X11 has its problems, but it's what all Unix systems except OS X use. How about, instead of creating intentional incompatibility, working with X.Org to fix the problems that affected them? That's what many other vendors have decided to do, and it actually benefits everyone, not just one company.

X11 is an afterthought for Apple, IIRC it wasn't installed by default until a release or two ago and I think it wasn't even on the disc until a release or two before that.

As much as Apple talks about their "most advanced operating system" that's "built on rock-solid UNIX," they sure as hell give the Unix parts the back seat.
I'm really not too sure how to respond to this.

You're talking about Apple, a company that has absolutely no problem throwing old technology aside when it comes to doing the next big thing. They threw Classic aside, then PPC, and Carbon. This is all just in the last decade. Apple's not a company that cares much for standards or backwards compatibility, beyond providing an immediate bridge to get their core users to the next big thing.

Just because X11 is a de-facto standard for *nix systems doesn't mean it's a good one. Much of what's in that old Ars article still applies today. The fastest way to bring X11 up to Quartz's level would have been to trash it, which would have accomplished the same thing. Certainly, trying to shoehorn everything Quartz can do into X11 would have taken many years longer.

XFree86 development was notoriously closed and slow, which is why it forked into X.Org in 2004. Apple probably would have needed to fight every step of the way just to get half their changes through.

I don't mean to knock on X11, because that's not what this is supposed to be about. But I have to reiterate my point: why would you want to use X11 when you had something better? There's no sense in keeping yourself from building a better operating system just because it requires breaking a few eggs.

Your complaints strike me as being less about the fact that Apple didn't use X11, and more about the fact that they didn't open source the entire OS and give it away. And that's an entirely different argument.
 
Originally posted by: RebateMonger
Also, how can a product (Linux) that only has a two-percent market share cause a 33-percent drop in sales of Windows? It doesn't add up.
Maybe someone was lying! 😀

Might be more than two-percent - you think?

EDIT

Hrm...

I was just looking at the AT System Rig Stats:

http://www.anandtech.com/systemrig_stats.aspx

Looks like 10.4% to me - and this ain't exactly a Linux hippy hangout, you know?
 
Originally posted by: VinDSL
Originally posted by: RebateMonger
Also, how can a product (Linux) that only has a two-percent market share cause a 33-percent drop in sales of Windows? It doesn't add up.
Might be more than two-percent - you think?
Look harder. Market surveys of desktop OSes all say pretty much the same thing:

PC 89%
Mac 10%
Linux 1%

Examples:
Datamation
Linuxhelp.blogspot.com (no source shown, note this is a Linux-friendly site)

Netbook Linux market share started at near-100%, and is now at 10%.

I was just looking at the AT System Rig Stats:
http://www.anandtech.com/systemrig_stats.aspx
Surveys of AnandTech users have nothing to do with the real world of PC use. This survey implies that Macintosh market share is less than 1%. In 25 years, I've never SEEN a Linux or Unix desktop PC (other than my own). But I've seen lots of Macs.
 
Originally posted by: RebateMonger
Originally posted by: VinDSL
SOURCE: http://www.itwire.com/content/view/26746/1231/ (iTWire - Microsoft blames open source for revenue fall)
Or, it could be because we've been in a severe recession for two years. Computer replacements are a luxury item for consumers.

Also, how can a product (Linux) that only has a two-percent market share cause a 33-percent drop in sales of Windows? It doesn't add up.

It's hard to quantify Linux's actual market share, given that most Linux installs are probably self-installed. Probably fewer than 1 million computers a year are sold preinstalled with Linux.

Looking at unsubstantiated stats, Ubuntu (and its derivatives) alone is reported as having around 100 million unique IP downloads since its inception in 2005 or so. If 1 out of 10 of those is an actual Ubuntu user, that's still 10 million users. But even if it's 100 million, that's still out of 1 billion PCs worldwide, so market share is somewhere between 1% and 10%.

Based on browsing statistics, Ubuntu can be extrapolated to have about 2-3% market share, with the rest of Linux distros making up <=1%. So basically, around what you said.

The last report out of Ubuntu is 8 million active installs (using the update manager). So going by that, you're probably right about all of Linux having 2% market share worldwide, meaning it just passed Windows 2000 in usage.
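The back-of-envelope estimates above can be laid out explicitly. A quick sketch, using only the thread's own (unsubstantiated) figures:

```python
# Rough Linux desktop market-share bounds from the figures quoted in the
# thread: ~100M unique-IP Ubuntu downloads, ~8M active installs reported
# via the update manager, and ~1 billion PCs worldwide.

def market_share(active_installs: int, total_pcs: int) -> float:
    """Return market share as a percentage of all PCs."""
    return 100.0 * active_installs / total_pcs

TOTAL_PCS = 1_000_000_000
DOWNLOADS = 100_000_000

low = market_share(DOWNLOADS // 10, TOTAL_PCS)   # 1 in 10 downloads is a real user
high = market_share(DOWNLOADS, TOTAL_PCS)        # every download is a real user
reported = market_share(8_000_000, TOTAL_PCS)    # Ubuntu's active-install figure

print(f"{low:.1f}% to {high:.1f}%, reported: {reported:.1f}%")
# -> 1.0% to 10.0%, reported: 0.8%
```

The spread between the download-based bounds and the active-install figure is exactly why these numbers are so hard to pin down.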

In 25 years, I've never SEEN a Linux or Unix desktop PC (other than my own). But I've seen lots of Macs.

Having just graduated college, I've seen about equal numbers of Macs and Linux installs (almost all Ubuntu) among individual users. Among our computer labs, there's roughly equal Unix/Mac share, since the engineering school had at least one Solaris lab, CS had a Gentoo install IIRC, and Astronomy had a couple of Red Hat labs. Besides that, Linux was a favorite for the cluster computing projects that were always popping up.
 
Just because X11 is a de-facto standard for *nix systems doesn't mean it's a good one. Much of what's in that old Ars article still applies today. The fastest way to bring X11 up to Quartz's level would have been to trash it, which would have accomplished the same thing. Certainly, trying to shoehorn everything Quartz can do into X11 would have taken many years longer.

Then don't put together ads calling the OS a "64-bit UNIX workstation" when it's neither 64-bit nor Unix.

They use VNC for remote control even though it sucks; what happened there? They used UFS and HFS even though both are pretty bad; what happened there? If they didn't consider it too much work to implement Quartz instead of fixing X, why didn't they come up with new, better filesystems and a remote display protocol for Quartz?

Your complaints strike me as being less about the fact that Apple didn't use X11, and more about the fact that they didn't open source the entire OS and give it away. And that's an entirely different argument.

Not so much, although I obviously would've preferred an actually free system. The main problem is that they keep saying "OMG its so easy because it's a Mac and it's so powerful because it's UNIX (except here and here and here...)".
 
Originally posted by: Nothinman
Then don't put together ads calling the OS a "64-bit UNIX workstation" when it's neither 64-bit nor Unix.

They use VNC for remote control even though it sucks; what happened there? They used UFS and HFS even though both are pretty bad; what happened there? If they didn't consider it too much work to implement Quartz instead of fixing X, why didn't they come up with new, better filesystems and a remote display protocol for Quartz?
Correct me if I'm wrong, but as far as I know there's nothing in the Unix standard about X11. You don't need X to be Unix. So it's most assuredly Unix, and The Open Group (the guys who control the standard) seem to agree.

You do have a point on 64-bit though. It has 64-bit abilities, but the lack of a real 64-bit kernel is off-putting.

As for those other things, Apple could probably make something better, but with items like those, what's the point? VNC for example - they don't do terminal services. VNC is only used sparingly for Apple Remote Desktop. Something better would be nice, but probably not a good use of time/resources. As for HFS+, what's wrong with it? Granted in 2001 it didn't have journaling or ACLs, but then again at that point home computers were still using FAT32 and Linux was on EXT2.
 
Correct me if I'm wrong, but as far as I know there's nothing in the Unix standard about X11. You don't need X to be Unix. So it's most assuredly Unix, and The Open Group (the guys who control the standard) seem to agree.

Just like a car without brakes is still a car. Hell, IIRC years ago NT4 qualified for some level of Unix certification because it had a working POSIX subsystem.

Sadly the "UNIX" trademark that you get from the Open Group is just something that you buy and isn't really worth anything. They're about as relevant as the XFree86 people these days.

You do have a point on 64-bit though. It has 64-bit abilities, but the lack of a real 64-bit kernel is off-putting.

Actually the kernel has to be 64-bit in order to run 64-bit binaries. The real problem is that initially, when they were doing the "OMG 64-bit workstation!" ads it was just the kernel and some base libraries so the system could use all of your memory but you couldn't run any worthwhile 64-bit apps. It was just with the last release or so that all (most?) of the core system was finally brought up to 64-bit.

As for those other things, Apple could probably make something better, but with items like those, what's the point? VNC for example - they don't do terminal services. VNC is only used sparingly for Apple Remote Desktop. Something better would be nice, but probably not a good use of time/resources.

But lots of people remote control their home PCs. I RDP into my work Windows machine all the time, but if it were VNC I'd never even bother, since it's bad even on a LAN, let alone over the Internet. And along those same lines, what was the point of Quartz? X11 worked and still works. Back in the late 90s, when they started this, there wasn't direct rendering for OpenGL, but it's not like Macs are a big gaming platform.

As for HFS+, what's wrong with it? Granted in 2001 it didn't have journaling or ACLs, but then again at that point home computers were still using FAT32 and Linux was on EXT2.

You just named two fairly major things wrong with it (ACLs less so, since Macs are usually single-user). And according to Wikipedia, ext3 was started in 1998 and was actually merged into the mainline kernel in 2001. And MS was shipping NT 3.1 with NTFS back in 1993.

The point is just that in so many ways they used what was "good enough" from the previous system and added on as they went, but for some reason decided X wasn't worth their time.
 
Originally posted by: Nothinman
Just like a car without brakes is still a car. Hell, IIRC years ago NT4 qualified for some level of Unix certification because it had a working POSIX subsystem.

Sadly the "UNIX" trademark that you get from the Open Group is just something that you buy and isn't really worth anything. They're about as relevant as the XFree86 people these days.
As far as I know, NT4 was never Unix certified. It was POSIX certified, however (albeit to an old version of the standard, and without access to the Win32 APIs).

I don't think the UNIX trademark is as useless as you consider it though. If it was just a matter of money, there would be more than a handful of UNIX'03 certified systems out there. Meanwhile everyone else but Microsoft chases UNIX, even if they can't get certified.

And along those same lines, what was the point of Quartz? X11 worked and still works. Back in the late 90s, when they started this, there wasn't direct rendering for OpenGL, but it's not like Macs are a big gaming platform.
Didn't we just go over this? The point of Quartz was to be a better windowing system than X11. And bear in mind that OpenGL was a big part of that - Mac OS X 10.2 was released a little more than a year after 10.0, at which point it added Quartz Extreme for GPU compositing, for which OpenGL would have been essential. Quartz Extreme was in the plan for a long time, even if it wasn't ready until 2002.

You just named two fairly major things wrong with it (ACLs less so, since Macs are usually single-user). And according to Wikipedia, ext3 was started in 1998 and was actually merged into the mainline kernel in 2001. And MS was shipping NT 3.1 with NTFS back in 1993.
This of course was all back in March of 2001; ext3 didn't launch until November, according to ye' olde Wikipedia. They wouldn't have been able to launch with ext3 even if they had wanted to go that direction. And NTFS only got journaling back in 2000. Instead they added journaling in 10.2 (though I guess they could have copied MS and done it for 10.0 if need be). I'm not immediately sure when ACLs were added to Mac OS X, though (or Linux, for that matter).

The point is just that in so many ways they used what was "good enough" from the previous system and added on as they went, but for some reason decided X wasn't worth their time.
It strikes me that if they took the bits that were "good enough" and didn't take X, then X probably wasn't "good enough". If they just wanted to break compatibility, as you claim, they could have easily made an X knock-off with enough changes to break compatibility without creating so much work for themselves.

We can go over this again and again, but ultimately X was in bad shape in 1999 when they had NextStep's Display Postscript to build a successor from. The only other thing I can do is point out this excellent Slashdot post from 2003. It's Mike Paquette, the designer of Quartz, explaining why they didn't use X.
 
I don't think the UNIX trademark is as useless as you consider it though. If it was just a matter of money, there would be more than a handful of UNIX'03 certified systems out there. Meanwhile everyone else but Microsoft chases UNIX, even if they can't get certified.

I don't know of a single person who considers the trademark or certification worth anything. If it's for something big like Oracle, they care about support, so they either use Solaris or Linux, mostly the latter these days; and if it's something free, then they don't even care about that.

Didn't we just go over this? The point of Quartz was to be a better windowing system than X11. And bear in mind that OpenGL was a big part of that - Mac OS X 10.2 was released a little more than a year after 10.0, at which point it added Quartz Extreme for GPU compositing, for which OpenGL would have been essential. Quartz Extreme was in the plan for a long time, even if it wasn't ready until 2002.

Maybe it was just bad timing then, but I still think it would've been better to extend/fix X rather than start over.

This of course was all back in March of 2001; ext3 didn't launch until November, according to ye' olde Wikipedia. They wouldn't have been able to launch with ext3 even if they had wanted to go that direction. And NTFS only got journaling back in 2000. Instead they added journaling in 10.2 (though I guess they could have copied MS and done it for 10.0 if need be). I'm not immediately sure when ACLs were added to Mac OS X, though (or Linux, for that matter).

I doubt they could've used ext3 unless they chose Linux, since they would've had to port it to Mach. But the main point wasn't that they should've used ext3 or NTFS, just that for some reason UFS and HFS+ were "good enough" even though they suck pretty bad.

We can go over this again and again, but ultimately X was in bad shape in 1999 when they had NextStep's Display Postscript to build a successor from. The only other thing I can do is point out this excellent Slashdot post from 2003. It's Mike Paquette, the designer of Quartz, explaining why they didn't use X.

And I guess I'll just have to concede that I don't get the point. Sure, they have a feature list that sounds really impressive, and they did have GPU offloading first, but I use X every day and it's fine. The times I've used a Mac, I never noticed anything outstanding about the UI other than the annoyances from things being different from what I'm used to.
 