And you thought Linux wasn't ready for the desktop


n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: hypersonic5
Originally posted by: n0cmonkey
Originally posted by: hypersonic5
Originally posted by: kylef
As far as Linux on the desktop: I think the biggest thing holding back Linux support on the desktop is device support, plain and simple.

That's what's holding me back :p

You're holding you back.

Yes, because obviously it's my fault Linux is a bitch to get to recognize my hardware.

To explain my point a bit further, I'M NOT PSYCHIC. In your thread, you mention wireless, SuSE 9.1, and what else? The fact it isn't working. No hardware list and minimal information on what you've tried. Doing the best I can. :)

The truth is, if a fairly experienced computer user like me can't get Linux to work right, do you think that a total noob who has never used a computer before will adopt Linux?

They have no basis for comparison, so why would they dismiss Linux?

I'm planning on installing SuSE 9.1 in a couple days (I think I get a day off this week :Q). I'll make the extra effort and throw in a wireless card (I was planning on doing this anyhow, since I've never done wireless on Linux). Can't offer much more before then. Well, no more than searching google.com/linux and posting links. ;)
 

kylef

Golden Member
Jan 25, 2000
1,430
0
0
Originally posted by: n0cmonkey
Fixing hardware in software is horrible.
It's not horrible! What's the alternative? Tell people their device is broken and can't be fixed?

According to insider reports, there were over 100 bugs in the GeForce NV350 GPU. These were all fixed, worked around, or otherwise addressed in drivers. I cite this not to single out nVidia, but to show you how essential this practice is to modern device vendors. It is simply not feasible to fix these bugs in hardware until the next revision of the silicon masks comes out. In many cases, devices are becoming more and more like general purpose computers themselves, and "hardware" bugs are no more than software bugs.

Besides, this practice is as tried and true in the computer world as microcode itself. You used to be able to program the microcode of older CPUs (in the late 70s, early 80s) to fix all sorts of issues.
Plus, they could release firmware updates and have people flash the things.
End-user firmware updates for PC peripherals are "old and busted." There is absolutely no reason why users should be expected to track both firmware AND driver updates for a single device. It's hard enough to get people to update drivers when a bug is found. But flashing firmware? Fuhgetaboudit. If both are rolled into the driver, though, and the driver is available on Windows Update, it's remarkably easy...

Intel wants to be nice, so they should get some support from the community. Other vendors don't want to play, so they won't get the kind of support Intel does.
This is the kind of unprofessional "you're either with us or against us" attitude that makes device vendors wary of open source. I know because I've dealt with many.
The solution is to support good hardware vendors with cash.
In a commodity market, how will the mere 1% of PC users running Desktop Linux be able to persuade the vendors with the lure of money?

No one has to release code. The documentation would be good enough. There cannot be non-free code in the Linux kernel. Period. The end.

Again: if I were Mr. Device Vendor with >$10 million of IP invested in my device architecture, what's the likelihood that I'm going to give the world a blueprint of my device detailed enough to write a driver that interacts with this architecture extensively? I'll tell you this: it's not high. And that's why vendor-endorsed open source graphics drivers are quickly disappearing. ATI now only supports 2D in the XFree86 project on its 9000-series cards, for instance. If you want 3D, you use ATI's closed-source drivers (which I'm actually surprised they update as frequently as they do).

And if you want to use binary drivers, don't ask the Linux guys for help. They won't.
Well, then the "no Linux support" situation is mutual, which I find stupid on the part of the Linux community. It should be the Linux guys begging vendors for drivers and jumping at the opportunity to "help" them by providing as much support for what vendors want as possible. Otherwise, you can say goodbye to a lot of device vendor support. And this WILL hurt, no matter how much you disparage them by making fun of where their devices are sold and how well their drivers work in Windows.

And please don't take these opinions personally. I have simply laid out from my experience why I think the Linux driver situation is not vendor-friendly. :)
 

MournSanity

Diamond Member
Feb 24, 2002
3,126
0
0
Originally posted by: n0cmonkey
I'm planning on installing SuSE 9.1 in a couple days (I think I get a day off this week :Q). I'll make the extra effort and throw in a wireless card (I was planning on doing this anyhow, since I've never done wireless on Linux). Can't offer much more before then. Well, no more than searching google.com/linux and posting links. ;)

Tell me how that goes ;)
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: hypersonic5
Originally posted by: n0cmonkey
I'm planning on installing SuSE 9.1 in a couple days (I think I get a day off this week :Q). I'll make the extra effort and throw in a wireless card (I was planning on doing this anyhow, since I've never done wireless on Linux). Can't offer much more before then. Well, no more than searching google.com/linux and posting links. ;)

Tell me how that goes ;)

Will do. If it offers any insights, I'll definitely post them. :)
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: kylef
Originally posted by: n0cmonkey
Fixing hardware in software is horrible.
It's not horrible! What's the alternative? Tell people their device is broken and can't be fixed?

As I said, firmware upgrades. You mention that nVidia fixes things in the driver; that sounds like a fine solution too.

According to insider reports, there were over 100 bugs in the GeForce NV350 GPU. These were all fixed, worked around, or otherwise addressed in drivers. I cite this not to single out nVidia, but to show you how essential this practice is to modern device vendors. It is simply not feasible to fix these bugs in hardware until the next revision of the silicon masks comes out. In many cases, devices are becoming more and more like general purpose computers themselves, and "hardware" bugs are no more than software bugs.

Besides, this practice is as tried and true in the computer world as microcode itself. You used to be able to program the microcode of older CPUs (in the late 70s, early 80s) to fix all sorts of issues.
Plus, they could release firmware updates and have people flash the things.
End-user firmware updates for PC peripherals are "old and busted." There is absolutely no reason why users should be expected to track both firmware AND driver updates for a single device. It's hard enough to get people to update drivers when a bug is found. But flashing firmware? Fuhgetaboudit. If both are rolled into the driver, though, and the driver is available on Windows Update, it's remarkably easy...

We aren't talking about Windows users here.

Intel wants to be nice, so they should get some support from the community. Other vendors don't want to play, so they won't get the kind of support Intel does.
This is the kind of unprofessional "you're either with us or against us" attitude that makes device vendors wary of open source. I know because I've dealt with many.

There is no choice here. Play nice, or don't play at all.

The solution is to support good hardware vendors with cash.
In a commodity market, how will the mere 1% of PC users running Desktop Linux be able to persuade the vendors with the lure of money?

They won't, but the people running it on their servers might be able to back up the important stuff.

No one has to release code. The documentation would be good enough. There cannot be non-free code in the Linux kernel. Period. The end.

Again: if I were Mr. Device Vendor with >$10 million of IP invested in my device architecture, what's the likelihood that I'm going to give the world a blueprint of my device detailed enough to write a driver that interacts with this architecture extensively? I'll tell you this: it's not high. And that's why vendor-endorsed open source graphics drivers are quickly disappearing. ATI now only supports 2D in the XFree86 project on its 9000-series cards, for instance. If you want 3D, you use ATI's closed-source drivers (which I'm actually surprised they update as frequently as they do).

This is where patents come into play, not to mention copyrights.

And if you want to use binary drivers, don't ask the Linux guys for help. They won't.
Well, then the "no Linux support" situation is mutual, which I find stupid on the part of the Linux community.

How are Linux developers going to help fix a problem if they cannot see the code? It could be in the kernel proper, or it could be this piece of binary junk the user downloaded from nVidia, or this one downloaded from VIA, or this one downloaded from RealTek, or this one... If all of the drivers are Free, this isn't as much of a problem. The developers can do what they need to do without having to wonder if one of those lumps of goo is breaking something.

It should be the Linux guys begging vendors for drivers and jumping at the opportunity to "help" them by providing as much support for what vendors want as possible.

They do. The kernel sources are there, their email addresses aren't private, the mailing lists are available to all, and they will accept all documentation they can get their hands on to write proper Free drivers. Hell, there have even been reports of Linux developers signing NDAs (which I think is stupid, but the developers of my OS of choice agree with me, so it's ok :)).

Otherwise, you can say goodbye to a lot of device vendor support. And this WILL hurt, no matter how much you disparage them by making fun of where their devices are sold and how well their drivers work in Windows.

Huh? There are vendors that are FOSS friendly. VIA, AMD, Intel (on occasion they provide BSD-licensed code but not docs: WTF?! :confused: :p ). That's a computer RIGHT THERE. ;)

And please don't take these opinions personally. I have simply laid out from my experience why I think the Linux driver situation is not vendor-friendly. :)

If I seem like I'm taking something personally it's because I'm tired, and cranky, and work too much. I'll forget all about it when I get sleep, and I'm not trying to be mean or anything to you or others. I've held back in my responses, believe it or not. ;)
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
It cuts hardware development time and permits hardware 'firmware' fixes via driver updates.

The same thing could be accomplished by storing the firmware in flash ROM and only updating it when the drivers are updated; you could bundle them together and not even have the user realize they updated both at the same time. They already do it like that, so why not have the firmware update stick instead of having it lost on a reset?

Isn't that sort of like treating the symptom rather than treating the problem? Last time I checked, the vast majority of laptops out there were not Centrino-based. And with laptops now comprising more than 50% of all system sales, this headache will only get worse for Linux until a better solution is found.

For the most part the problem can't be fixed; the problem is hardware companies that won't give out specs so that OSS drivers can be written. There are other supported wifi chipsets; I just mentioned the Intel stuff because it seems like Intel is actually trying.

The Linux community should embrace closed-source drivers in a big way.

Won't happen; it sort of defeats the purpose of having a GPL'd OS if all the drivers are closed source. Most of the kernel developers, including Linus, are adamantly against binary-only drivers and actively work to discourage them, and I agree with them.

You can use WMI to get just about everything that ACPI exposes. If no tool exists, you can write a VBScript command shell script to get it in about 10 lines of code. But I'm not sure exactly what ACPI quantity you're referring to.

I probably don't know the correct name; I just know that when I cat /proc/acpi/battery/BAT0/state it always says "present rate: 0 mW" whether it's discharging or charging, and I'm pretty sure that's not correct. I wanted to verify whether Windows got the same number, to determine if it was a Linux ACPI problem or a BIOS ACPI problem. Luckily I've avoided VB/VBScript for the past 5 years or so, so I won't be writing any WMI VBScripts.
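For the record, this is all I'm doing on the Linux side (BAT0 and the exact field names are whatever my particular laptop exposes; yours may differ):

  # on battery, the discharge rate should be non-zero
  grep "present rate" /proc/acpi/battery/BAT0/state
  # plug the AC back in and check again; a sane ACPI BIOS should report
  # a real charge rate here too, not "0 mW" in both cases
  grep "present rate" /proc/acpi/battery/BAT0/state

If Windows reports a real number for the same thing, that would point at the Linux ACPI code rather than the BIOS.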

Ah, I didn't know about it. Linus was always so adamantly opposed to kernel debuggers that I never thought it would happen... I don't know much about it (and I'm prohibited from playing with it now)... what kind of host/target setups does it support? Is it fully aware of the kernel data structures? And does it support live kernel debugging (single-system)?

Ever used gdb? It's pretty much the same thing, only compiled into the kernel. AFAIK most people use it via a serial console, but I don't know too much about it. All I've ever used it for was to capture back traces.
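From what I've seen, the usual setup is something like this (a rough sketch; it assumes a kgdb-patched kernel on the target box and a null-modem serial cable to a second machine that has the matching vmlinux):

  # on the host/development box
  gdb ./vmlinux
  (gdb) set remotebaud 115200
  (gdb) target remote /dev/ttyS0
  (gdb) bt    # grab a backtrace once the target drops into the debugger stub

Since it's just gdb on the host side, it knows about the kernel's data structures to whatever extent the kernel was built with debugging symbols.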

Nah, I don't buy that. That's like telling people writing regular user-mode apps, "You don't need a debugger: you have your source code!" After all, the point of a kernel debugger is not to help write the kernel: it's to help driver vendors write device drivers... Linus used to call a kernel debugger an "unnecessary crutch", but that's exactly why people use debuggers: to help them solve tricky problems.

No one's selling anything; some of the devs use them and some don't, and they all seem to get their respective jobs done.

And all without forcing device vendors to open source anything.

Which I don't see as any kind of positive point.

In a commodity market, how will the mere 1% of PC users running Desktop Linux be able to persuade the vendors with the lure of money?

Obviously there's some sort of pull in the OSS market or there wouldn't be so much traffic on lkml from users with dell.com, intel.com, ibm.com, oracle.com, etc. email addresses.

Again: if I were Mr. Device Vendor with >$10 million of IP invested in my device architecture, what's the likelihood that I'm going to give the world a blueprint of my device detailed enough to write a driver that interacts with this architecture extensively?

If your hardware interfaces were abstract enough that you weren't giving away any secrets, what's the problem? I mean, what's the problem with telling everyone in the world that this is where you poke to transmit a packet, this is where you poke to receive one, this is where you poke to set the channel, etc.? You might actually enable someone else in the world to write a driver and make it so more people could use the hardware; wouldn't want that now, would we?
 

drag

Elite Member
Jul 4, 2002
8,708
0
0


Linux IS ready for the desktop. It works fine for all sorts of purposes.

You can surf. You can edit photos, you can hook up your camera. You can get wireless going. You can set up a music studio.

Anything you want.

Stuff like wireless support is not up to the people that develop the kernel.

The people that do build desktops have a wide range of devices and hardware to choose from.

I have an 802.11g wireless card. All I have to do is copy the firmware from a website into my /usr/lib/firmware directory and it's running.

Plus it's probably a better wireless card than what you have. Possibly cheaper too. Why? Because hardware that is properly supported under Linux is usually better quality than stuff that doesn't work in Linux easily. For example, it can support ad-hoc, managed, and master modes. Lots of cards can't even do that.
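To give you an idea, getting mine going is roughly this (the firmware file name, driver, and interface name are specific to my card; substitute whatever yours uses):

  # drop the firmware where the driver looks for it
  cp firmware.bin /usr/lib/firmware/
  # load the driver, pick a mode, and bring the interface up
  modprobe mydriver
  iwconfig eth1 mode Managed essid home
  ifconfig eth1 up
  dhclient eth1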

If you want ACPI that works, you can have it. If you want a wireless card that is easy to set up, you can have it. If you want a video card that is supported automatically, you can have it. SATA? No problem.

Plus, WinXP may be an improvement in stability over Win9x, but this is something that Linux had from the beginning. It had proper multiuser support from the beginning, and it has had proper security from the beginning. This is stuff that MS still can't get right.

If the only ruler you're going to compare OSes by is Microsoft's Windows, then Linux will never be ready. This is because Linux ISN'T Windows, and will never be as good at being Windows as Windows is.

But if you try to measure Windows by using Linux as the ruler, guess what? Windows looks worse than when you compare Linux using Windows as the standard.

Standard development tools available in default installs. Networking is 100x more capable than WinXP's. Stability is better. Security is much, much better. Software quality is generally better.

Gimp is a decent example of this, and so is Firefox's Gecko rendering engine. Gecko is superior to IE in almost every way possible. Gimp is better than MS Paint. It may not be better than Photoshop, but the difference is that Gimp costs nothing and Photoshop is 300 dollars. It's certainly better than the crap that you get free when you buy a scanner.

Windows defenders go on and on about how much easier it is to get hardware support in Windows, and how much easier it is to install Windows than Linux.

But that's pointless to even discuss for the vast majority of people. If you go and buy Linux desktop machines, everything is installed. Everything is working. Everything you'd want you can have; you just can't buy random POS hardware and expect it to work with no effort.

For example, look at HP's Linux offering.

Everything works. ACPI works, wireless works, the video card works. Every piece of equipment HP sells with that laptop, the one they sell along with SuSE, works right out of the box.

THAT's why Linux is ready for the desktop. It makes a fine desktop, and any normal end-user can operate it fairly well. There are a few nagging issues that are going to piss you off (such as printing support), but that's pretty flaming pointless to a person who would buy a Linux PC that comes with a printer, like many Windows computers do.

But believe me, I just installed WinXP this weekend for my folks and it took me several hours of downloading, rebooting, and installing software to get it into usable, safe, and stable working order. If I just handed a WinXP Home CD + a bunch of computer parts to my mom, that would suck just as much as if I handed her a Fedora install CD set + a bunch of computer parts.

For me, personally, Linux definitely makes a much superior desktop environment to Windows.

The business desktop is the next step for Linux. Free software is already dominating the internet. A full 1/3 of the servers sold use Linux as the operating system. Combine that with the BSDs and other Unix systems, and there are more Unix-like operating systems out there running important stuff than MS's servers. It's even more impressive when you take into account that one big Unix server can do the same work it would take dozens of Windows servers on x86 hardware to do.

For home users, MS has nothing to worry about from Linux for a long, long time. Why? Because people are set in their ways; unless they have a reason to move to another OS, they won't. Do they care that they are using WinXP? No. Do they have brand loyalty to Microsoft? No. If they have WinME, do they generally install XP over it? No. Would you? Yes, unless you're a moron.

What else has Windows going for it? People are familiar with the system, and they are familiar with the applications. Windows isn't easy to use, and people often attend classes in order to operate it. Many people have to attend classes to use Office correctly. Most people would have to attend classes to install Windows correctly.

They have no desire to redo that to learn Linux unless they have a really good reason, and that doesn't exist (yet). That's why you're not going to see any major migration from Windows.

It has much less to do with technological superiority vs. inferiority than we would like to believe. We talk about stuff like that because we are technologists. We are geeks.

After all, if technological superiority is what dictates market forces, then Firefox would be used on a hell of a lot more computers than just the 10-15% share that it currently has. ;)
 

Spencer278

Diamond Member
Oct 11, 2002
3,637
0
0
Originally posted by: n0cmonkey
Originally posted by: Spencer278
I wasn't talking about Windows at all. Just look at how many installers for different distros there are.

I hate the fact Microsoft and Apple use different installers.

Not a single one is good; they all suck. Sure, some suck less than others, but they still suck.

There is a perfect installer out there.

For example, SuSE chose the wrong refresh rate for my monitor.

And they knew what you wanted how?
They should have known that I didn't want a resolution and refresh rate greater than my monitor can support. Not to mention no one would want to run a 19 inch monitor at 2014x1400 at 60 Hz anyway.

FC lets you do the partitioning and mount points before telling you what CDs are needed, so you have to reboot the computer if you didn't burn all the CDs. The other option is to ignore the message and then get stuck in a loop prompting for the next CD. If you reboot at that point, burn the missing CDs, and choose upgrade, FC will not install the bootloader.

Bwahahaha! "Here's the software you need, but I wouldn't bother trying to be prepared, that would be silly."

Why should it be necessary to burn all the CDs when they can be mounted later without wasting a CD? But that is beside the point; the software should have been designed so that I can skip a CD, and it should have been made so that a bootloader can be installed without upgrading the kernel.

apt-get is great until the user wants a different configuration for the software

Download the source and make a dpkg or whatever. Many of the programs offer varying configurations, or at least keep things generic enough to be quite useful for plenty of people. Or back to the crappy ./configure, make, make install?

or until a distributor wants to put their software on a webpage for people to download.

They can make dpkgs too.
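For reference, the "make your own dpkg" route they keep pointing me at goes roughly like this (package name and version are just placeholders):

  # fetch the Debian source package and its build dependencies
  apt-get source somepackage
  apt-get build-dep somepackage
  cd somepackage-1.2.3
  # change debian/rules (or the source) to get the configuration you want,
  # then build and install the .deb
  dpkg-buildpackage -rfakeroot -us -uc
  dpkg -i ../somepackage_1.2.3-1_i386.deb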

 

drag

Elite Member
Jul 4, 2002
8,708
0
0
The X server polls the monitor to find out what refresh rates and resolutions it supports, then sets them accordingly.

It does this through DDC, which is a VESA standard. Basically, it's not necessarily your OS that decides what settings your monitor can support; it's your monitor telling the OS what settings it can support.
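If you want to see what the monitor actually reported, or override it, it's something like this (log and config paths here are the XFree86 ones; newer X.org releases log to /var/log/Xorg.0.log instead):

  # what the server learned over DDC/EDID at startup
  grep -i "ddc\|edid\|monitor" /var/log/XFree86.0.log
  # if the probe is wrong, pin the ranges yourself in the Monitor section
  # of /etc/X11/XF86Config, using the numbers from your monitor's manual
  # (these values are only an example):
  #   HorizSync   30-96
  #   VertRefresh 50-85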

apt-get is great until the user wants a different configuration for the software

I don't know what you mean by that. I use apt-get, and all the services I use have custom configurations; it would be kind of pointless to run an Apache server if each time you updated it, it blew away the configuration, wouldn't it?

It just installs a default config, and you edit it or replace it with whatever configuration you want. Or is apt-get supposed to magically read your mind and know what configuration it should set up for you? With Debian, the package maintainer can set up the installer to ask you questions about what sort of defaults to use, based on the level of interactivity you set.

As far as Windows goes, I don't know of anything MS does that comes close to what apt-get is capable of. Sure, you can set it up to patch automatically, but it's not like you're going to be able to use it to update functionality or patch anything other than the core of the OS. I mean, can you install a web server with its PHP modules through MS's stuff, or upgrade from Office 2000 to Office 2003 with it and keep your settings?
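To make it concrete, this is the sort of thing I mean (package names are the Debian ones from memory; the point is that the configs under /etc survive the upgrade):

  # install Apache with PHP; existing files under /etc are left alone
  # on later upgrades unless you explicitly tell dpkg to replace them
  apt-get install apache libapache-mod-php4
  # re-run the package's setup questions later if you want different defaults
  dpkg-reconfigure libapache-mod-php4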


Why should it be necessary to burn all the CDs when they can be mounted later without wasting a CD? But that is beside the point; the software should have been designed so that I can skip a CD, and it should have been made so that a bootloader can be installed without upgrading the kernel.

That doesn't make much sense either. Why would it make sense to install software from CDs, using an automated installer based on those CDs, without having all the CDs present? If you just want to use one CD, then do the absolute minimum install you can, then download and install apt-get and use that to install the rest of the software...

If you want to run the install from loopback-mounted CD images on a separate partition, you can do that if you really, really want to. I don't see much point to it, though. Personally I prefer a net-based installer and only need the CD to boot off of.
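If you do want to skip burning the extra discs, loop-mounting the ISOs after a minimal install works fine (the image name and package are just examples):

  mkdir -p /mnt/disc2
  mount -o loop FC2-i386-disc2.iso /mnt/disc2
  # then install straight off the image
  rpm -ivh /mnt/disc2/Fedora/RPMS/somepackage-*.rpm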

And the bootloader complaint doesn't make much sense either... Don't you just tell the bootloader what kernel to use in its configuration files? Or are you pissed that when you upgrade a kernel it does this automatically, or something?
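For what it's worth, pointing the bootloader at a kernel is just a config edit (paths depend on whether /boot is its own partition; these lines are only an example):

  # GRUB: add a stanza to /boot/grub/menu.lst, nothing to re-run afterwards
  #   title  Custom kernel
  #   kernel /boot/vmlinuz-2.6.7-custom root=/dev/hda2 ro
  #   initrd /boot/initrd-2.6.7-custom.img
  # LILO: edit /etc/lilo.conf the same way, but you have to re-run lilo:
  lilo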
 

pitupepito2000

Golden Member
Aug 2, 2002
1,181
0
0
Originally posted by: n0cmonkey
"I'm not using Linux until device support improves."
"I'm not writing a driver for Linux until more people use it."
"I bought Linux compatible devices so F you both."

lol :)
 

pitupepito2000

Golden Member
Aug 2, 2002
1,181
0
0
Originally posted by: hypersonic5
Originally posted by: n0cmonkey
Originally posted by: hypersonic5
Originally posted by: kylef
As far as Linux on the desktop: I think the biggest thing holding back Linux support on the desktop is device support, plain and simple.

That's what's holding me back :p

You're holding you back.

Yes, because obviously it's my fault Linux is a bitch to get to recognize my hardware.

Look, the topic was about Linux becoming mainstream as a desktop OS. There is no way in hell it will be that if it doesn't have the device support that Windows has, not to mention the auto configuration that Windows does with 99% of the hardware out there.

The truth is, if a fairly experienced computer user like me can't get Linux to work right, do you think that a total noob who has never used a computer before will adopt Linux? I might be holding myself back, but if that's the case, regular computer users don't have a chance in hell of figuring out how to use Linux, and Linux will never replace Windows for them.

Of course they could, if somebody were to sell them a computer with Linux preloaded, which would probably come with all the software that user will ever need. Most users buy computers from vendors that give them the OS installed, and then the user has to go about trying to install other software such as IM, a word processor, etc. Now with Linux preinstalled, a normal user won't have to install any software; most distros, such as Fedora, come with all the software you will ever need.
 

quentinterintino

Senior member
Jul 14, 2002
375
0
0
SuSE 9.1 installed without a glitch - USB 2.0 works fine.

XP still cannot find working drivers for the USB 2.0 controller on my ASUS board...

just sayin....
 

MrChad

Lifer
Aug 22, 2001
13,507
3
81
Just wanted to chime in and thank kylef, n0c, and Nothin for some excellent, intelligent back-and-forth discussion on this topic. I don't have anything to add, but I always learn a lot from your posts. Cheers :beer::D
 

MournSanity

Diamond Member
Feb 24, 2002
3,126
0
0
Originally posted by: MrChad
Just wanted to chime in and thank kylef, n0c, and Nothin for some excellent, intelligent back-and-forth discussion on this topic. I don't have anything to add, but I always learn a lot from your posts. Cheers :beer::D

I agree. Some of the best posters on these boards :thumbsup: