Linux going mainstream?


sharq

Senior member
Mar 11, 2003
507
0
0
Plus, the argument that there are no spyware or viruses for Linux is only temporary. Once it goes "mainstream" (as everyone seems to want) there will be just as many problems with it.

This I disagree with. Even if stupid things like Gator do get ported and find a way to trick Mozilla into installing them behind the user's back, they'll be a lot easier to track down and remove.

I believe the reason a lot of these get through is also stupidity on the part of the user. I dual boot, and I have not had a single piece of spyware in all my years using Windows.

Linux is a great OS and all, but there are just as many flaws in it as in Windows

Right, when was the last remote root exploit for Linux? How about for Windows? MS had to completely redo their RPC system because it was so poorly written the first time.

Not every exploit is brand new. There are a lot of Linux machines that aren't configured correctly or don't have the proper patches and are susceptible to old exploits.
As I said, Linux is great, much better than the competition, but the problem boils down to the user. If "grandma" doesn't apply the necessary patches because she thinks "who would want to bother with my computer?" (as so many Windows users think), then we will see more stories about Linux.


I can't remember the last patch that actually affected something I use on any of my Linux machines. Part of that is because I run Debian, so pretty much all the updates are seamless and "just work", but I'm on the debian-security list and updates for things like pavuk, rlpr, www-sql, etc. just don't matter for desktop users.

Well I don't know enough about Debian to comment. But it's safe to say I wasn't talking about someone who is on top of their security needs. :)
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
I believe the reason a lot of these get through is also stupidity on the part of the user. I dual boot, and I have not had a single piece of spyware in all my years using Windows.

Even with things like http://isc.incidents.org/diary.php?date=2004-06-29 coupled with http://zdnet.com.com/2100-1105_2-5247187.html?tag=zdfd.newsfeed ?

Not every exploit is brand new. There are a lot of Linux machines that aren't configured correctly or don't have the proper patches and are susceptible to old exploits.

Because all the Linux worms cause big problems like CodeRed and Slammer did. I still get CodeRed variant attempts in my web server log and port 1434 connection attempts on my firewall from infected machines, but I have yet to see anything that could be attributed to any single Linux or unix daemon (say bind, sendmail, etc.) exploit.

If "grandma" doesn't apply the necessary patches because she thinks "who would want to bother with my computer?" (as so many Windows users think), then we will see more stories about Linux.

I doubt it. A lot of people are trying Linux these days and there's no hint of trouble from them in any significant numbers. I'm not saying things are perfect and I know people will have problems eventually but I doubt the problems will ever be as big as the ones caused by Windows even if the install base is the same or greater.

Well I don't know enough about Debian to comment. But it's safe to say I wasn't talking about someone who is on top of their security needs.

It's not even me being on top of my security needs; I usually run the updater just to see what's new in the archives, since new packages are being added all the time. Since the end of May I've seen 17 debian-security advisories, and only 4 of them could even remotely affect me. That's only because I run apache (though in this case mod_proxy was the real affected package, which I don't use), webmin, and occasionally cvs (but those were pserver problems, not client problems) to get sources for the rare things that aren't packaged in Debian sid.
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
The big deal with the adoption of Linux on the desktop is the competition.

It's a very bad thing that Windows has 90-95% dominance of the desktop market. Lots of companies have been put out of business, and lots of alternative OSes that would have had a chance otherwise got squished.

Look up BeOS for a good example.


Also, Linux plays well with others, but MS doesn't. 90% of the software you use on Linux is easily ported to any other OS. Linux sticks to standards, so programs created to work on Linux have a high degree of portability in general.

Competition is good: it drives innovation, and ultimately all computer users would benefit, even those using Windows.

Also, freedom is very important. Think about this:

The purpose of an operating system isn't really to "run" the computer. Its job is to create an environment where it's easy for programmers to create programs. Your browser, for instance, doesn't technically NEED an operating system. It could be on a boot CD; you boot up from that and the browser takes care of everything from display drivers, to mouse controls, to rendering the HTML code on the screen.

Of course this is absurd, but you have to realise that the true purpose of the OS is to lend functionality to the programs that run on it. So basically each and every program you run has OS code running through it. In your RAM, little bits and pieces of your OS are being added and removed from memory, tied directly into the little bits and pieces that make up the programs you're using.

So even though we make distinctions in our minds and in legal licenses, there are no such distinctions to the computer. To your computer it is simply a single very long line of 1s and 0s that it systematically reads one after another. It doesn't care whether the zeros came from some .DLL or from your program.

So technically, your OS + your programs = the actual program you use.

So having the OS completely closed is counterproductive to good software and good software design. How are you supposed to know how a program works, and how to write a good program, if the major part of it is off-limits to you?

And MS has used this clout to take out competitors and swallow other software companies. If you don't play ball, then your programs are not going to work. MS dictates the licensing and decides how much of the code they are going to expose to you.

Since it's obvious that legally nobody is going to do anything about their monopoly (not that I want them to), Linux is the next best hope for turning MS into "just another software company" instead of "THE software company", which is currently the direction things are heading.

At least Bell was a "benevolent" monopoly. They were under tight control and their bad behavior was kept to a mild roar. MS, on the other hand, is hyper-aggressive/competitive and doesn't worry too much about throwing its weight around.

If I saw that MS Windows was 60-70% of the software market, then I would be happy. Then I would feel confident that people had a real choice.
 

groovin

Senior member
Jul 24, 2001
857
0
0
i guess grandma was a bad example... computer users in general would probably have been better. as mentioned, installing new stuff in linux is where the problem would be. the average comp user has probably installed software before (AIM, kazaa, etc).

but i think that apt-get install <app name> or emerge <app name> is much easier than sitting through 20 different windows while clicking 'next'. you just have to wait 5 hours for OOo to compile =)
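For anyone who hasn't seen them, the one-command installs being compared here look roughly like this. The package names are illustrative (OpenOffice's exact package name varies by distro and era; openoffice-bin is the prebuilt Gentoo package mentioned later in the thread):

```shell
# Debian (apt): downloads a prebuilt package plus all its dependencies
apt-get install openoffice.org

# Gentoo (portage): compiles the package and missing dependencies from source...
emerge openoffice
# ...or pulls the precompiled binary package instead, skipping the 5-hour build
emerge openoffice-bin
```

Either way the package manager resolves dependencies for you; the practical difference is whether you wait on a download or a compile.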
 

silverpig

Lifer
Jul 29, 2001
27,703
12
81
Originally posted by: Nothinman
gentoo's portage rocks </bias>

Yea, because waiting 5 hours for OOo to compile is fun.

Emerge openoffice-bin. You CAN get precompiled binaries for most of the larger apps, you know. And gentoo's portage servers are screaming fast: I can hit 600 kB/s off of them, 300 on a bad day.

One command and I've got OOo and all its dependencies rocketing in at 600 kB/s, installed and configured, with no compiling to do at all (unless you count the dependencies).
 

crazycarl

Senior member
Jun 8, 2004
548
0
0
I personally would love to go to linux, but the driver installations looked like a nightmare, so I'm not gonna mess with it until ATI gets its stuff together to let me easily install my video drivers, and Nvidia so I can easily install mobo drivers...
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Emerge openoffice-bin. You CAN get precompiled binaries for most of the larger apps, you know. And gentoo's portage servers are screaming fast: I can hit 600 kB/s off of them, 300 on a bad day.

One command and I've got OOo and all its dependencies rocketing in at 600 kB/s, installed and configured, with no compiling to do at all (unless you count the dependencies).

I know, I just don't care for portage. I used to be able to hit 600K/s from Debian's mirrors until Comcast capped the downstream at ~300K/s, and with Debian I avoid all the compiling of dependencies and get a little QA on my packages.

I personally would love to go to linux, but the driver installations looked like a nightmare, so I'm not gonna mess with it until ATI gets its stuff together to let me easily install my video drivers, and Nvidia so I can easily install mobo drivers...

I don't know about nVidia's motherboard, NIC, etc. drivers, but the video drivers are simple to install. ATI's stuff looked like crap; I won't be buying anything of theirs until they hire some people who have a clue about writing drivers.
 

silverpig

Lifer
Jul 29, 2001
27,703
12
81
Originally posted by: Nothinman
Emerge openoffice-bin. You CAN get precompiled binaries for most of the larger apps, you know. And gentoo's portage servers are screaming fast: I can hit 600 kB/s off of them, 300 on a bad day.

One command and I've got OOo and all its dependencies rocketing in at 600 kB/s, installed and configured, with no compiling to do at all (unless you count the dependencies).

I know, I just don't care for portage. I used to be able to hit 600K/s from Debian's mirrors until Comcast capped the downstream at ~300K/s, and with Debian I avoid all the compiling of dependencies and get a little QA on my packages.

I personally would love to go to linux, but the driver installations looked like a nightmare, so I'm not gonna mess with it until ATI gets its stuff together to let me easily install my video drivers, and Nvidia so I can easily install mobo drivers...

I don't know about nVidia's motherboard, NIC, etc. drivers, but the video drivers are simple to install. ATI's stuff looked like crap; I won't be buying anything of theirs until they hire some people who have a clue about writing drivers.

I don't really see what the big difference between portage and apt-get is. I use portage and love it :)

I've got ATI's drivers working under gentoo (thanks, portage: emerge ati-drivers && fglrxconfig && opengl-update ati and a <ctrl><alt><backspace> had them going), but yeah, they are crap. UT2004 is playable, but not at the same resolutions as in windows. Good news though: ATI has hired a whole ton of new linux driver guys and is investing heavily in drivers across the board. I heard about this a few months ago, and they said we should see the results 6 months from then, so maybe fall?

Oh, and I haven't been able to get nVidia's mobo drivers to work worth a damn (because there aren't really any). The nForce onboard NIC sits disabled. Thank god the other onboard one is a 3com 3c59x chip, as there's a driver in the kernel for it. Soundstorm is crippled too: I can get sound, but it's been a battle. No hardware mixing either, so it's one sound source and that's it. Nvidia makes excellent linux video drivers, but its support for everything else is absolute crap.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
I don't really see what the big difference between portage and apt-get is

It's not really the difference between portage and apt but the difference between Debian and Gentoo.
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
Originally posted by: crazycarl
I personally would love to go to linux, but the driver installations looked like a nightmare, so I'm not gonna mess with it until ATI gets its stuff together to let me easily install my video drivers, and Nvidia so I can easily install mobo drivers...


Just buy hardware smartly. Most decent stuff is well supported. The nvidia drivers are irritating, but not horrible to install. They come with good directions; you just have to read through the text file they supply a couple of times, and it's not difficult.

The worst part is editing /etc/X11/xorg.conf or XF86Config and changing Driver "nv" to Driver "nvidia", in most cases. Nvidia has support forums where you can search, ask, and get answers to your problems very quickly.
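The edit described above is a one-word change in the Device section of the X config. A sketch (the Identifier value is a placeholder; yours will differ):

```
# /etc/X11/xorg.conf (or XF86Config) -- Device section
Section "Device"
    Identifier "Video Card"    # whatever identifier your config already uses
    # Driver "nv"              # the open-source 2D driver X sets up by default
    Driver "nvidia"            # nvidia's proprietary accelerated driver
EndSection
```

After saving, restarting X (or <ctrl><alt><backspace>) picks up the new driver.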

Via- or Intel-based motherboards are usually fine, as are Sound Blaster or Audigy sound cards. Most on-board stuff works fine. Nvidia motherboards have a chicken-and-egg problem: you need the NIC working to get on the internet and download the drivers, but you need the drivers to get the NIC working in the first place. That sort of thing.

The other major sticking point is wireless drivers.

Good buying decisions will make any linux install a breeze, literally. It's very easy if you have properly supported hardware: you just slap the cdrom in there, press enter a dozen or so times, and you're in like Flint.

Actually, then it's much easier than Windows, because the installers are usually much more pleasant to work with and you never have to deal with the activation/key-code BS.

The alsa sound card matrix may still list some issues with Audigy2s, but I don't own one to be sure.

The prism54 wireless drivers seem to me the best wireless support for Linux: 802.11g cards, relatively inexpensive, and supported by default with the newest kernels. The only sticking point is the firmware; you need something called "hotplug" installed to load the firmware onto the cards. I like them.

information on wireless cards in general

suggested printers for linux

Otherwise it's not that difficult. A little bit of research before buying hardware can make installing linux and dealing with drivers very easy. Once everything is installed and working, the likelihood of having driver issues (except for nvidia's proprietary stuff, which is about the same as in windows) is very small.
 

crazycarl

Senior member
Jun 8, 2004
548
0
0
yea, had i really cared about it at the time i bought the system i probably would have done a little more research, but hey, i was effectively a n00b as i hadn't bought a pc in 5 years... oh well. i hope things do get better though - that story about ati drivers looked promising
 

Spencer278

Diamond Member
Oct 11, 2002
3,637
0
0
Originally posted by: crazycarl
yea, had i really cared about it at the time i bought the system i probably would have done a little more research, but hey, i was effectively a n00b as i hadn't bought a pc in 5 years... oh well. i hope things do get better though - that story about ati drivers looked promising

I will be surprised if ATI's drivers ever install correctly in Windows, never mind getting them to work in Linux.
 

crazycarl

Senior member
Jun 8, 2004
548
0
0
Originally posted by: Spencer278
Originally posted by: crazycarl
yea, had i really cared about it at the time i bought the system i probably would have done a little more research, but hey, i was effectively a n00b as i hadn't bought a pc in 5 years... oh well. i hope things do get better though - that story about ati drivers looked promising

I will be surprised if ATI's drivers ever install correctly in Windows, never mind getting them to work in Linux.


no, they work fine in windows - never had a problem at least :D
 

civad

Golden Member
May 30, 2001
1,397
0
0
Apt gui frontend .. not necessarily on steroids..

XandrosNetworks :)

I love it when I get unmet dependencies when I use debian testing in my sources.list; let alone unstable.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
I love it when I get unmet dependencies when I use debian testing in my sources.list; let alone unstable.

I've only seen that happen when a package transition was being made, and even then it only lasts a few days, or maybe weeks at most, as all the packages get recompiled on the buildd boxes.
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
Testing isn't really supposed to be used for regular computing anyway.


The point of testing is to provide a proving ground for developers to use while the next release of the OS is put together. Traditionally, testing often isn't even a complete and totally functional OS, on purpose. Right now it is very complete because the next version of Debian stable is almost ready, but after the changeover happens I am afraid that many testing users are going to be in a world of hurt.

Testing is fine and dandy, but it's a mistake to think that testing is less unstable than unstable.

Another bad thing about using testing is that it is the last Debian branch to get security updates, because generally they have to filter through unstable first. Stable gets backports and security updates pushed through first; that's the highest priority. Unstable usually gets them very quickly too, but for testing that's not the priority.

The good thing about testing, of course, is that you get packages that were already proven in unstable. And it doesn't change as often, so the updates aren't usually as gigantic as they are in unstable.


If you want to run testing, generally you have to add unstable sources too. Then use the /etc/apt/preferences file to basically say "install from testing by default, and if a package isn't present there, use packages from unstable".

That way you can run testing, and if you reach a point where you would have unfulfilled dependencies, or a recent update to testing broke a bunch of dependencies, then your need will be filled by what is available in unstable.
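A sketch of that pinning setup, assuming both testing and unstable are listed in sources.list (the priority numbers are illustrative; anything above the default of 500 is preferred, and 100-499 means "use only if nothing better satisfies the dependency"):

```
# /etc/apt/preferences -- prefer testing, fall back to unstable
Package: *
Pin: release a=testing
Pin-Priority: 900

Package: *
Pin: release a=unstable
Pin-Priority: 300
```

With this in place, apt-get pulls from testing by default, and a package or dependency that only exists in unstable can still be installed from there.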
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
If testing isn't meant to be used, why is the order typically: stable, testing, unstable?
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Originally posted by: n0cmonkey
If testing isn't meant to be used, why is the order typically: stable, testing, unstable?

Because drag got it backwards ;). (I thought he was right, but some googling implies otherwise).
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: CTho9305
Originally posted by: n0cmonkey
If testing isn't meant to be used, why is the order typically: stable, testing, unstable?

Because drag got it backwards ;). (I thought he was right, but some googling implies otherwise).

:confused:

"if you use debian heres a warning (atleast if you enter the debian channel): dont come in as root, you never hear the end of it, which was very annoying."

HAHAHAHA :p
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
I am not completely sure, but I am pretty sure.

One thing I do know for certain, though, is that testing is usually the last version to get security bugs fixed. Sometimes security fixes can lag weeks behind stable or unstable.

I could definitely be wrong about using testing vs unstable, but I think it is a very good idea to include sources from testing and unstable at the same time and then set the preferences to give testing packages priority.

Otherwise there is no guarantee that all the packages you'll need will be present.


Actually now that I think about it, it may be best to stick to testing and do the preferences thing with unstable. Not sure.