Linux non-n00bs

jonmullen

Platinum Member
Jun 17, 2002
2,517
0
0
Well, we're always talking about Linux distros for novices, and the most frequent suggestions are Mandrake and Red Hat. Personally I can't stand either of them, just b/c I hate tracking down RPM dependencies. I stick with Debian and Gentoo myself, but I was wondering if there are any Linux veterans who'd like to share or defend RPMs. For the sake of argument, let's exclude the little hack that brings apt-get support to RPMs, cuz that would kinda defeat the purpose.
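(For anyone who hasn't fought this battle, here's the difference in a nutshell; package and library names below are hypothetical. The apt one-liner resolves dependencies itself, while bare rpm just reports what's missing and leaves you to go find it.)

```shell
# Debian: apt resolves and fetches every dependency automatically.
apt-get update
apt-get install mutt

# Bare RPM with no frontend: each missing library is an error message
# and another trip to rpmfind, repeated until the chain is satisfied.
rpm -ivh mutt-1.4-1.i386.rpm
# error: failed dependencies:
#        libsasl.so.7 is needed by mutt-1.4-1
```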
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
Yes, Debian is the way God meant Linux to be.

To each their own, though. For example, if you need to deploy a corporate network, Red Hat would be my first choice, because it's tested, supported, and designed for that purpose.

Debian could do that too, but it would require more work. It'd probably be worth it, though.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Instead of hunting down RPMs, install apt4rpm or yum, or sign up to be able to use up2date.
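(A quick sketch of what each of those looks like in practice; the package name is hypothetical, and up2date assumes a system registered with Red Hat Network.)

```shell
# yum: dependency-resolving frontend over RPM repositories.
yum install postfix

# apt4rpm: the familiar apt workflow on an RPM-based distro.
apt-get update
apt-get install postfix

# up2date: pulls packages and errata from Red Hat Network.
up2date postfix
```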
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
Well, RPM in itself isn't too bad; it's the lack of a good (or rather, as good) frontend that makes it inferior to debs or ebuilds.

IMO anyways.
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
I think the simple flaw in your assumption is the idea that more advanced Linux users really use a lot of package management software in the first place. Debian users may be lazy and rely on Apt, but personally, I like to control where everything is put, so I prefer to compile from source and rely on binary distributions only as a last-resort crutch. I know a lot of people who are of a similar mind. If absolutely necessary, I would rather rely on something like Slack's keep-it-simple tarball approach. I personally don't know why you would bother with RPM or Apt. Maybe I should make a thread asking people to tell me why to use package management at all. :D

At any rate, the times I have used RPM, dependencies were never an issue, mainly because I could read and only downloaded RPMs from people who obviously targeted specific versions of RedHat.
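(For the record, the workflow being described; the package name foo is made up.)

```shell
# Classic source build, controlling exactly where everything goes:
tar xzf foo-1.0.tar.gz
cd foo-1.0
./configure --prefix=/opt/foo   # everything lands under /opt/foo
make && make install

# Slackware-style keep-it-simple: a package is just a tarball that
# unpacks relative to /, plus a recorded file list for removepkg.
installpkg foo-1.0-i386-1.tgz
```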
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Debian users may be lazy and rely on Apt,

Laziness is only part of it; there's no reason for me to have to deal with compiling everything and having tons of space wasted by -dev packages that I'll only need once. Good package management is one of the biggest benefits Linux has over most Unixes; it would be stupid not to exploit it.

I personally don't know why you would bother with RPM or Apt.

Because it's simple and lets someone who knows the packages a lot better than I do deal with the build problems. I probably wouldn't have gotten through setting up SASL and Cyrus IMAPd if it weren't for the Debian packages, just because it's too much of a PITA.
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: chsh1ca
I think the simple flaw in your assumption is the idea that more advanced Linux users really use a lot of package management software in the first place. Debian users may be lazy and rely on Apt, but personally, I like to control where everything is put, so I prefer to compile from source and rely on binary distributions only as a last-resort crutch. I know a lot of people who are of a similar mind. If absolutely necessary, I would rather rely on something like Slack's keep-it-simple tarball approach. I personally don't know why you would bother with RPM or Apt. Maybe I should make a thread asking people to tell me why to use package management at all. :D

At any rate, the times I have used RPM, dependencies were never an issue, mainly because I could read and only downloaded RPMs from people who obviously targeted specific versions of RedHat.

Why use package management? Because sometimes you don't have a couple of days to get everything installed and working.

Although, I am not the happiest Debian user at the moment. It took me a while to get something usable with the system. But much of that was my own ignorance and impatience ;)
 

rjain

Golden Member
May 1, 2003
1,475
0
0
I also like building my own packages to help manage the stuff I want to install. Of course, that rarely happens, since most of it is already in Debian or is being added to Debian by me or one of my friends. ;)
Not everyone has the time to continuously check for fixes for every package they install and then integrate each package with the rest of the system (window manager and display manager menus, documentation listings, etc.). If someone else can do the job just fine, there's no need for me to go and do it all over again.
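(One way to get the best of both, sketched here with checkinstall; the package name is made up, and building a proper Debian package with dpkg-buildpackage is more involved than this.)

```shell
# checkinstall wraps "make install" and emits a .deb, so a local
# source build still shows up in the package database:
./configure --prefix=/usr
make
checkinstall --pkgname=foo --pkgversion=1.0

# The result can be removed cleanly later:
dpkg -r foo
```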
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
Originally posted by: Nothinman
Laziness is only part of it; there's no reason for me to have to deal with compiling everything and having tons of space wasted by -dev packages that I'll only need once. Good package management is one of the biggest benefits Linux has over most Unixes; it would be stupid not to exploit it.
You're talking about something different. Apt is for package management and distribution; you're discussing compiling from source. A binary tarball is a definite option with pretty much all software.

Because it's simple and lets someone who knows the packages a lot better than I deal with the build problems. I probably wouldn't have gotten through setting SASL and Cyrus IMAPd if it weren't for the Debian packages just because it's too much of a PITA.
Again, separate issues. The idea behind apt and RPM is to keep dependencies sorted out and track what software is installed where. That's amazingly handy for a desktop, where dependencies are likely to be an issue. When you start to use apt as you do -- as nothing more than a replacement for tar -- it becomes utterly redundant IMO.

Originally posted by: n0cmonkey
Why use package management? Because sometimes you don't have a couple of days to get everything installed and working.
Still talking about separate issues.
 

cleverhandle

Diamond Member
Dec 17, 2001
3,566
3
81
Originally posted by: Sunner
Well, RPM in itself isn't too bad, it's the lack of a good(or rather, as good) frontend that makes it inferior to DEB or EBuilds.
Agreed. For that reason, I don't see apt-rpm as "cheating" or "defeating the purpose." RPM has, for quite some time now, been targeted as a low-level package manager, much like dpkg. That's why it has so darn many switches and options. But Red Hat never really provided a compelling high-level frontend for it. There was Gno-RPM for a while, which I thought was pretty decent, but they ditched that and went with redhat-config-packages in RH8, which does handle dependencies but dumbs everything down to the point of uselessness for a power user. Apt-rpm with Synaptic is the frontend that should have been pushed by Red Hat, but I suspect that would threaten their up2date revenues too much to be made the primary package manager.
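(The low-level/high-level split in one sketch: rpm itself answers questions about the installed system, dpkg-style, while apt-rpm handles fetching and dependency resolution on top of it.)

```shell
# rpm as a low-level tool, analogous to dpkg:
rpm -qa                  # list every installed package
rpm -qf /usr/bin/vim     # which package owns this file?
rpm -qR vim-common       # what does this package require?

# apt-rpm (with Synaptic as a GUI) layers resolution on top,
# the way apt sits on top of dpkg:
apt-get install vim
```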

Also, as Drag alluded to, RPMs are a better choice in a corporate environment, where you're unlikely to have a lot of complex dependencies. RPMs are much quicker to build, and especially to modify, than a deb package. And in situations where you delegate authority for updates or installation to a less technical manager type, the non-interactive nature of RPMs is a good thing. Just "rpm -Uvh foo" and the package sets itself up with no fuss. And if you don't like the default configuration, it's simple to tweak the package to work the way you need it to.
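(What that tweak-and-rebuild cycle looks like, roughly; the package name is hypothetical, and the SPECS path is the Red Hat default of the era.)

```shell
# Non-interactive upgrade-or-install; no questions asked:
rpm -Uvh foo-1.2-1.i386.rpm

# Don't like the defaults? Install the source RPM, edit the spec
# file, and rebuild your own package:
rpm -ivh foo-1.2-1.src.rpm
vi /usr/src/redhat/SPECS/foo.spec
rpmbuild -ba /usr/src/redhat/SPECS/foo.spec
```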

 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
You're talking about something different. Apt is for package management and distribution; you're discussing compiling from source. A binary tarball is a definite option with pretty much all software.

But most projects don't distribute binaries outside of a package format unless it's a closed-source app.

The idea behind apt and RPM is to keep dependencies sorted out and track what software is installed where. That's amazingly handy for a desktop, where dependencies are likely to be an issue.

And you're saying dependencies aren't an issue on a server? Keeping your daemons in sync with the version of libssl isn't important? Or, to use Cyrus IMAPd again, keeping it in sync with the version of SASL it was compiled against isn't important?
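(The kind of thing that can bite you: a daemon is linked against a specific library version, and upgrading the library out from under it breaks the daemon. The binary path below is illustrative; the point is that a package manager records and enforces the same relationship for you.)

```shell
# See exactly which shared libraries a daemon was linked against:
ldd /usr/sbin/imapd | grep -E 'libssl|libsasl'

# The package database knows the same relationship and won't let
# an upgrade silently violate it:
apt-cache depends cyrus21-imapd
```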

When you start to use apt as you do -- as nothing more than a replacement for tar -- it becomes utterly redundant IMO.

apt and debconf are far more than replacements for tar; the comparison is laughable.

Still talking about separate issues.

Maybe I'm confused, but I don't see how they're separate issues.
 

chsh1ca

Golden Member
Feb 17, 2003
1,179
0
0
Originally posted by: Nothinman
But most projects don't distribute binaries outside of a package format unless it's a closed-source app.
They don't? Almost all software I've seen for Linux is available as both binary and source tarballs (generally gzipped, sometimes bzipped).

And you're saying dependencies aren't an issue on a server? Keeping your daemons in sync with the version of libssl isn't important? Or using Cyrus IMAPd again, keeping it in sync with the version of SASL it was compiled against isn't important?
No, I'm saying that on a server they aren't *AS MUCH* of an issue. In fact, your points are more of an argument for compiling from source than for dependency managers. :) Not only that, but on a server you should have a much clearer definition of where software is installed anyway. By and large, most binary distributions via apt or rpm can't be chrooted, and there are other similar considerations. This is all ignoring the fact that anyone setting up a server really should be developing a common setup for their servers anyway. If Debian's defaults work for you, then sure, they're probably best. That said, I don't think ANY of the major distributions work well as a securable server, which is why I think OpenBSD will continue to have a better security track record.

apt and debconf are far more than replacements for tar; the comparison is laughable.
I am aware of that; however, from what you describe doing with apt, you are basically just merging the download and untar procedures into one step. Apt in and of itself is powerful if used properly, but you are relying on another person to develop the package for you. Sure, it may save you a lot of hassle in one sense, but in another it creates some.

Maybe I'm confused, but I don't see how they're separate issues.
Binary vs. source distribution, and package management vs. none. It's a fine line.
Btw, I'm not saying either side is right (using apt vs. not using it), just that it's something I don't see a lot of 'advanced Linux users' doing. If the majority of AnandTech's advanced Linux users respond and say they use it, then empirically I'm wrong to state it in a general sense; but keep in mind, I'm only speaking from my own experience. :)

 

rjain

Golden Member
May 1, 2003
1,475
0
0
If you define "advanced" as "spends all his time recompiling and patching", then maybe no advanced linux users use package management.

Edit: Also, if the Debian maintainer doesn't have enough free time for your liking, you're free to go ahead and build the latest version from source and use that. And if you have enough free time yourself, you could offer to take over maintenance of the package.
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
If you define "advanced" as "spends all his time recompiling and patching", then maybe no advanced linux users use package management.
I agree.
Granted, I use Gentoo, so obviously I don't mind compiling in itself; rather, it's the automated process I'm after.
I don't wanna sit around all day hunting dependencies and such, but I don't mind if my workstation sits and compiles stuff in the background.

It's kinda like all the people who feel the need to compile custom kernels on OpenBSD rather than sticking with GENERIC: in 95% of cases there's little to no gain, but a lot of potential trouble.
 

Spyro

Diamond Member
Dec 4, 2001
3,366
0
0
Originally posted by: chsh1ca
I think the simple flaw in your assumption is the idea that more advanced Linux users really use a lot of package management software in the first place. Debian users may be lazy and rely on Apt, but personally, I like to control where everything is put, so I prefer to compile from source and rely on binary distributions only as a last-resort crutch. I know a lot of people who are of a similar mind. If absolutely necessary, I would rather rely on something like Slack's keep-it-simple tarball approach. I personally don't know why you would bother with RPM or Apt. Maybe I should make a thread asking people to tell me why to use package management at all. :D

At any rate, the times I have used RPM, dependencies were never an issue, mainly because I could read and only downloaded RPMs from people who obviously targeted specific versions of RedHat.

Without automated package management, there are a lot of people who wouldn't even give Linux a second glance. It's like if the only way to install Windows programs were by manually placing every file the program uses. Of course it isn't that bad in Linux, but for the typical home user, who uses Linux to surf the web, type up office docs, and play the occasional game, having to recompile for each new version of a program just wouldn't be worth it.

I'm not really the typical home user, but I don't have time to hunt down dependencies and then manually compile everything. Why do it yourself when someone else can do it for you? I have more important things to do than tear my hair out trying to figure out which version of lib* some particular program uses.
 

cleverhandle

Diamond Member
Dec 17, 2001
3,566
3
81
As others have stated, compiling from source seems to me like a huge waste of time in the vast majority of cases. Once in a while I might compile a package differently, but even then I'd rather tweak and rebuild the package itself, just for the sake of clean uninstalls. Server software tends to have fewer complexities - no desktop files, menu entries, doc files, and that kind of stuff. But then it also has fewer options worth recompiling for.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Not only that, but on a server you should have a much clearer definition of where software is installed anyways.
Not really. I care where the mail spool or web root is placed, because I have to make sure the permissions are right and that it's backed up regularly, but why should I care whether the binary is in /usr/local/apache or /usr/bin/apache?
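(And if you ever do need to know where a package put things, the database answers in one command; package name as shipped by the distro.)

```shell
# Debian: list every file a package installed.
dpkg -L apache

# Red Hat equivalent:
rpm -ql apache

# Or go the other way: which package owns this file?
dpkg -S /usr/sbin/apache
```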

I am aware of that; however, from what you describe doing with apt, you are basically just merging the download and untar procedures into one step. Apt in and of itself is powerful if used properly, but you are relying on another person to develop the package for you. Sure, it may save you a lot of hassle in one sense, but in another it creates some.

apt handles a lot more than just deciding which packages to download. And really, you're relying on another person to develop the software itself, aren't you? Quick, better rewrite everything yourself; you never know what bad things the author of one of those daemons you're running has set up.