Setting up a Linux web server

anthony88guy

Senior member
Feb 3, 2005
220
0
0
I want to turn my old 750 MHz Duron with 512 MB of RAM and a 26 GB HDD into a web server. I'm going to use the Ubuntu Linux distro, with, I think, the GNOME GUI. Since I currently have Windows 98 SE on that computer, I want to do a clean install of Linux. My only question is: will I have driver issues?
 

Hyperblaze

Lifer
May 31, 2001
10,027
1
81
If you install the latest version of Ubuntu, you have the best chance of avoiding driver issues.

So far you haven't listed any hardware that might cause problems. Why not install Ubuntu and find out?
 

anthony88guy

Senior member
Feb 3, 2005
220
0
0
I'm waiting for the CDs they're mailing me, but that's what it's going to come down to. I'm very new to Linux, so I don't know.
 

Hyperblaze

Lifer
May 31, 2001
10,027
1
81
Originally posted by: anthony88guy
I'm waiting for the CDs they're mailing me, but that's what it's going to come down to. I'm very new to Linux, so I don't know.

If you have a burner, you can actually download images of the CDs (called ISOs) and burn them yourself.

That would get you the CDs in a matter of hours (depending on your net connection) instead of days.
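If you want to sanity-check the download before wasting a blank disc, grab the MD5SUMS file from the same mirror and run it through md5sum. A sketch (the URLs and file names are just what I'd expect for Hoary, check the actual download page; a dummy file stands in for the ISO here so the verify step can be run anywhere):

```shell
# Real life: fetch the ISO and its checksum list from an Ubuntu mirror, e.g.
#   wget http://releases.ubuntu.com/hoary/ubuntu-5.04-install-i386.iso
#   wget http://releases.ubuntu.com/hoary/MD5SUMS
# (URLs above are illustrative -- check the actual mirror.)
# Simulated here with a dummy file so the commands run anywhere:
echo "pretend ISO contents" > ubuntu-5.04-install-i386.iso
md5sum ubuntu-5.04-install-i386.iso > MD5SUMS

# Verify before burning; reports "ubuntu-5.04-install-i386.iso: OK" if intact
md5sum -c MD5SUMS

# Then burn (needs a real ISO and a burner, shown for reference only):
#   cdrecord dev=/dev/cdrom ubuntu-5.04-install-i386.iso
```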
 

anthony88guy

Senior member
Feb 3, 2005
220
0
0
Originally posted by: Hyperblaze
Originally posted by: anthony88guy
I'm waiting for the CDs they're mailing me, but that's what it's going to come down to. I'm very new to Linux, so I don't know.

If you have a burner, you can actually download images of the CDs (called ISOs) and burn them yourself.

That would get you the CDs in a matter of hours (depending on your net connection) instead of days.

Yeah, I saw you can do that. But I'm going to order a splitter so I can hook both computers up to one monitor, keyboard, and mouse. I also have to get a network cable.
 

Hacp

Lifer
Jun 8, 2005
13,923
2
81
Originally posted by: anthony88guy
I'm waiting for the CDs they're mailing me, but that's what it's going to come down to. I'm very new to Linux, so I don't know.

It took them three months to mail me the CDs.
 

anthony88guy

Senior member
Feb 3, 2005
220
0
0
Originally posted by: Hacp
Originally posted by: anthony88guy
I'm waiting for the CDs they're mailing me, but that's what it's going to come down to. I'm very new to Linux, so I don't know.

Took them 3 months for them to mail me the cd.

I'm going to start downloading then... Thanks a lot.
 

wkinney

Senior member
Dec 10, 2004
268
0
0
If it's going to be just Apache, why do you need GNOME? Downloading and compiling Apache from the command line is easy and a lot more intuitive than doing it from a GUI. It would also negate the need for a KVM, since you could just SSH in.
 

Hyperblaze

Lifer
May 31, 2001
10,027
1
81
Originally posted by: wkinney
If it's going to be just Apache, why do you need GNOME? Downloading and compiling Apache from the command line is easy and a lot more intuitive than doing it from a GUI. It would also negate the need for a KVM, since you could just SSH in.

To us, maybe, the command line is more intuitive than a GUI. It sure is more flexible.

But he is very new to Linux; hence, he wouldn't have a clue what to do with the CLI.

Small steps.

 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Downloading and compiling Apache from the command line is easy and a lot more intuitive than doing it from a GUI. It would also negate the need for a KVM, since you could just SSH in.

Nothing about compiling is intuitive: you need a lot more packages installed (compilers, dev headers, etc.), and deciphering compiler errors can be difficult even for someone who knows what they're doing. It's infinitely simpler to type 'apt-get install apache2' or 'aptitude install apache2', or even to run aptitude and browse the package lists.
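For the record, the whole package-manager route is about three commands, run as root on Debian/Ubuntu (a transcript to read, not something to paste blindly):

```shell
apt-get update              # refresh the package lists
apt-get install apache2     # pulls in Apache and all its dependencies
# or, interactively:
aptitude install apache2
# later on, security fixes come through the same channel:
apt-get upgrade
```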
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: Nothinman
Downloading and compiling Apache from the command line is easy and a lot more intuitive than doing it from a GUI. It would also negate the need for a KVM, since you could just SSH in.

Nothing about compiling is intuitive: you need a lot more packages installed (compilers, dev headers, etc.), and deciphering compiler errors can be difficult even for someone who knows what they're doing. It's infinitely simpler to type 'apt-get install apache2' or 'aptitude install apache2', or even to run aptitude and browse the package lists.

There's no reason not to have development packages installed on the system anyway. But using the native package management is always better.
 

anthony88guy

Senior member
Feb 3, 2005
220
0
0
Well, I downloaded and burned Ubuntu 5.04 "The Hoary Hedgehog" in under 15 minutes :) Later on I'll start installing it and see how it goes. Thanks, guys.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
There's no reason not to have development packages installed on the system anyway.

Yes there is: it's a waste of disk space, and it makes compiling exploits easier if someone breaks in.
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: Nothinman
There's no reason not to have development packages installed on the system anyway.

Yes there is: it's a waste of disk space, and it makes compiling exploits easier if someone breaks in.

Disk is cheap, and it only takes a couple of minutes longer to compile if you have to download gcc binaries or build elsewhere.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Disk is cheap, and it only takes a couple of minutes longer to compile if you have to download gcc binaries or build elsewhere.

But depending on how much access they already have downloading and installing all of those libraries and tools might not be possible or easy. Essentially, you shouldn't give them anything they don't absolutely need.
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: Nothinman
Disk is cheap, and it only takes a couple of minutes longer to compile if you have to download gcc binaries or build elsewhere.

But depending on how much access they already have downloading and installing all of those libraries and tools might not be possible or easy. Essentially, you shouldn't give them anything they don't absolutely need.

I haven't tried it, but I'm guessing it'd be possible to install gcc somewhere that doesn't require root ($HOME). And compiling binaries elsewhere and transferring the files over wouldn't be hard.

You should focus more on keeping them out than restrictions that do nothing in the long run once they've gotten on the system. ;)
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
It's not about where you're focusing, it's about covering all of your bases and layering your security properly. Why give them a compiler if you don't absolutely need to? Yes, make them jump through some extra hoops and you might make them waste enough time to get caught. Or more likely it'll just be some script kiddy that doesn't understand why his rootkit won't install and he'll just move on.
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
Compiling code is silly if you're using a system that already has perfectly good binaries and a support system available. It's not so bad if you have an automated way to do it, but then it's not much different from binary packages anyway. The major difference is that instead of compiling code on one or two machines and providing it to everyone, you're compiling code on thousands of machines, distributing source, and making end users burn much more disk space than before.

For example, a long time ago I had a laptop I couldn't install Gentoo on because X.org plus all the patches took multiple gigabytes of disk space for the patch, compile, and install routine.

Not a huge deal.

When using a distro, it's usually best to use the distro's binaries, unless the distro sucks or is something like Gentoo, which uses packages that automate the patching and compiling. If the distro sucks, you should be using something else.

You get many assurances, such as knowing that the versions of the disparate software you use to build your websites are compatible and kept up to date. The distro should have a security team that specializes in staying on top of security information and vulnerabilities in your software packages.

If you compile from scratch every time a patch or update is needed, you have to stay on the ball: track mailing lists and other sources of information so you can download the security fixes yourself, patch your code, then recompile.

Which can get old.
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: Nothinman
It's not about where you're focusing, it's about covering all of your bases and layering your security properly. Why give them a compiler if you don't absolutely need to? Yes, make them jump through some extra hoops and you might make them waste enough time to get caught. Or more likely it'll just be some script kiddy that doesn't understand why his rootkit won't install and he'll just move on.

You layer security to keep people out. Once they're inside, they can generally do what they want. If you're doing your job properly (or are allowed to do your job properly ;)), most of the script kiddies who don't understand what -bash: cc: command not found means won't get on the system in the first place. Or they won't be given anything worthwhile once there.

Wait, isn't that what you've been saying? Not quite. Keep your web server chrooted. If there's nothing in the chroot like gcc, perl, etc., there won't be a way to break out of it(*). They also won't have the tools to transfer files, unless you've broken your web server.
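Populating a jail like that is mostly copying in the one binary plus the libraries it links against, and nothing else. A rough sketch with /bin/sh standing in for the server binary (illustrative only; a real httpd jail also needs its config files, logs, and the dynamic loader kept at its expected path):

```shell
# Build a bare-bones jail containing only /bin/sh and its shared libraries.
JAIL=./jail
mkdir -p "$JAIL/bin" "$JAIL/lib"
cp /bin/sh "$JAIL/bin/"
# ldd prints lines like "libc.so.6 => /lib/libc.so.6 (0x...)"; copy each
# library path it resolves (field 3 starting with /)
for lib in $(ldd /bin/sh | awk '$3 ~ /^\// {print $3}'); do
    cp "$lib" "$JAIL/lib/"
done
ls "$JAIL/bin" "$JAIL/lib"
# actually entering it needs root:  chroot "$JAIL" /bin/sh
```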

But what if they break into SSH? They shouldn't. Don't use passwords, use privsep, deny root logins, and put it on a non-standard port. If they find a root hole, at best they'll probably land in the privsep chroot (/var/empty).
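In sshd_config terms that comes out to roughly the following (option names from OpenSSH; check your version's man page, since defaults vary by release):

```
# /etc/ssh/sshd_config -- sketch of the hardening above
Port 2222                    # non-standard port (pick your own)
Protocol 2
PermitRootLogin no           # deny direct root logins
PasswordAuthentication no    # key auth only
UsePrivilegeSeparation yes   # privsep; the default in recent OpenSSH
```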

Different approaches for different people. I personally don't like people on my systems without my permission. ;)

Personally, I find the compiler much too useful on my home systems to not include it.

*: Without a _major_ kernel(?) bug.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Once they're inside they can generally do what they want

You run DOS on all of your servers?

If you're doing your job properly (or allowed to do your job properly )

The latter is usually the bigger problem, getting downtime to apply patches isn't easy.

If there's nothing in the chroot like gcc, perl, etc., there won't be a way to break out of it(*)

Not necessarily true; there have been a few proofs of concept that break out of chroots on Linux. I don't know the specifics or whether they still work, but I wouldn't trust a chroot to contain a determined hacker. And chrooting everything adds maintenance overhead; it's not always a good idea.

But what if they break into SSH? They shouldn't. Don't use passwords, use privsep, deny root logins, and put it on a non-standard port.

Using key auth won't help you any if your users decide to make things easier by setting their key passphrases to null. AFAIK that's not something you can restrict at the system level the way you can with password lengths.
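You can at least audit for it after the fact: ssh-keygen with -y -P "" only succeeds when the private key has no passphrase protecting it, so trying exactly that flags unprotected keys. A sketch (testkey is a throwaway name; normally you'd point this at users' ~/.ssh keys):

```shell
rm -f testkey testkey.pub            # start clean; demo key only
# Create a throwaway key WITHOUT a passphrase to demonstrate the check
ssh-keygen -q -t rsa -N "" -f testkey
# -y re-derives the public key from the private key; with -P "" it only
# succeeds when the private key is unprotected
if ssh-keygen -y -P "" -f testkey >/dev/null 2>&1; then
    echo "testkey: NO passphrase"
else
    echo "testkey: passphrase set"
fi
```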

Personally, I find the compiler much too useful on my home systems to not include it.

The only thing I ever find myself compiling very often is Linux kernels, if I wasn't messing with suspend2 and things I probably wouldn't ever need a compiler.
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: Nothinman
Once they're inside they can generally do what they want

You run DOS on all of your servers?

No, it's just a fact for most servers out there. How many people are truly utilizing SELinux?

If you're doing your job properly (or allowed to do your job properly )

The latter is usually the bigger problem, getting downtime to apply patches isn't easy.

With clustering and HA downtime generally isn't necessary. Unless of course management has decided to hamstring you. Then you're screwed no matter what. ;)

If there's nothing in the chroot like gcc, perl, etc., there won't be a way to break out of it(*)

Not necessarily true; there have been a few proofs of concept that break out of chroots on Linux. I don't know the specifics or whether they still work, but I wouldn't trust a chroot to contain a determined hacker. And chrooting everything adds maintenance overhead; it's not always a good idea.

Every method of breaking out of a chroot I've seen requires root privileges and extra tools.

Maintenance should be minimal once it's set up. Hell, that stuff is the default these days, isn't it?

But what if they break into SSH? They shouldn't. Don't use passwords, use privsep, deny root logins, and put it on a non-standard port.

Using key auth won't help you any if your users decide to make things easier by setting their key passphrases to null. AFAIK that's not something you can restrict at the system level the way you can with password lengths.

Good point. I think that's one issue that should be addressed, but I can't think of a way to do it properly without patching the key-generation programs. Maybe that's the best way.

If you set up ssh-agent, though, the user should only have to type the passphrase once, so it isn't a big deal. Plus, things like Kerberos may take most of the pain away.

Personally, I find the compiler much too useful on my home systems to not include it.

The only thing I ever find myself compiling very often is Linux kernels, if I wasn't messing with suspend2 and things I probably wouldn't ever need a compiler.

I'm compiling constantly: either things that aren't in the ports tree, ports that can't be made into a package and distributed (bad licenses), or ports that aren't distributed as packages as often. I also follow snapshots instead of real releases at home, so my package selection is a bit sparser. ;)

I'm also learning the art of creating a port. It's useful and a bit fun. :)
 

drag

Elite Member
Jul 4, 2002
8,708
0
0
I've broken out of a chroot environment. It's pretty easy, but there are ways to mitigate it.

I was working on a Debian install inside a chroot environment from a 'live CD', probably Knoppix or something like that. The sshd server was running inside the chroot, and that was the only access I had to the machine.

I was at work messing around on it (it was at home), but for whatever reason I needed to get out of the chroot to change something, or maybe reboot it, something like that.

The exploit I used was a simple C program I got from an article on how to strengthen chroot environments. A couple of lines were intentionally messed up, but I figured it out and compiled it inside the chroot; I could just as easily have compiled it somewhere else and used scp or lynx or whatnot to fetch it, though.

Setting up a secure chroot environment is pretty difficult sometimes, and there are big limitations on what you can and can't put in there.

Xen seems to be a better solution, except that it's maybe not as efficient. Looking forward to the day OpenBSD can join in. ;)

Also, on a side note, I learned today that ProPolice/SSP was rewritten by a guy from Red Hat, which allowed SSP to be included by default in the upcoming GCC 4.1 release.
 

n0cmonkey

Elite Member
Jun 10, 2001
42,936
1
0
Originally posted by: drag
I've broken out of a chroot environment. It's pretty easy, but there are ways to mitigate it.

I was working on a Debian install inside a chroot environment from a 'live CD', probably Knoppix or something like that. The sshd server was running inside the chroot, and that was the only access I had to the machine.

I was at work messing around on it (it was at home), but for whatever reason I needed to get out of the chroot to change something, or maybe reboot it, something like that.

The exploit I used was a simple C program I got from an article on how to strengthen chroot environments. A couple of lines were intentionally messed up, but I figured it out and compiled it inside the chroot; I could just as easily have compiled it somewhere else and used scp or lynx or whatnot to fetch it, though.

Setting up a secure chroot environment is pretty difficult sometimes, and there are big limitations on what you can and can't put in there.

That's the thing: you had access to a whole host of things. Obviously it isn't the best solution, but it helps alongside a bunch of other measures.

I agree, though, that in a corporate environment it can make more sense not to install a compiler and the accompanying tools. I personally don't think it's a huge deal, since the machine is ultimately vulnerable once someone gets on it.

But on a home machine I don't think it matters as much. It's better to install gcc and friends than to use the same machine for a multitude of functions. ;)

Xen seems to be a better solution, except maybe that it's not so efficient. Looking forward to the day that OpenBSD can join in. ;)

Coincidentally I'm reading a bit about Xen 3.0 right now. 3.0 coupled with the right hardware solutions from AMD and Intel should do the trick. But really, I'm more excited about the imminent G5 support. Now I just need some money... ;)

Also on a side note I learned today that propolice/SSP was rewriten by a guy from Redhat and that allowed SSP to be included by default in the up comming GCC 4.1 release.

Did it require a rewrite? That's a shame. I know that whole English-to-Japanese language barrier was an issue, but I was still glad to read that it's finally been included. Linux might just catch up one of these days. ;)