Originally posted by: drag
For simple backups you can use the rsync command.
Rsync is a nice command that runs over an ssh connection (optionally it can run over a plain network socket, which has less overhead but is a bit insecure in comparison since ssh does encryption).
What it does is go through files with a 'rolling checksum' that detects what has changed and then updates files accordingly, so it only transfers the differences.
Everything your user has write access to is either in your home directory or in the /tmp directory. Obviously the /tmp directory contains temporary files which you don't need to worry about.
There is no registry or central location for user preferences like there is on Windows. In Linux you wouldn't have write access to stuff like that anyway; what you have instead is hidden files and directories in your home directory.
These files and directories have "." (a dot) as the first character in their names, so when you do an ls or open a file dialog in some GUI app they are skipped over for your convenience. You can see them by running "ls -a" instead of regular "ls".
So if you back up your entire home directory using rsync you automatically back up all your user preferences and such along with your files. There are a few 'special' files like sockets, pipes, and such, but those can usually be ignored safely.
Here is what I do for my home directory to a server...
rsync --archive -v /home/drag/ drag@servername.whatever:/home/drag/
Or something like that.
What I use now is something called unison.
It's useful for synchronising between two computers. I use it for my laptop and desktop so that I have all the same preferences, bookmarks, work files, etc. on both computers.
There are some issues, like I have an entry in the session-preferences section to run update-menus so that it regenerates the menus each time I log in, since I don't have all of the same stuff installed on both computers.
There is a unison-gtk GUI front end for it that makes it a bit easier to set up. The first time you run it it takes a long time, but once it builds its database of files and their checksums, updates are very fast since, like rsync, it only transfers changes. It goes over ssh for security, but I think you can set it up as a daemon for running just over the network.
You'd have to have it installed on both machines.
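From the command line it looks something like this (the hostname and paths here are just examples, adjust them for your own setup):
$ unison /home/drag ssh://drag@servername.whatever//home/drag
That syncs the local home directory with the remote one over ssh and prompts you about any conflicts it finds.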
For traditional tools you have things like dd, which you can use to make images of file systems and output them to different things, like tapes. There is also 'dump', which does a similar job.
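For example, something like this images a partition straight to a file (the device name and output path are just examples, and you'd want the filesystem unmounted or mounted read-only first):
$ dd if=/dev/hda1 of=/backup/hda1.img bs=4M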
You have 'tar', which you can use to make tarballs, which are basically like zip files.
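For instance (the paths are just an example):
$ tar czvf /backup/home-drag.tar.gz /home/drag
That makes a gzip-compressed tarball of the home directory, and "tar xzvf /backup/home-drag.tar.gz" would extract it again.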
You can use bash scripts and such to drive the traditional tools for backing up over a network. For example:
on the remote backup machine...
$ cd /backup/directory/
$ nc -l -p 8000 | tar xfv -
on the local machine...
$ cd /home/
$ tar cfv - username/* | nc xxx.xxx.xxx.xxx 8000
Where xxx.xxx.xxx.xxx is either the DNS name of the machine or its IP address or whatnot.
I like to use the tar and netcat trick when I move to a new hard drive. I boot up with Knoppix or whatnot, copy the entire OS over to another machine, put in the new hard drive, copy it back over, then reinstall grub and make sure /etc/fstab is right and everything.
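The grub part from the live CD goes roughly like this, run as root (the device names are just examples, and depending on the distro you may also need /proc mounted inside the chroot):
$ mount /dev/hda1 /mnt
$ mount --bind /dev /mnt/dev
$ chroot /mnt grub-install /dev/hda
Then double-check that the device names in /mnt/etc/fstab match the new disk before rebooting.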
There are lots of tricks and things like that. You can find examples all over the net.
For more complex automated stuff you have things like Amanda. There are a lot of options out there...
http://www.amanda.org/
Amanda is nice if you have a LAN of computers and want to back them all up to a single big machine. It uses normal utilities like dump and tar to build the backups.
If you're running Ubuntu with extra repos set up you can do an apt-cache search for backup software and you'll find a lot of stuff, each with different purposes, strengths, and weaknesses.
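Something like:
$ apt-cache search backup
will list the packages whose names or descriptions mention backups.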