Ubuntu: This forum is for the discussion of Ubuntu Linux.
Welcome to LinuxQuestions.org, a friendly and active Linux Community.
You don't want to back up the root directory wholesale. Back up only the directories that need backing up. For example, if you back up /proc, you will be archiving the kernel's core image (/proc/kcore) and other pseudo-files which can never be restored. The files in /dev are created when you boot up. The files in /sys are pseudo-files created by the kernel when they are read. Backing up /tmp is wasteful; many people have /tmp cleared out when they reboot.
Also don't back up /media or /mnt. You would be backing up external filesystems, maybe even the filesystem you are backing up to.
Also look at the options for tar. Make sure that you are backing up symbolic links as symbolic links and not dereferencing them. Dereferencing can lead to a large increase in the size of the backup: instead of backing up the link to the file, you would be backing up the file itself, creating a copy. That copy won't change when the original does.
Another thing to consider is how sparse files are backed up. You want to back them up as sparse files, not with zeros filling in the unused space.
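To make the flags concrete, here is a small sketch. The archive name and /tmp paths are made up for illustration; the commented-out invocation shows what a real root backup with the excludes discussed above might look like. `-S` stores sparse files efficiently, and omitting `-h` means tar keeps symlinks as links rather than dereferencing them.

```shell
# On a real system the invocation might look like (paths are examples):
#   tar -S -czpf /backup/root.tar.gz \
#       --exclude=/proc --exclude=/sys --exclude=/dev \
#       --exclude=/tmp --exclude=/media --exclude=/mnt /

# Small demonstration using a scratch tree under /tmp:
mkdir -p /tmp/demo/home
truncate -s 10M /tmp/demo/home/sparse.img      # 10 MB hole, no real data
ln -sf home/sparse.img /tmp/demo/link.img      # a symbolic link

# -S handles the sparse file; no -h, so the symlink is stored as a link.
tar -S -czpf /tmp/demo-backup.tar.gz -C /tmp demo
tar -tvzf /tmp/demo-backup.tar.gz
```

In the verbose listing the symlink shows up as `link.img -> home/sparse.img` rather than as a 10 MB copy of the file.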
I see that now. Can you tell me which files/directories I should back up in case of system failure? I think /home/joe? What else would I back up if I want to restore my system to the way it is now?
Well, if you can rely on re-installing, you don't need that much. (To rely on re-installing, you probably want to treat your install disks as precious objects; when it comes down to it, you may or may not want to re-install the exact same version, but having the option is good.)
(The alternative of a block-by-block copy onto the new system could be problematic if the new system isn't exactly the same as the old one, so I don't like relying on this. Some people think it's bound to be 'fixable' though; I'm not sure, but I am sure that if I rely on this, it will be problematic (that word again!) to get it done in zero time.)
Really you need the home directories for all of your users (is /home/joe everything?) and you probably want to keep a copy of /etc for reference. I certainly don't advise that you just overwrite your new /etc with the old one - that is likely to be problematic - but if you have difficulty getting a service configured that you had once succeeded with, having the old .conf is potentially helpful.
Really you need the home directories for all of your users (is /home/joe everything?) and you probably want to keep a copy of /etc for reference. I certainly don't advise that you just overwrite your new /etc with the old one - that is likely to be problematic...
I am the only user, so yes, /home/joe is the only one. So if I understand you correctly, I should back up that and the /etc directory. That sounds nice and easy, but why would it be "problematic" to just replace the new /etc with the old one? As a new Linux user I am always trying to learn as much as possible.
I am the only user, so yes, /home/joe is the only one.
...hmm, yes, but bear in mind that if you used the root account (so probably not that applicable for *buntu users), the root user might have some scripts in his bin subdirectory that would also need backing up.
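Putting the advice so far together, a minimal "settings" backup covers /etc, the home directories, and (if you use the root account) root's own bin directory. The sketch below uses scratch stand-ins under /tmp so the paths are purely illustrative; the commented line shows what the real command might look like.

```shell
# Scratch stand-ins for /etc, /home/joe, and root's bin directory:
mkdir -p /tmp/sys/etc /tmp/sys/home/joe /tmp/sys/root/bin
echo 'some config'    > /tmp/sys/etc/app.conf
echo 'personal data'  > /tmp/sys/home/joe/notes.txt
printf '#!/bin/sh\necho hi\n' > /tmp/sys/root/bin/myscript

# On a real system, run as root, this would be something like:
#   tar -czpf /backup/settings-$(date +%F).tar.gz /etc /home/joe /root/bin
tar -C /tmp/sys -czpf /tmp/settings.tar.gz etc home/joe root/bin
tar -tzf /tmp/settings.tar.gz
```

`-p` preserves permissions, which matters for files under /etc.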
but why would it be "problematic" to just replace the new /etc with the old one?
In the case that your new install is a newer distro (or just a different distro), you would be overwriting the config files for the newer versions of the various services with your hand-tuned older ones. That might work, or it might be a disaster, so you are better off not committing to doing it.
So you might restore your backed-up /etc to something like /etc/old/.... You'd then look at the services that you are running (squid, bind, nfs, ...) one by one.
Check version numbers. If you are comparing, e.g., squid 2.4.1 with squid 2.4.1, there is quite a good chance that your old config will work.
If you are comparing squid 3.0.1 with 2.4.1, the chances are much lower that it will 'just work' without further attention.
Even then it is far from guaranteed: in particular, with squid, there are many config options that may or may not be built in to a particular build of squid. The end result is that a .conf for one build of squid 2.4.1 may be incompatible with a different build of 2.4.1 (and if you specify an option that your build doesn't know about, it just bombs out).
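The service-by-service comparison can be done with plain diff against the restored copy. The file names and option values below are invented for illustration; the point is the workflow of diffing the old config against the newly installed one before deciding what to carry over.

```shell
# Stand-ins for a restored old config tree and the freshly installed one:
mkdir -p /tmp/etc-old/squid /tmp/etc-new/squid
printf 'cache_mem 8 MB\n'  > /tmp/etc-old/squid/squid.conf
printf 'cache_mem 64 MB\n' > /tmp/etc-new/squid/squid.conf

# diff exits non-zero when the files differ, hence the || true:
diff -u /tmp/etc-new/squid/squid.conf /tmp/etc-old/squid/squid.conf || true
```

Each hunk in the output is one decision: keep the new default or port over your old hand-tuned value.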
If you are running into this kind of problem, you want to deal with the services one by one rather than facing a massive mess of errors from a large number of different services while nothing is working. For example, if you break something in networking, you might be unsure whether other things that rely on networking are broken in themselves, or only broken because some other service they rely on is broken.
And, obviously, you wouldn't just delete the installed config files; you'd rename or move them, just in case you did need them, even if only for comparison.
In my home dir I keep copies of any configuration files from /etc that I have edited. I also keep copies of any scripts I have written and stored in /usr/local/bin. You'll find a number of occurrences of *.local directories or *.conf.local files, which are intended for the user's own changes, so you can be comfortable those files won't be replaced during an apt upgrade. I save copies of those in my home dir as well.
If you are starting from a fresh install, using dd to create an image backup would enable you to recover the quickest from a drive failure, back to the state you were in when you initially started. You may want to zero out a previously used drive before the fresh install; that way, when you pipe the output of dd through gzip or bzip2, the image compresses to a much more manageable size. You can also use the `df' program to tell you how much space is left in a partition, and then set dd's "count=" argument appropriately to fill the unused space with a zero-filled file. Delete the file afterwards; now the partition will compress nicely.
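Here is a sketch of the dd approach. The device name in the comments is only an example (check yours with lsblk, and image an unmounted partition, e.g. from a live CD). The runnable part below uses an ordinary file instead of a device to show why zero-filled space compresses so well.

```shell
# On a real system (device name is an example -- check with lsblk):
#   dd if=/dev/sda1 bs=1M | gzip -c > /backup/sda1.img.gz
# To see how much free space could be zero-filled beforehand:
#   df -B1M --output=avail /mountpoint
#   dd if=/dev/zero of=/mountpoint/zerofill bs=1M count=<avail>
#   rm /mountpoint/zerofill

# Demonstration with a file standing in for a zeroed partition:
dd if=/dev/zero of=/tmp/fake.img bs=1M count=8 2>/dev/null
gzip -cf /tmp/fake.img > /tmp/fake.img.gz
ls -l /tmp/fake.img /tmp/fake.img.gz
```

The 8 MB of zeros shrinks to a few kilobytes; unzeroed leftover data from a previous install would compress far worse.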
Or your first backup could include all of the directories except those that you don't want to back up; the final arguments to tar can list all of them. You could create one large initial backup using tar and afterwards use the -g option to back up only new or changed files. I would recommend backing up the /home partition separately. Also, most people create a separate partition for /home; this is where all of your personal files live. You can even rename your user directory, reinstall (not formatting /home), and pull what you want from your old /home/user directory.
Look in the tar info file, and search for "incremental dumps". There is also an example using tar on both sides of a pipe to replicate files from one directory to another. One thing I've tried was something like:
tar -C /home -g timestampfile -cf - . | tee /mnt/disk/backups/backup.tar | ssh user@host tar -C /home -xvf -
This replicates new files from home on one computer to another (even over the internet) while simultaneously creating an incremental backup on an external drive.
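For the -g (listed-incremental) mechanism itself, here is a self-contained sketch using scratch paths under /tmp. The first run with an empty snapshot file is a full (level-0) backup; re-running with the same snapshot file archives only what changed since.

```shell
# Scratch demonstration of tar's listed-incremental (-g) backups:
rm -rf /tmp/inc-demo && mkdir -p /tmp/inc-demo/home
echo one > /tmp/inc-demo/home/a.txt

# Level-0 (full) backup; tar records file state in the snapshot file:
tar -g /tmp/inc-demo/snapshot -C /tmp/inc-demo -czf /tmp/inc-demo/full.tar.gz home

# A new file appears later...
echo two > /tmp/inc-demo/home/b.txt

# ...and re-running with the same -g file archives only what changed:
tar -g /tmp/inc-demo/snapshot -C /tmp/inc-demo -czf /tmp/inc-demo/incr.tar.gz home
tar -tzf /tmp/inc-demo/incr.tar.gz
```

The incremental archive lists the directory entry and b.txt but does not re-store the unchanged a.txt. To restore, extract the full archive first, then each incremental in order.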