I back up my home (almost) exactly the same way. I found it useful to cd $HOME and run tar from there, because that way you get relative paths in the archive, which is useful when extracting to another place. Note that the excluded directories must then be specified as relative paths as well.
I wrote it as a script; in that case it would be trivial to run bzip2 -9 after tarring. Or what exactly do you mean by "compressing files even more"?
---- my archiving script (reconstructed; the ARHIIV path is an assumed placeholder)
#!/bin/sh
ARHIIV=$HOME/backup/home.tar.gz
V=""
if test "$1" = "-v" ; then
    V="-v" ; echo "Archiving $HOME -> $ARHIIV"
fi
cd "$HOME"
tar $V -czf "$ARHIIV" \
    --exclude .Trash --exclude a --exclude 80274378.key .
I'm not certain, but I think you will need to use a pipe to be able to use the -9 (maximum compression) option.
You will be able to decompress using the j flag in tar, however.
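For what it's worth, a sketch of the pipe approach; the archive path is only an illustration:
# write the tar stream to stdout and compress it at the maximum level
cd "$HOME"
tar -cf - . | bzip2 -9 > /backup/home.tar.bz2
# tar can later decompress bzip2 archives directly with the j flag
tar -xjf /backup/home.tar.bz2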
This will probably take experimentation to find the right balance between the compression level and the amount of time it takes to perform a backup.
One other thing is whether you want to exclude all of the .kde or .gnome directories. You may find that you want your preferences backed up, but exclude any cache type subdirectories such as icons and internet caches.
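A sketch of that idea; the cache paths below vary between desktops and are only examples:
# back up settings but skip cache-type subdirectories (example paths)
cd "$HOME"
tar -czf /backup/home.tar.gz \
    --exclude .thumbnails \
    --exclude .kde/share/cache \
    --exclude '.mozilla/firefox/*/Cache' .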
If you have very large backups, you might want to forgo compression altogether, especially if the majority of the files are in formats that are already compressed, such as JPEGs, MPEGs, or downloads.
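In that case a plain, uncompressed tar is enough; a minimal sketch, assuming the same layout as above:
# no -z or -j: already-compressed files (jpegs, mpegs) won't shrink much,
# and skipping compression makes the backup noticeably faster
cd "$HOME"
tar -cf /backup/home.tar .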
If you've got a second hard drive, I find it useful to periodically copy over my entire /home directory to that second drive. If I delete a file I later need from my "real" /home, it's nice to have a local copy. Obviously this isn't really a robust backup system, but then again I don't have any highly critical data -- J.W.
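A minimal sketch of that kind of copy; the mount point is an assumption:
# copy /home wholesale to a second drive, preserving modes, owners and times
cp -a /home/. /mnt/backupdrive/home/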
The problem I have with tar backups is this: tar (with -z or -j) compresses the entire tarball as one stream. If any part is degraded in any way, you have essentially lost the entire backup. I searched for the last two years for backup options, tried many, and finally settled on Dar. I get compressed backups (gzip or bzip2) with each file compressed individually. If part of one is degraded or corrupt in any way, I still have the rest. Plus, dar can skip ahead when restoring past a corrupt section, so that only a small part of the file is lost.
You can find dar here
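For the record, a hedged sketch of basic dar usage; the flags below are from dar's common command line, but check your version's man page, and the paths are only examples:
# create a gzip-compressed archive of $HOME, each file compressed separately
dar -c /backup/home_full -R "$HOME" -z
# restore it into a scratch directory
dar -x /backup/home_full -R /tmp/restore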
Originally posted by J.W.: If you've got a second hard drive, I find it useful to periodically copy over my entire /home directory to that second drive. If I delete a file I later need from my "real" /home, it's nice to have a local copy. Obviously this isn't really a robust backup system, but then again I don't have any highly critical data -- J.W.
I'm in the same situation, and do the same thing--and it has saved me much grief time after time. Highly recommended to do this /home backup to a second drive even if you use some other method of backing up other files.
Here is the command I use to do it, along with comments I added to remind myself what some of it means. I found the neat trick about handling the dots correctly here on LQ some time ago!
cp -rupv ./* ./.[^\.]* /prime_backup/home.jon.backup/
## Note: the "[^\.]" keeps the glob from matching either the "current
# directory" dot (.) or the "parent directory" dots (..) while still
# matching hidden files, by requiring that the second character not be
# a dot. For this reason no file should begin with two dots, normally.
I was even automating this via a cron job for a while, but that can be a deceptive convenience: the backup may take place just after you've made some regrettable changes, or while certain files are being accessed, with the result that they don't get backed up correctly, if at all. So it's better for me to choose for myself when to do this kind of backup.
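For reference, the kind of crontab entry that automation used; the time and paths are illustrative only:
# run the /home copy every night at 02:30 -- convenient, but see the caveat above
30 2 * * * cp -rup /home/jon/. /prime_backup/home.jon.backup/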
As for big, system-wide backups, it's worth looking at rdiff-backup, available at
This program creates "differential" backups (everybody's definition of "differential" is apt to be different, but the general result is a series of "snapshots" that let you restore files or directories to the state they were in a given length of time ago, if necessary or desired).
It's really a pretty subtle but easy-to-use program (otherwise I wouldn't be able to use it).
If disk space is limited it may not be so desirable. However, you can trim the number of versions it keeps on hand at will.
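A hedged sketch of the basics, with illustrative paths; rdiff-backup's real options are in its man page:
# mirror /home into a repository that also keeps reverse increments
rdiff-backup /home /prime_backup/home-rdiff
# restore a file as it was 10 days ago
rdiff-backup -r 10D /prime_backup/home-rdiff/jon/notes.txt /tmp/notes.txt
# trim old increments, keeping roughly the last 8 weeks (--force may be
# needed when more than one increment would be removed)
rdiff-backup --remove-older-than 8W /prime_backup/home-rdiff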