Linux - General
This Linux forum is for general Linux questions and discussion. If it is Linux-related and doesn't seem to fit in any other forum, then this is the place.
1) How do you do your backups? I have a script that runs this command, using the current date as the tarball name.
2) Can I make bzip2 compress the file even more? I don't care how long it takes to compress. Do I have to split it into two commands, or pipe the output, to do this?
3) What directory do you save your backups to?
Here's the way I came up with by reading the man pages. I'm pretty sure it excludes the *directories* listed in the exclude file (correct me if I'm wrong).
Code:
$ tar -cjv -X backup_exclude_paths -f jake.tar.bz2 /home/jake/
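Since the question mentions using the current date as the tarball name, a minimal sketch of such a wrapper might look like this (the file names and the exclude-file path are just carried over from the command above, not anything the poster actually wrote):

```shell
#!/bin/sh
# Sketch only: derive the tarball name from today's date, then archive.
DATE=$(date +%Y-%m-%d)
ARCHIVE="jake-$DATE.tar.bz2"
tar -cjv -X backup_exclude_paths -f "$ARCHIVE" /home/jake/
```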
I back up my home directory (almost) exactly the same way. I found it useful to cd $HOME and run tar from there, because that way you get relative paths in the archive, which is useful when extracting to another place. Note that the excluded directories must then be specified accordingly (relative to $HOME).
I wrote it as a script; in that case it would be trivial to run bzip2 -9 after tarring. Or what exactly do you mean by "compressing files even more"?
best,
Ott
---- my archiving script
#!/bin/bash
# Destination for the archive; set it before it is echoed below
ARHIIV=/mnt/otoomet/$HOSTNAME.tar.gz
if test "$1" = "-v" ; then
    V="-v"
    echo "Archiving $HOME -> $ARHIIV"
else
    V=""
fi
cd "$HOME" || exit 1
tar $V -czf "$ARHIIV" \
    --exclude .Trash --exclude a --exclude 80274378.key \
    .
I'm not certain, but I think you will need to use a pipe to be able to use bzip2's -9 (max compression) option.
You will still be able to decompress using tar's j flag, however.
This will probably take some experimentation to find the right balance between the compression level and the amount of time a backup takes.
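The pipe suggested above might look like this, a sketch reusing the archive and exclude-file names from the original post:

```shell
# Separate the tar and bzip2 steps so bzip2's -9 flag can be used
tar -cv -X backup_exclude_paths /home/jake/ | bzip2 -9 > jake.tar.bz2

# Restore either way; tar's j flag still understands the result:
#   bzip2 -dc jake.tar.bz2 | tar -xv
#   tar -xjvf jake.tar.bz2
```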
One other thing to consider is whether you want to exclude all of the .kde or .gnome directories. You may find that you want your preferences backed up, but want to exclude any cache-type subdirectories such as icon and internet caches.
If you have very large backups, you might want to forgo compression altogether, especially if the majority of the files are in formats that are already compressed, such as JPEGs, MPEGs, or downloads.
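As an illustration of the "keep preferences, skip caches" idea, an exclude file along those lines might contain entries like the following (the paths are hypothetical examples; it would be passed with tar's -X flag as in the command above, and pattern matching details vary between tar versions):

```
# backup_exclude_paths -- example entries (hypothetical)
/home/jake/.cache
/home/jake/.thumbnails
/home/jake/.mozilla/firefox/*/Cache
```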
If you've got a second hard drive, I find it useful to periodically copy over my entire /home directory to that second drive. If I delete a file I later need from my "real" /home, it's nice to have a local copy. Obviously this isn't really a robust backup system, but then again I don't have any highly critical data -- J.W.
The problem I have with tar backups is this: tar makes the entire tarball one compressed file. If any part is degraded in any way, you have essentially lost the entire backup. I searched for backup options for the last two years, tried many, and finally settled on Dar. I get compressed backups (gzip or bzip2) with each file compressed individually, so if one part is degraded or corrupt, I still have the rest. Plus, dar can skip ahead when restoring a corrupt file, so that only a small part of the file is lost.
You can find dar here
Originally posted by J.W. If you've got a second hard drive, I find it useful to periodically copy over my entire /home directory to that second drive. If I delete a file I later need from my "real" /home, it's nice to have a local copy. Obviously this isn't really a robust backup system, but then again I don't have any highly critical data -- J.W.
I'm in the same situation, and do the same thing--and it has saved me much grief time after time. Highly recommended to do this /home backup to a second drive even if you use some other method of backing up other files.
Here is how my command reads to do it, along with comments I put in to remind myself of what some of it means. I found the neat trick about interpreting dots correctly, here on LQ some time ago!
Code:
cd /home/jon
cp -rupv ./* ./.[^.]* /prime_backup/home.jon.backup/
# Note: the "[^.]" prevents cp from matching the "current directory" dot entry
# or the "parent directory" double-dot entry, by specifying that the second
# character must not be a dot. For this reason no file should normally begin
# with two dots.
I was even automating this via a cron job for a while, but that can be a deceptive convenience: the backup may run just after you've made some regrettable changes, or while certain files are being written, with the result that they don't get backed up correctly, if at all. So it's better for me to choose for myself when to do this kind of backup.
As for big, system-wide backups, it's worth looking at rdiff-backup, available at
This program creates "differential" backups (everybody's definition of "differential" is apt to be different, but the general result is a series of "snapshots" that let you restore files or directories to the state they were in a given length of time ago, if necessary or desired).
It's really a pretty subtle but easy-to-use program (otherwise I wouldn't be able to use it).
If disk space is limited it may not be so desirable; however, you can trim the number of versions it keeps on hand at will.
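For the record, a minimal rdiff-backup session looks roughly like this (the destination paths here are made up for illustration; check the man page for your version):

```shell
# Each run stores a current mirror plus reverse increments (paths are examples)
rdiff-backup /home/jon /prime_backup/rdiff.jon

# List the snapshots that are available
rdiff-backup --list-increments /prime_backup/rdiff.jon

# Restore a file as it was 10 days ago
rdiff-backup -r 10D /prime_backup/rdiff.jon/somefile /tmp/somefile

# Trim disk usage: drop increments older than two weeks
rdiff-backup --remove-older-than 2W /prime_backup/rdiff.jon
```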