Linux - General: This Linux forum is for general Linux questions and discussion.
If it is Linux-related and doesn't seem to fit in any other forum, then this is the place.
Welcome to LinuxQuestions.org, a friendly and active Linux Community.
I'm running RH9 with many different services available... apache, ProFTPd, qmail, Samba, etc etc.
These services, plus about 1 GB of user data, live on a 30 GB drive with about 6 or 7 GB used, which I want to back up. I have purchased a 60 GB USB drive that I have successfully mounted as a SCSI device (sda1). Now I'm wondering what sort of backup software to start using.
I have been told that Mondo works very well for backing up the whole thing, but I have tried it and I'm unsure whether I will be able to extract individual files from the backup to restore. It almost seems like Mondo is really only good for backing up and restoring the whole shebang and will not be able to restore individual files. Is that true?
I have looked at Norton Ghost, but this also seems to be the same deal.
Now I'm considering just doing a simple backup of the whole root directory (/) using tar and sending the tar file to the 60 gig drive. This drive has very little else on it, so obviously free space isn't a problem.
What are some of the caveats of running a simple backup like this using tar? Will the permissions/ownership be intact if I restore some of the files from the tar backup? Are there any critical files that I should avoid backing up? Any other pitfalls I should know about?
I'm thinking about doing daily backups using tar, keeping the most recent 10 days' worth, and doing monthlies using Mondo just for good measure. Does this seem like a good strategy for me? Will I be able to restore corrupted programs (apache, ProFTPd, etc.) from a tar file that I have made? This is still a devel server, so I'm likely to break something and be in need of a quick restore. I'm relatively new to Linux and not sure what to expect.
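The daily tar plan above can be sketched like this. This is a minimal sketch only: the paths here are stand-ins (a scratch directory plays the role of `/` and of the USB mount), and the exclude list is illustrative — when backing up `/` for real you would run as root and exclude pseudo-filesystems and the backup mount itself.

```shell
# Stand-in paths: /tmp/demo-src plays the role of /, /tmp/demo-usb the USB drive.
SRC=/tmp/demo-src
DEST=/tmp/demo-usb
mkdir -p "$SRC" "$DEST"
echo "hello" > "$SRC/file.txt"

# -c create, -p preserve permissions, -z gzip-compress.
# For a real backup of /, add excludes such as:
#   --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/mnt/usb
tar -cpzf "$DEST/full-$(date +%F).tar.gz" -C "$SRC" .

# Restoring a single file: with -p (and running as root), original
# permissions and ownership are restored from the archive.
mkdir -p /tmp/demo-restore
tar -xpzf "$DEST"/full-*.tar.gz -C /tmp/demo-restore ./file.txt
cat /tmp/demo-restore/file.txt
```

Keeping ten of these dated archives is then just a matter of deleting the oldest from cron.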
I got myself a copy of the RPM from their SourceForge site, but when I tried to install it, it said I need libbz2.so.1.0 before I can install dar. When I looked up libbz2.so.1.0, I found that it is part of the bzip2 library. But when I run rpm -q bzip2, it says my version is bzip2-1.0.2-8 and is already installed. So I take this to mean I already have everything I need.
When I went to the bzip2 official site (http://sources.redhat.com/bzip2/), I found nothing but some executables for an older version of RH (version 7 or so). Are these safe to use? How should I go about using them? There were scant instructions on how to install them.
I'd really like to take advantage of dar, since the instructions seemed very good (which is important for me because I'm a noob), but this is quickly becoming more trouble than it's worth.
Well, I downloaded the tar file and installed it that way. No problems at all. I guess I should have tried it that way from the start, but I really like how cleanly RPMs tend to install... this was a notable exception.
Anyway, I have run a full backup and restored a couple of test files, which went fine, and now I'm going to begin looking at diff backups run via cron on a daily basis. I've got a tutorial from their website which I'm sure will do fine, but I figured I would report back and let you know it's working great, and also to ask if there are any scripts you may have that might help me. It looks like most of the scripts people have written are just variations of the ones provided in the examples of the howto.
Thanks for the excellent advice. I've tried many different backup programs and none of them seem to be as nice as this.
Also.... I am looking at the -y option that enables bzip compression. Is that any better than just using the normal -z compression? What are the advantages of using one over another?
Also, I was looking at the -Z option that allows me to exclude certain files from being compressed. The tutorial said that it doesn't make sense to use the -Z option unless you're using the -y option. Why is that?
Distribution: RH 6.2, Gen2, Knoppix,arch, bodhi, studio, suse, mint
I don't have any special scripts, but I like the way you can separate the catalogs from the backups, so that you can do differential backups without accessing the full backup. dump does that with the dumpdates file, I think.
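For what it's worth, the catalog separation described above looks roughly like this in dar. These commands are a sketch from memory, not run here: the paths are made up, and the flag spellings (-c create, -R root, -z compress, -A reference archive, -C isolate catalog) should be verified against man dar for your version.

```shell
# Illustrative dar commands only; paths are hypothetical.
dar -c /mnt/usb/full -R / -z                        # full backup of /, compressed
dar -C /mnt/usb/full_cat -A /mnt/usb/full           # isolate the catalog from it
dar -c /mnt/usb/diff1 -A /mnt/usb/full_cat -R / -z  # differential against the catalog alone
```

The point is the third line: the differential only needs the small isolated catalog, not the full archive, so the full backup can stay unplugged or offsite.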
bzip2 compresses better than gzip, but is a lot slower. Both gzip and bzip2 take a number for the compression level, 1 being the lowest and 9 the highest, I think. You can see for yourself how well each does if you compress a file of your choice with each at different levels: gzip -1 filename or gzip -9 filename. gzip -9 is about the same speed as bzip2 -1.
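The comparison suggested above can be done in a couple of lines. The file below is a repetitive sample just so the size differences are visible; exact sizes will vary with the input, and the bzip2 step is guarded in case it isn't installed.

```shell
# Build a sample file and compress it at different levels.
F=/tmp/sample.txt
yes "the quick brown fox jumps over the lazy dog" | head -n 5000 > "$F"

gzip -1 -c "$F" > /tmp/sample.gz1    # fastest, largest gzip output
gzip -9 -c "$F" > /tmp/sample.gz9    # slowest gzip, smallest gzip output
wc -c "$F" /tmp/sample.gz1 /tmp/sample.gz9

if command -v bzip2 >/dev/null 2>&1; then
  bzip2 -9 -c "$F" > /tmp/sample.bz9 # usually smaller still, but slower
  wc -c /tmp/sample.bz9
fi
```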
They probably just made a slight error with the -Z and -y, and meant that it doesn't make sense to exclude files from compression when you aren't using compression in the first place.
I am automating my differential backups to run daily using a script, and every time it runs I get an error message in my mailbox: "No terminal found for user interaction. All questions will abort the program."
The script is below. I would be grateful if you could advise me on how to stop the messages. I added the /dev/null lines as their tutorial specified. It says this will suppress any error messages on output, but it's not working.
I'm actually getting the error twice, once for each time I call the dar program. If I comment out the dar -t lines, only one error message is returned to my mailbox, even though I have the /dev/null lines.
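One general shell point worth checking here (not specific to dar): `> /dev/null` alone discards only stdout, and anything the program writes to stderr still reaches cron's mail. The demo below simulates a program that warns on stderr. Separately, if I remember right, dar has a -Q option intended for non-interactive runs such as cron jobs — check man dar to confirm.

```shell
# stderr is a separate stream from stdout, so "> /dev/null" alone
# does not silence it. Simulate a program warning on stderr:
sh -c 'echo "no terminal found" >&2' > /dev/null        # message still appears
sh -c 'echo "no terminal found" >&2' > /dev/null 2>&1   # fully silenced
```

So in the script, the dar lines likely need the `2>&1` form (or dar's own quiet option) rather than a bare `> /dev/null`.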
Distribution: Mandrake as base, most software hand rolled
I use a totally different approach, based on hack #74 from Linux Server Hacks. Basically, the method uses rsync and cp with hard links to keep a number of snapshots of the filesystem at various stages. All of the snapshots are totally independent of the original filesystem, and can be used to restore either individual files or the entire FS, but the beauty of it is that they are true snapshots, in that they only need space for files that have changed since the snapshot was taken. One word of caution: since this uses rsync, it does not handle non-regular files (but then neither does tar, completely).
A perl module implementing this is available here if you want to play with this yourself.