Linux - Newbie: This Linux forum is for members that are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-to's this is the place!
Guys,
Trying to create a backup script on 2 machines to back up to an array, which is backed up nightly. The 2 workstation machines are running 32-bit RHEL, and the version of NetBackup we use here doesn't support 32-bit.
So we had the idea of creating a tarball of the directories my guys need backed up, then setting up a cron job to scp those to a directory on a 64-bit RHEL server, which is backed up.
What would be the best way to set this up?
I reckon we would need to delete the tarball after it has been scp'd to the server (for disk space reasons).
Or you could skip the tarball entirely and just rsync the directories over.
Or you could mount the backup location on the server via NFS and just dump the tarball straight to its final location without having to create an intermediate copy on the local machine.
The problem with tarballs is that you have to redo the ENTIRE backup process EVERY time. With something like rsync you can do incremental backups, so the first one takes forever but all future ones run very quickly. You can even use the --link-dest flag in rsync to make hard links to unchanged files so you can have full, separate backups that only use the space of the files that changed. Plus with rsync your backups are live and navigable, which makes them much more convenient to use.
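The --link-dest approach above can be sketched as follows, with placeholder paths (the destination could be a directory on the server or an NFS mount). Each run creates a new dated snapshot; unchanged files are hard links into the previous snapshot, so they take no extra space:

```shell
# Hard-link snapshot backups with rsync; paths are placeholders.
SRC=/home/projects/
BASE=/backup/ws1                 # snapshot area (server path or NFS mount)
TODAY=$(date +%Y-%m-%d)

# Copy only what changed; link everything else to the previous snapshot
rsync -a --delete \
      --link-dest="$BASE/latest" \
      "$SRC" "$BASE/$TODAY/"

# Point "latest" at the snapshot we just made
ln -sfn "$BASE/$TODAY" "$BASE/latest"
```

On the first run "latest" doesn't exist yet, so rsync just warns and does a full copy; every run after that is incremental, yet each dated directory looks like a complete, browsable backup.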
Last edited by suicidaleggroll; 12-18-2014 at 03:32 PM.
Haven't set up keys yet.
Do you think rsync would work better in my situation, since disk space would likely be an issue down the road? The tarball would be ~275 GB on a 1 TB drive, and 275 GB is already in use on this drive.
If you're rsyncing the files straight to the server, then no additional space will be used on the local machine. You only need to worry about available space on the machine when you want to make a giant local tarball (275 GB is insanely unmanageable BTW) and then copy it over, which is a terribly inefficient way of doing things.
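Pushing straight to the server needs nothing more than a one-liner (server name and paths here are placeholders; this assumes SSH keys so cron can run it unattended):

```shell
# Sync the directory straight to the backed-up 64-bit server;
# nothing is staged on the local disk.
rsync -az --delete /home/projects/ backup@rhel64srv:/backup/ws1/projects/
```

The trailing slash on the source means "the contents of projects", and --delete keeps the server-side copy from accumulating files that were removed locally.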
The problem with tarballs is that you have to redo the ENTIRE backup process EVERY time.
No - tar supports incremental backups as well. See the -g (--listed-incremental) option in the tar man page.
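A short sketch of GNU tar's listed-incremental mode, with placeholder paths. The snapshot file records what has already been backed up, so subsequent runs with the same snapshot only archive what changed:

```shell
# GNU tar incremental backups; paths are placeholders.
SNAP=/var/tmp/projects.snar      # snapshot file tar uses to track state

# First run: full (level-0) backup, creates the snapshot file
tar -cz -g "$SNAP" -f full.tar.gz /home/projects

# Later runs with the same snapshot file archive only changed files
tar -cz -g "$SNAP" -f incr-$(date +%Y%m%d).tar.gz /home/projects
```

To restore, you extract the full archive first and then each incremental in order (using -g /dev/null on extraction so tar treats them as incrementals without updating any snapshot).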
It is POSSIBLE to use tar to create a multi-volume archive. You have to use the -L (--tape-length) option to specify how large you want each volume, together with -M (--multi-volume), then use the -F (--info-script) option to run a script at the end of each volume. That script can rename the current output file, and tar resumes with a new output file being created.
I will admit to not having tried this... But if you fear large tarballs, this is one way to break them up. Another is to use the split utility (like "tar cf - . | split ..."), which will also work.
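The split approach can be sketched like this, with a placeholder source directory and an arbitrary 1 GB piece size:

```shell
# Stream the archive through split so no single file gets huge
tar cf - /home/projects | split -b 1G - projects.tar.part.

# To restore, concatenate the pieces back into one stream and extract
cat projects.tar.part.* | tar xf -
```

split names the pieces projects.tar.part.aa, .ab, and so on, so the shell glob in the restore line reassembles them in the right order.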