Linux - General
This Linux forum is for general Linux questions and discussion. If it is Linux related and doesn't seem to fit in any other forum, then this is the place.
This is rather a philosophical question, I guess: I was sitting around with a few friends of mine, and it turns out we all use a Linux machine as a file server.
So much for the similarities, though. We discovered that each of us uses a different method to back up the file server. I simply have a script which creates a tar.gz of the files and SCPs it to a backup machine. One of my friends uses a script which FTPs the files to a backup machine, a third guy simply uses dd, and so on.
So we started arguing about the very best method to back up a file server. I admit that mine is a bit amateurish, as I'd need to reinstall my machine after a crash before I could restore the data and keep working :-) We argued all night about the best way to do this - and reached no real conclusion. Does anyone here have experience with how this is best done?
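For reference, the tar-and-scp approach I described could be sketched roughly like this (the paths and the hostname are placeholders, not my actual setup):

```shell
#!/bin/sh
# Nightly full backup: archive the data directory and copy it off-host.
# /srv/files, backup.example.com and /backups are placeholder names.
DATE=$(date +%Y-%m-%d)
ARCHIVE="/tmp/fileserver-$DATE.tar.gz"

# Create a compressed archive of the data directory.
tar -czf "$ARCHIVE" /srv/files

# Push it to the backup machine over SSH.
scp "$ARCHIVE" backup@backup.example.com:/backups/

# Remove the local copy once it has been transferred.
rm -f "$ARCHIVE"
```

The obvious downside, as I said, is that this only saves the data, not the system itself.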
You don't need to tar the files every day and scp them to a different server. It is better to rsync the files to the remote location. On the first run, rsync builds a list of the files and transfers them all. On each later run, it compares the files against the remote copy before transferring and skips anything that hasn't changed, copying only new and updated files, which makes the whole process simpler and faster.
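A minimal rsync invocation along those lines might look like this (the paths and hostname are placeholders you'd replace with your own):

```shell
# Incremental mirror of /srv/files to the backup host over SSH.
# -a preserves permissions, ownership and timestamps,
# -z compresses data in transit,
# --delete removes files on the backup that were deleted locally.
# backup.example.com and /backups/files are placeholder names.
rsync -az --delete /srv/files/ backup@backup.example.com:/backups/files/
```

Note the trailing slash on the source: it tells rsync to copy the directory's contents rather than the directory itself.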
Arguments over what is best will always go all night.
If you are interested in digging into alternatives and possibilities, and learning a bunch about backups along the way, get the book Backup and Recovery by W. Curtis Preston. It has a companion web site http://www.backupcentral.com/.
The book covers all the basic tools like tar, dump, dd, rsync, etc., as well as more advanced network backup software such as Amanda and Bacula.