Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
Welcome to LinuxQuestions.org, a friendly and active Linux Community.
What's a good cron script for backing up and zipping a directory of files, or multiple directories with files, to a backup directory on my server, on a daily basis?
I found an easy-to-use MySQL backup script; now I need to back up my site directory, but not all the directories in it. So I need a way for the script to omit certain directories from the backup, i.e. dirs that contain gigs' worth of files.
This seems like it should be one of the most common cron jobs to set a server up with, but two pages deep in Google (and here) I have yet to find anything remotely resembling a solution.
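For what it's worth, here is a minimal sketch of that kind of daily cron job using GNU tar's `--exclude` option. All the paths and exclude names below are placeholders, so adjust them to your own layout:

```
#!/bin/sh
# Daily site backup: tar+gzip one directory, skipping heavy subdirectories.

backup_site() {
    src="$1"       # directory to back up
    dest="$2"      # where the archives land
    shift 2        # remaining args: dir names (inside src) to exclude

    mkdir -p "$dest"
    stamp=$(date +%Y-%m-%d)

    # Build the exclude options; names must not contain spaces, since
    # $excludes is deliberately left unquoted below so it word-splits.
    excludes=""
    for d in "$@"; do
        excludes="$excludes --exclude=$d"
    done

    # -C makes the paths inside the archive relative to the site dir.
    tar czf "$dest/site-$stamp.tar.gz" -C "$src" $excludes .
}

# Example invocation (placeholder paths):
# backup_site /var/www/mysite /backups logs cache
```

Saved as e.g. `/usr/local/bin/backup_site.sh`, a crontab line like `0 3 * * * /usr/local/bin/backup_site.sh` would run it nightly at 3 a.m.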
I'm using flexbackup (http://www.edwinh.org/flexbackup/), which is a Perl script. It has a configuration file in the /etc directory where you can define sets of directories to back up, and it can do full, differential, and incremental backups. Since I'm not running a Linux server, I manage the backups with the anacron daemon.
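To flesh that out a little (this is from memory of flexbackup's documentation, so treat the exact syntax as an assumption and check the sample flexbackup.conf shipped with the tool): backup sets are defined in Perl syntax in /etc/flexbackup.conf, and on a machine that isn't always on, anacron can pick the job up from /etc/anacrontab:

```
# /etc/flexbackup.conf (Perl syntax) -- a named set of directories:
$set{'site'} = "/var/www/mysite /home/me/scripts";

# /etc/anacrontab -- run within 24h of boot if the scheduled time was missed:
# period(days)  delay(min)  job-id            command
1               15          flexbackup-daily  flexbackup -set site -level incremental
```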
The last step is to create an automated method to restore the website. In the event of another catastrophe, instead of having to drop the current site dir/db, untar the chosen site dir/db, inject the db, and copy the site dir all manually, I want a script that backs up the current site/db, drops the current site dir/db, then untars and restores the site dir/db backups of a given date, all with one command.
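A one-command restore along those lines could be sketched as below. Everything here is an assumption to adapt: the paths, the database name, and the archive naming scheme (`site-YYYY-MM-DD.tar.gz` / `db-YYYY-MM-DD.sql.gz`):

```
#!/bin/sh
# Restore the site dir and database from the backups of a given date,
# after first snapshotting the current state as a safety net.

restore_site() {
    day="$1"                              # e.g. 2024-01-31
    site_dir="${SITE_DIR:-/var/www/mysite}"
    backups="${BACKUP_DIR:-/backups}"
    db="${DB_NAME:-mysite}"

    # 1. Safety net: back up the current site and db before dropping anything.
    stamp=$(date +%Y-%m-%d.%H%M%S)
    tar czf "$backups/pre-restore-site-$stamp.tar.gz" -C "$site_dir" .
    mysqldump "$db" | gzip > "$backups/pre-restore-db-$stamp.sql.gz"

    # 2. Drop the current site dir and re-create it from the chosen archive.
    rm -rf "$site_dir"
    mkdir -p "$site_dir"
    tar xzf "$backups/site-$day.tar.gz" -C "$site_dir"

    # 3. Re-inject the database dump of the same date.
    gunzip -c "$backups/db-$day.sql.gz" | mysql "$db"
}

# Usage: restore_site 2024-01-31
```

Depending on how MySQL is set up, `mysqldump`/`mysql` may also need `-u user -p` options, and the dump may need `--databases` so the restore recreates the schema.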