Linux - Server
This forum is for the discussion of Linux software used in a server-related context.
What's a good cron script for backing up and zipping a directory of files, or multiple directories with files, to a backup directory on my server, on a daily basis?
I found an easy-to-use MySQL backup script; now I need to back up my site directory, but not all the directories in it. So I need a way for the script to omit certain directories from the backup, i.e. directories that contain gigs worth of files.
This seems like it should be one of the most common cron jobs to set a server up with, but two pages deep into Google (and here) I have yet to find anything remotely resembling a solution.
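For what it's worth, here is a minimal sketch of the kind of cron script being asked for, built around tar's exclude patterns. Every path, directory name, and the `backup_dir` helper itself are made-up examples to adapt, not a known script:

```shell
#!/bin/bash
# A minimal sketch of a daily "tar up a directory, skipping some
# subdirectories" cron job. Every path below is a made-up example.

backup_dir() {
    # backup_dir <parent> <dir> <dest> [exclude-pattern ...]
    local parent=$1 dir=$2 dest=$3
    shift 3
    local stamp e excludes=()
    stamp=$(date +%Y-%m-%d)
    # Exclude patterns are matched against the paths tar records,
    # i.e. relative to <parent>, so pass e.g. "mysite/uploads".
    for e in "$@"; do excludes+=(--exclude="$e"); done
    mkdir -p "$dest"
    tar -czf "$dest/$dir-$stamp.tar.gz" "${excludes[@]}" -C "$parent" "$dir"
}

# Example call (hypothetical layout):
# backup_dir /var/www mysite /home/backups mysite/uploads mysite/logs
```

Save something like this under /usr/local/bin and add a crontab entry such as `0 3 * * * /usr/local/bin/backup-site.sh` to run it nightly at 03:00.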
Thanks, that works fine except the "exclude" option gives me an invalid "e" token error.
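That kind of error usually comes down to how the exclude option is spelled or where it's placed. With GNU tar the long `--exclude=PATTERN` form, placed before the directory arguments, is the safest bet; a demonstration with throwaway data (substitute your real paths) rather than a fix for any specific tar version:

```shell
# Throwaway demonstration data -- substitute your real paths.
tmp=$(mktemp -d)
mkdir -p "$tmp/mysite/uploads" "$tmp/mysite/html"
echo page > "$tmp/mysite/html/index.html"
echo blob > "$tmp/mysite/uploads/big.bin"

# GNU tar: --exclude=PATTERN before the directory argument; the
# pattern is matched against paths as tar records them.
tar -czf "$tmp/site.tar.gz" --exclude="mysite/uploads" -C "$tmp" mysite

# Alternatively, -X (--exclude-from) reads patterns from a file:
printf '%s\n' "mysite/uploads" > "$tmp/excludes"
tar -czf "$tmp/site2.tar.gz" -X "$tmp/excludes" -C "$tmp" mysite
```

Listing either archive with `tar -tzf` should show the html/ files but nothing under uploads/.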
In addition to this, I've got to make it create a new backup for each day (by date), create a permanent monthly backup in a separate directory, and delete all daily backups that are older than 30 days.
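A hedged sketch of that rotation — dated dailies, a permanent copy on the 1st of each month, and pruning after 30 days. The directory layout and the `rotate_backup` helper are assumptions, not an existing tool:

```shell
#!/bin/bash
# Sketch of the daily/monthly rotation described above; the
# directory names are hypothetical.

rotate_backup() {
    # rotate_backup <archive> <daily_dir> <monthly_dir>
    local archive=$1 daily=$2 monthly=$3
    mkdir -p "$daily" "$monthly"
    # A new dated copy every day...
    cp "$archive" "$daily/backup-$(date +%Y-%m-%d).tar.gz"
    # ...a permanent copy on the 1st of each month...
    if [ "$(date +%d)" = "01" ]; then
        cp "$archive" "$monthly/backup-$(date +%Y-%m).tar.gz"
    fi
    # ...and prune dailies older than 30 days.
    find "$daily" -name 'backup-*.tar.gz' -mtime +30 -delete
}
```

Note that `-delete` needs GNU find; on an older find you'd pipe the `-mtime +30` matches through `xargs rm -f` instead.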
I'm using flexbackup (http://www.edwinh.org/flexbackup/), which is a Perl script. It has a configuration file in /etc where you can define sets of directories to back up, and it can do full, differential and incremental backups. Since my machine isn't running as an always-on Linux server, I manage the backups with the anacron daemon.
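For reference, anacron jobs are defined in /etc/anacrontab with four fields: period in days, delay in minutes, a job identifier, and the command. The wrapper script named below is hypothetical — you'd point it at whatever invokes flexbackup (check `man flexbackup` for its exact flags):

```
# /etc/anacrontab format: period(days)  delay(min)  job-id  command
1    10    backup.daily    /usr/local/sbin/run-flexbackup.sh
7    20    backup.weekly   /usr/local/sbin/run-flexbackup.sh full
```

Unlike cron, anacron catches up on missed runs, so a daily job still fires even if the machine was off at the scheduled time.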
The last step is to create an automated method to restore the website. In case of another catastrophe, instead of manually dropping the current sitedir/db, untarring the chosen sitedir/db, injecting the db and copying the sitedir, I'd have a script that backs up the current site/db, drops the current sitedir/db, then untars and restores the sitedir/db backups of a given date, all with one command.
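Those steps could be sketched roughly like this. The archive naming scheme (site-YYYY-MM-DD.tar.gz, db-YYYY-MM-DD.sql.gz), the paths, and the `restore_site` helper are all assumptions — adapt them to whatever the backup script actually produces:

```shell
#!/bin/bash
# One-command restore sketch. Archive names, paths and the database
# name are hypothetical assumptions, not a known layout.

restore_site() {
    # restore_site <YYYY-MM-DD> <backup_dir> <www_root> <site_dir> [database]
    local day=$1 backups=$2 www=$3 site=$4 db=$5
    local stamp; stamp=$(date +%Y-%m-%d-%H%M%S)

    # 1. Safety copy of the current state before touching anything.
    tar -czf "$backups/pre-restore-$stamp.tar.gz" -C "$www" "$site"
    if [ -n "$db" ]; then
        mysqldump "$db" | gzip > "$backups/pre-restore-$stamp.sql.gz"
    fi

    # 2. Drop the live copy (":?" aborts if a variable is empty,
    #    so rm -rf can never run against "/").
    rm -rf "${www:?}/${site:?}"
    if [ -n "$db" ]; then
        mysql -e "DROP DATABASE IF EXISTS $db; CREATE DATABASE $db;"
    fi

    # 3. Restore the chosen day's archives.
    tar -xzf "$backups/site-$day.tar.gz" -C "$www"
    if [ -n "$db" ]; then
        gunzip -c "$backups/db-$day.sql.gz" | mysql "$db"
    fi
}

# Example (hypothetical):
# restore_site 2008-05-01 /home/backups /var/www mysite mydb
```

Taking the pre-restore safety copy first means a restore from the wrong date is itself recoverable.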