
lukester 03-28-2012 07:05 AM

cron job - run a backup then rsync results to mounted drive

Just looking for some advice. I have set up a cron job using crontab -e as root that looks as follows:

[root@backup win]# crontab -l
0 23 * * sun tar -zPvcf testbackupSUN.tar.gz /var/www/html
0 23 * * mon tar -zPvcf testbackupMON.tar.gz /var/www/html
0 23 * * tue tar -zPvcf testbackupTUE.tar.gz /var/www/html
0 13 * * wed tar -zPvcf testbackupWED.tar.gz /var/www/html
0 23 * * thu tar -zPvcf testbackupTHU.tar.gz /var/www/html
0 23 * * fri tar -zPvcf testbackupFRI.tar.gz /var/www/html
0 23 * * sat tar -zPvcf testbackupSAT.tar.gz /var/www/html

I am guessing this will work instead of using a path to a script file; the question is, where will the resulting testbackup<day>.tar.gz file end up?

I then need to rsync these .tar.gz files to a mounted drive. I can do this manually using:

rsync -avz /root/sitebackup.tar.gz /mnt/win
but is there any way to get this into the cron job? I was thinking of using a pipe, so for instance:

0 13 * * wed tar -zPvcf testbackupWED.tar.gz /var/www/html | rsync -avz /root/sitebackup.tar.gz /mnt/win
I would like to pipe it instead of setting it as another cron job, as I am not sure how long it will take to tar the /var/www/html folder, since each server has a different number of sites.

Should this work, or is there a better, easier, cleaner way to do this? (I am looking into doing an incremental backup using tar's update switch, but this will do for now - if it works.)

If I have to use a script file, is there any chance of a pointer as to how to write one?

Thanks for any help

colucix 03-28-2012 07:33 AM

Hi and welcome to LinuxQuestions!

A cron job runs as if it were launched from the user's home directory, so you will find the archives under /root, or whatever the home directory of the crontab owner is. To avoid unexpected or unknown behaviour, just specify the absolute path of the archive. Regarding the subsequent rsync command, the pipe is not what you want: tar produces no standard output here that needs to be passed as standard input to rsync. Instead, the logic should be: if the tar command ran successfully, sync the archive with the remote server. The shell has the && operator for exactly this.

0 13 * * wed tar blah blah blah && rsync blah blah
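As a quick illustration of what && does (the right-hand command runs only when the left-hand one exits with status 0), here is a small sketch you can try in any shell:

```shell
# && chains commands: the right side runs only if the left side succeeded
true  && echo "this line prints: true exited with status 0"
false && echo "this never prints" || true   # || true just keeps the sketch's exit status clean
```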
Finally, to learn shell scripting there are some good resources online. Hope this helps.

lukester 03-28-2012 07:52 AM

Thanks very much for the prompt reply

Kind regards

colucix 03-28-2012 08:05 AM

You're welcome! :)

lukester 03-28-2012 09:23 AM

My solution
Just for anyone else who finds this forum post, this is what I have ended up with in crontab -e:

0 23 * * sun tar -zPvcf /root/testbackupSUN.tar.gz /var/www/html && rsync -avz /root/testbackupSUN.tar.gz /mnt/win
0 23 * * mon tar -zPvcf /root/testbackupMON.tar.gz /var/www/html && rsync -avz /root/testbackupMON.tar.gz /mnt/win
0 23 * * tue tar -zPvcf /root/testbackupTUE.tar.gz /var/www/html && rsync -avz /root/testbackupTUE.tar.gz /mnt/win
0 23 * * wed tar -zPvcf /root/testbackupWED.tar.gz /var/www/html && rsync -avz /root/testbackupWED.tar.gz /mnt/win
0 23 * * thu tar -zPvcf /root/testbackupTHU.tar.gz /var/www/html && rsync -avz /root/testbackupTHU.tar.gz /mnt/win
0 23 * * fri tar -zPvcf /root/testbackupFRI.tar.gz /var/www/html && rsync -avz /root/testbackupFRI.tar.gz /mnt/win
0 23 * * sat tar -zPvcf /root/testbackupSAT.tar.gz /var/www/html && rsync -avz /root/testbackupSAT.tar.gz /mnt/win

This takes a backup of all files and folders under /var/www/html at 11pm every night, tars them, and copies the archive to a mounted drive - which is a separate remote Windows server. You have to add the mount to /etc/fstab for this to work. This gives me a full week's backup of everyone's website files.

My next task is to do exactly the same but omitting log files. If anyone knows how to do this by adding it into the crontab above, please feel free to comment - I'm off to Google it now though...

I have only run it once, so I will be interested to see whether rsync overwrites last week's file with the new one - I'm hoping it does!

Couldn't have done it without the help from colucix, so thanks again....


colucix 03-28-2012 10:12 AM

tar has the --exclude option to exclude files based on a pattern. Supposing the files you want to exclude have the .log extension, you basically need to add

--exclude='*.log'

to your tar commands. Cheers!
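To see the effect, here is a small self-contained sketch of --exclude in action; a temporary directory stands in for the thread's /var/www/html:

```shell
# Demonstration of --exclude: pack a directory but skip *.log files.
# $src is a temporary stand-in for /var/www/html from the thread.
src=$(mktemp -d)
echo page  > "$src/index.html"
echo noise > "$src/debug.log"

tar -zPcf /tmp/testbackup.tar.gz --exclude='*.log' "$src"
tar -tzf /tmp/testbackup.tar.gz    # index.html is listed, debug.log is not
```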

jlinkels 03-28-2012 10:21 AM

Don't make it more complicated in a cron statement. I recommend creating a script file instead and calling it from cron.

When calling the script file you can either pass the day as a command-line parameter, or use the date command, e.g. testbackup$(date +%a).tar.gz
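One cron-specific gotcha with that trick: in a crontab line a bare % is treated as a newline, so it has to be escaped as \%. A single date-based entry using the paths from this thread might then look like this sketch:

```shell
# Crontab sketch: one entry instead of seven; $(date +\%a) expands to Mon, Tue, ...
0 23 * * * tar -zPvcf /root/testbackup$(date +\%a).tar.gz /var/www/html
```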

Since you are using rsync anyway, I advise not using tar at all, unless you think you badly need the compression or want to minimize the file size during transfer. Just rsync your files to a different directory every day. You can use the --exclude option for excluding files.

Since you are writing to a mounted drive, you should test for the existence of the mount. If you don't, sooner or later it will happen that the drive is not mounted and you are writing to the mount point on the local drive. Below is the code showing how I solved this:

#!/bin/bash
mount_point=/mnt/ext_daily
echo_flag=        # set to "echo" for a dry run that only prints the rsync command

# Find if the device is mounted
df -h | grep $mount_point > /dev/null
if [ $? -eq 0 ]
then
        $echo_flag rsync -ua --exclude='/mnt' --exclude='/proc' --exclude='/sys' --exclude='/vmware' / /mnt/ext_daily > /var/log/rsync_daily
        echo "mount point $mount_point exists, rsync started"
else
        echo "Error: mount point $mount_point does not exist, rsync operation skipped"
fi

lukester 03-28-2012 10:29 AM

Thanks for taking the time to help me, jlinkels. I am however just starting to learn Linux - I'm a Windows man myself - and scripts are just a tad too difficult for me to get my head around yet!

I can see what you are saying about using a script file to check whether the mount point exists, but for now at least I am just going to live in hope that the automount works as it should. Probably not the best approach, but time is money and I don't have much of either!

I have to use tar as I have one backup server which will hold the website backups for 4000+ websites. I don't want millions of folders; by using tar I can just select the individual .tar.gz file for the server in question, copy it back to where it needs to be, untar it and replace the (required) files.

Thats the plan anyway!

chrism01 03-28-2012 06:53 PM

Scripting is basically putting the commands you normally use into a file and calling that, but it does make it more practical to, e.g., generate the day on the fly, thus only requiring one cron job instead of 7 :)
Also, doing everything on one line rapidly becomes difficult to achieve and a pain to read or debug.

