LinuxQuestions.org
Old 03-28-2012, 07:05 AM   #1
lukester
LQ Newbie
 
Registered: Mar 2012
Posts: 11

Rep: Reputation: Disabled
cron job - run a backup then rsync results to mounted drive


Hi

Just looking for some advice. I have set up cron jobs using crontab -e as root that look as follows:
Code:
[root@backup win]# crontab -l
0 23 * * sun tar -zPvcf testbackupSUN.tar.gz /var/www/html
0 23 * * mon tar -zPvcf testbackupMON.tar.gz /var/www/html
0 23 * * tue tar -zPvcf testbackupTUE.tar.gz /var/www/html
0 13 * * wed tar -zPvcf testbackupWED.tar.gz /var/www/html
0 23 * * thu tar -zPvcf testbackupTHU.tar.gz /var/www/html
0 23 * * fri tar -zPvcf testbackupFRI.tar.gz /var/www/html
0 23 * * sat tar -zPvcf testbackupSAT.tar.gz /var/www/html
I am guessing this will work instead of using a path to a script file; the question is, where will the resulting testbackup<day>.tar.gz files end up?

I then need to rsync these .tar.gz files to a mounted drive. I can do this manually using
Code:
rsync -avz /root/sitebackup.tar.gz /mnt/win
but is there any way to get this into the cron job? I was thinking of using a pipe, so for instance
Code:
0 13 * * wed tar -zPvcf testbackupWED.tar.gz /var/www/html | rsync -avz /root/sitebackup.tar.gz /mnt/win
I would like to pipe it rather than set it up as a separate cron job, as I am not sure how long it will take to tar the /var/www/html folder; each server has a different number of sites.

Should this work, or is there a better, easier, cleaner way to do this? (I am looking into doing an incremental backup using tar's update switch, but this will do for now - if it works.)

If I have to use a script file, is there any chance of a pointer as to how to write one?

Thanks for any help
Luke
 
Old 03-28-2012, 07:33 AM   #2
colucix
LQ Guru
 
Registered: Sep 2003
Location: Bologna
Distribution: CentOS 6.5 OpenSuSE 12.3
Posts: 10,509

Rep: Reputation: 1983
Hi and welcome to LinuxQuestions!

A cron job runs with the user's home directory as its working directory, so you will find the archives under
Code:
/home/luke
or whatever the home directory of the crontab owner happens to be. To avoid unexpected or unknown behaviour, just specify the absolute path of the archive. Regarding the subsequent rsync command: a pipe is not the right tool here, because tar produces no standard output that needs to be fed to rsync as standard input (and the two commands in a pipeline start at the same time, so rsync would not wait for tar to finish). Instead, the logic should be: if the tar command ran successfully, sync the archive to the remote server. The shell has the && operator for exactly this.
Code:
0 13 * * wed tar blah blah blah && rsync blah blah
Finally, to learn shell scripting there are some good resources online. Hope this helps.
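For instance, a filled-in version of that pattern, reusing the paths from the first post purely as an illustration, would be:
Code:
0 23 * * sun tar -zPvcf /root/testbackupSUN.tar.gz /var/www/html && rsync -avz /root/testbackupSUN.tar.gz /mnt/win
Here rsync only runs when tar exits with status 0.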

Last edited by colucix; 03-28-2012 at 07:35 AM.
 
1 member found this post helpful.
Old 03-28-2012, 07:52 AM   #3
lukester
LQ Newbie
 
Registered: Mar 2012
Posts: 11

Original Poster
Rep: Reputation: Disabled
Thanks very much for the prompt reply

Kind regards
Luke
 
Old 03-28-2012, 08:05 AM   #4
colucix
LQ Guru
 
Registered: Sep 2003
Location: Bologna
Distribution: CentOS 6.5 OpenSuSE 12.3
Posts: 10,509

Rep: Reputation: 1983
You're welcome!
 
Old 03-28-2012, 09:23 AM   #5
lukester
LQ Newbie
 
Registered: Mar 2012
Posts: 11

Original Poster
Rep: Reputation: Disabled
My solution

Just for anyone else who finds this thread, this is what I have ended up with in crontab -e:
Code:
0 23 * * sun tar -zPvcf /root/testbackupSUN.tar.gz /var/www/html && rsync -avz /root/testbackupSUN.tar.gz /mnt/win
0 23 * * mon tar -zPvcf /root/testbackupMON.tar.gz /var/www/html && rsync -avz /root/testbackupMON.tar.gz /mnt/win
0 23 * * tue tar -zPvcf /root/testbackupTUE.tar.gz /var/www/html && rsync -avz /root/testbackupTUE.tar.gz /mnt/win
0 23 * * wed tar -zPvcf /root/testbackupWED.tar.gz /var/www/html && rsync -avz /root/testbackupWED.tar.gz /mnt/win
0 23 * * thu tar -zPvcf /root/testbackupTHU.tar.gz /var/www/html && rsync -avz /root/testbackupTHU.tar.gz /mnt/win
0 23 * * fri tar -zPvcf /root/testbackupFRI.tar.gz /var/www/html && rsync -avz /root/testbackupFRI.tar.gz /mnt/win
0 23 * * sat tar -zPvcf /root/testbackupSAT.tar.gz /var/www/html && rsync -avz /root/testbackupSAT.tar.gz /mnt/win
This takes a backup of all files and folders in /var/www/html at 11pm every night, tars them, and copies the archive to a mounted drive - which is a separate remote Windows server. You have to add the mount to /etc/fstab for this to work. This gives me a full week's worth of backups of everyone's website files.
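For reference, the /etc/fstab line for a mount like that could look something like the one below; the server name, share name and credentials file are placeholders rather than the real values used here:
Code:
//winserver/backups  /mnt/win  cifs  credentials=/root/.smbcredentials,_netdev  0  0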

My next task is to do exactly the same but omit log files. If anyone knows how to do this by adding to the crontab above, please feel free to comment; I'm off to Google it now though...

I have only run it once, so I will be interested to see whether rsync overwrites last week's file with the new one - I'm hoping it does!

Couldn't have done it without the help from colucix so thanks again....

Regards
Luke
 
Old 03-28-2012, 10:12 AM   #6
colucix
LQ Guru
 
Registered: Sep 2003
Location: Bologna
Distribution: CentOS 6.5 OpenSuSE 12.3
Posts: 10,509

Rep: Reputation: 1983
tar has the --exclude option to exclude files based on a pattern. Supposing the files you want to exclude have the .log extension, you basically need to add
Code:
--exclude='*.log'
to your tar commands.
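For example, the Monday line from the crontab above would then become something like this (assuming *.log is the only pattern you need to skip):
Code:
0 23 * * mon tar -zPvcf /root/testbackupMON.tar.gz --exclude='*.log' /var/www/html && rsync -avz /root/testbackupMON.tar.gz /mnt/win
Cheers!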
 
Old 03-28-2012, 10:21 AM   #7
jlinkels
LQ Guru
 
Registered: Oct 2003
Location: Bonaire, Leeuwarden
Distribution: Debian /Jessie/Stretch/Sid, Linux Mint DE
Posts: 5,195

Rep: Reputation: 1043
Don't make the cron statement more complicated than it needs to be. I recommend creating a script file instead and calling it from cron.

When calling the script you can either pass the day as a command-line parameter, or use the date command inside the script, e.g. testbackup$(date +%a).tar.gz.

Since you are using rsync anyway, I advise not using tar at all, unless you think you really need the compression or you want to minimize the file size during transfer. Just rsync your files to a different directory every day. rsync has an --exclude option for excluding files.
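If you go that route, the daily job could be as simple as the single rsync below (the per-day destination directory is just one possible layout):
Code:
rsync -a --exclude='*.log' /var/www/html/ /mnt/win/backup-$(date +%a)/
(If a line like this ever goes directly into a crontab rather than a script, remember that cron treats % specially and it has to be escaped as \%.)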

Since you are writing to a mounted drive, you should test whether the drive is actually mounted. If you don't, sooner or later the drive will not be mounted and you will end up writing to the mount point on the local disk. Below is the code showing how I solved this:

Code:
#!/bin/bash

mount_point='/mnt/ext_daily'
echo_flag=''            # set to 'echo' to do a dry run that only prints the rsync command

# Check whether the backup device is actually mounted at the mount point

df -h | grep "$mount_point" > /dev/null
if [ $? -eq 0 ]
then
        $echo_flag rsync -ua --exclude='/mnt' --exclude='/proc' --exclude='/sys' --exclude='/vmware' / /mnt/ext_daily > /var/log/rsync_daily
        echo "mount point $mount_point exists, rsync started"
else
        echo "Error: mount point $mount_point does not exist, rsync operation skipped"
fi
jlinkels
 
Old 03-28-2012, 10:29 AM   #8
lukester
LQ Newbie
 
Registered: Mar 2012
Posts: 11

Original Poster
Rep: Reputation: Disabled
Thanks for taking the time to help me, jlinkels. I am, however, just starting to learn Linux - I'm a Windows man myself - and scripts are just a tad too difficult for me to get my head around just yet!

I can see what you are saying about using a script file to check whether the mount point exists, but for now at least I am just going to live in hope that the automount works as it should. Probably not the best approach, but time is money and I don't have much of either!

I have to use tar as I have one backup server which will hold all the website backups for 4000+ websites, and I don't want millions of folders. By using tar I can just select the individual .tar.gz file for the server in question, copy it back to where it needs to be, untar it and replace the (required) files.

That's the plan anyway!
 
Old 03-28-2012, 06:53 PM   #9
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Rocky 9.2
Posts: 18,359

Rep: Reputation: 2751
Scripting is basically putting the commands you normally use into a file and calling that, but it does make it more practical to, e.g., generate the day on the fly, thus requiring only one cron job instead of seven.
Also, doing everything on one line rapidly becomes difficult to maintain and a pain to read or debug.
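A minimal sketch of such a script, reusing the paths from earlier in the thread (they are examples only) and folding in the mount check and log-file exclusion discussed above, might look like this; it would be called from a single crontab entry such as 0 23 * * * /root/bin/sitebackup.sh (the script name and location are whatever you choose):
Code:
#!/bin/bash
# Nightly backup sketch: tar up /var/www/html, then copy the archive to the
# mounted Windows share. Paths follow the earlier posts and are examples only.

day=$(date +%a)                          # Sun, Mon, Tue, ... (locale dependent)
archive="/root/testbackup${day}.tar.gz"

# Create the archive, skipping log files
tar -zPcf "$archive" --exclude='*.log' /var/www/html || exit 1

# Only copy the archive if the share is really mounted
if mountpoint -q /mnt/win; then
    rsync -avz "$archive" /mnt/win
else
    echo "Error: /mnt/win is not mounted, rsync skipped" >&2
    exit 1
fi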

These may help
http://rute.2038bug.com/index.html.gz
http://tldp.org/LDP/Bash-Beginners-G...tml/index.html
http://www.tldp.org/LDP/abs/html/
 
  

