[SOLVED] cron job - run a backup then rsync results to mounted drive
I would like to pipe it instead of setting it up as another cron job, as I am not sure how long it will take to tar the /var/www/html folder; each server has a different number of sites.
Should this work, or is there a better, easier, cleaner way to do this? (I am looking into doing an incremental backup using tar's update switch, but this will do for now, if it works.)
If I have to use a script file, is there any chance of a pointer on how to write one?
A cron job runs as if it were launched from the user's home directory, so you will find the archives under
Code:
/home/luke
or whatever the home directory of the crontab owner is. To avoid unexpected or unknown behaviour, just specify the absolute path of the archive. Regarding the subsequent rsync command, the pipe would run both commands, but tar produces no standard output here that needs to be passed as standard input to rsync. Instead, the logic should be: if the tar command ran successfully, sync the archive to the remote server. The shell has the && operator for exactly this.
Code:
0 13 * * wed tar blah blah blah && rsync blah blah
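To see what && does outside of cron, here is a minimal sketch; the directory and file names are made up for the demonstration, not taken from the thread:

```shell
#!/bin/sh
# Sketch of the && behaviour used in the crontab line above:
# the command after && runs only if the command before it exits with 0.

workdir=$(mktemp -d)
mkdir "$workdir/site"
echo '<html></html>' > "$workdir/site/index.html"

# First case: tar succeeds, so the follow-up assignment runs.
tar -czf "$workdir/site.tar.gz" -C "$workdir" site && first="ran"

# Second case: tar fails on a missing directory, so the follow-up is skipped.
tar -czf "$workdir/bad.tar.gz" -C "$workdir" no_such_dir 2>/dev/null && second="ran"

echo "first=${first:-skipped} second=${second:-skipped}"
rm -rf "$workdir"
```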
Finally, to learn shell scripting there are some good resources online:
This takes a backup of all files and folders under /var/www/html every night at 11pm, tars them, and copies them to a mounted drive, which is a separate remote Windows server. You have to add the mount to /etc/fstab for this to work. This gives me a full week's backup of everyone's website files.
My next task is to do exactly the same but omit log files. If anyone knows how to do this by adding it to the crontab above, please feel free to comment; I'm off to Google it now though...
I have only run it once, so I will be interested to see whether rsync overwrites last week's file with the new one; I'm hoping it does!
Couldn't have done it without the help from colucix, so thanks again...
tar has the --exclude option to exclude files matching a pattern. Supposing the files you want to exclude have the .log extension, you basically need to add
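As a sketch of how --exclude behaves (the directory layout here is an invented example):

```shell
#!/bin/sh
# Sketch: excluding *.log files from a tar archive with --exclude.
workdir=$(mktemp -d)
mkdir "$workdir/html"
echo 'page'  > "$workdir/html/index.html"
echo 'noise' > "$workdir/html/access.log"

# --exclude takes a glob pattern; anything matching it is left out.
tar -czf "$workdir/backup.tar.gz" --exclude='*.log' -C "$workdir" html

# List the archive contents to confirm the .log file was skipped.
contents=$(tar -tzf "$workdir/backup.tar.gz")
echo "$contents"
rm -rf "$workdir"
```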
Don't make the cron statement more complicated. I recommend creating a script file instead and calling it from cron.
When calling the script you can either pass the day as a command-line parameter, or use the date command, e.g. testbackup$(date +%a).tar.gz
Since you are using rsync anyway, I advise not using tar at all, unless you really need the compression or want to minimize the file size during transfer. Just rsync your files to a different directory every day. You can use the --exclude option for excluding files.
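A minimal sketch of the $(date +%a) trick for rotating the target name through the week; the base name testbackup is the example from above, and LC_ALL=C pins the English weekday abbreviations:

```shell
#!/bin/sh
# Sketch: building a per-weekday archive name with date(1), so one cron
# entry covers all seven days instead of needing seven separate jobs.
day=$(LC_ALL=C date +%a)             # Mon, Tue, Wed, ...
archive="testbackup${day}.tar.gz"    # name rotates through the week
echo "$archive"
```

After seven days the names wrap around, so each new run overwrites the archive from the same weekday last week.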
Since you are writing to a mounted drive, you should test that it is actually mounted. If you don't, sooner or later the drive will not be mounted and you will be writing to the mount point on the local drive. Below is the code showing how I solved this:
Code:
#!/bin/bash
mount_point='/mnt/ext_daily'
echo_flag=''   # set to 'echo' for a dry run that only prints the rsync command

# Check whether the backup drive is actually mounted
if df -h | grep -q "$mount_point"
then
    echo "mount point $mount_point exists, rsync started"
    $echo_flag rsync -ua --exclude='/mnt' --exclude='/proc' --exclude='/sys' --exclude='/vmware' / /mnt/ext_daily > /var/log/rsync_daily
else
    echo "Error: mount point $mount_point does not exist, rsync operation skipped"
fi
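As a side note, an alternative to grepping the df output is the mountpoint utility (part of util-linux on most distros), which avoids false matches on similar path names. A minimal sketch, with the check wrapped in a hypothetical helper function:

```shell
#!/bin/sh
# Sketch: checking a mount with mountpoint(1) instead of df | grep.
# check_mounted is an illustrative helper, not from the thread.
check_mounted() {
    if mountpoint -q "$1"; then
        echo "mounted"
    else
        echo "not mounted"
    fi
}

# / is always a mount point; a freshly created temp dir normally is not.
result_root=$(check_mounted /)
result_tmp=$(check_mounted "$(mktemp -d)")
echo "root: $result_root, tmp: $result_tmp"
```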
Thanks for taking the time to help me, jinkels. I am, however, just starting to learn Linux (I'm a Windows man myself) and scripts are just a tad too difficult for me to get my head around yet!
I can see what you are saying about using a script file to check whether the mount point exists, but for now at least I am just going to live in hope that the automount works as it should. Probably not the best approach, but time is money and I don't have much of either!
I have to use tar, as I have one backup server which will hold the website backups for 4000+ websites. I don't want millions of folders; by using tar I can just select the individual .tar.gz file for the server in question, copy it back to where it needs to be, untar it and replace the (required) files.
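That restore step can also avoid unpacking the whole archive: tar accepts member names, so you can extract just one site's directory. A sketch, with invented site names and paths:

```shell
#!/bin/sh
# Sketch: restoring a single site's files from a weekly archive without
# extracting everything. All names here are examples.
workdir=$(mktemp -d)
mkdir -p "$workdir/html/site1" "$workdir/html/site2"
echo 'a' > "$workdir/html/site1/index.html"
echo 'b' > "$workdir/html/site2/index.html"
tar -czf "$workdir/backup.tar.gz" -C "$workdir" html

# Extract only site1 from the archive into a restore directory.
restore="$workdir/restore"
mkdir "$restore"
tar -xzf "$workdir/backup.tar.gz" -C "$restore" html/site1

listing=$(ls "$restore/html")
echo "$listing"
rm -rf "$workdir"
```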
Scripting is basically putting the commands you normally use into a file and calling that, but it does make it more practical to, for example, generate the day on the fly, thus requiring only one cron job instead of seven.
Also, doing everything on one line rapidly becomes difficult to achieve and a pain to read or debug.