[SOLVED] tar.gz backup script for mysql directory times out if run from cron
Hello everyone,
I'm having trouble with a MySQL directory backup script timing out when started by cron. I can run the script from the command line and it completes successfully, but when cron starts it, it ends long before the backup is complete. I have several servers running the same script successfully from cron. The directory I'm attempting to back up is 103GB; the successful tar.gz files have been around 15GB. The tar.gz files I get from cron are ~600MB and are corrupt. When I ran the script from the command line, I watched resource use: fairly steady ~90% CPU, ~1.1GB physical memory, and ~14GB virtual memory. This is a RHEL 5.3 server running on VMware.
Here is the first part of the script:
___________________________________________________________
Code:
#!/bin/sh
cd /var/lib/mysql_01/dirname
tar -czvf /backups/dirname.servername.tar.gz .
cd /backups
Your cron job should complete successfully, although there might be other issues, such as:
- is it set to run too often (say every 2 hours), so that a second run starts before the first one has finished?
You should use [CODE] tags around the scripts to preserve formatting, like:
Code:
#!/bin/sh
cd /var/lib/mysql_01/dirname
tar -czvf /backups/dirname.servername.tar.gz .
cd /backups
where I would change the line:
Code:
tar -czvf /backups/dirname.servername.tar.gz .
to
tar -czvf /backups/dirname.servername.tar.gz *
I don't know why, but I'm used to doing it that way.
The second issue is the FTP transfer:
- what command do you execute to transfer the file?
I use "lftp" to make SURE the file is transferred correctly.
If you post the complete script you're using, someone may be able to offer more suggestions.
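To guard against the overlapping-runs scenario mentioned above, one common approach is flock(1). This is only a sketch: the lock-file path and source/destination paths below are stand-ins (defaulting to /tmp for demonstration), not the OP's real locations.

```shell
#!/bin/sh
# Sketch of an overlap guard using flock(1).
# SRC/DEST/LOCK are placeholders; substitute the real backup paths.
SRC=${SRC:-/tmp/flockdemo_src}
DEST=${DEST:-/tmp/flockdemo.tar.gz}
LOCK=${LOCK:-/tmp/flockdemo.lock}
mkdir -p "$SRC"

(
    # -n: fail immediately if another run still holds the lock
    flock -n 9 || { echo "previous backup still running, skipping" >&2; exit 1; }
    cd "$SRC" || exit 1
    tar -czf "$DEST" .
) 9>"$LOCK"
```

If a second copy of the script starts while the first still holds file descriptor 9 on the lock file, flock -n fails and the new run exits instead of producing a second, competing tar process.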
I'm actually running this script once daily. I know there are a few places I could make it more efficient (such as putting the FTP credentials before the loop), but this was developed in a hurry with a lot of Google assistance (I'm no hardcore script developer). I don't have any trouble with the FTP section, but I'm happy to share the code if it helps solve my problem or helps bail anyone else out. I'm hesitant to change the backup from "." to "*" only because I know this works from command-line testing and on the other servers I have it running on. I did re-edit my original post to add [CODE] and [/CODE] tags, thanks for that (I'm a newbie here and appreciate the feedback).
I also just changed the crontab entry to send output and errors to a log. If this helps resolve the issue, I’ll be sure to update the thread with the answer.
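For anyone following along, sending cron output to a log only takes a redirection on the crontab line itself. The schedule and script path shown here are assumptions, not the actual entry:

```shell
# m h dom mon dow  command
# Run daily at 02:00; append stdout and stderr to a log file.
0 2 * * * /root/backup_practices.sh >> /var/log/backup_practices.log 2>&1
```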
The full (now generic) script is below:
Code:
#!/bin/sh
#################################################################
#
# Script to tar practice DBs & send them to the FTP server
#
#################################################################
#################################################################
# Populate PRACTICE with practice codes and start loop
#################################################################
for PRACTICE in {practice name was here}
do
#################################################################
# Backup Section
#################################################################
# Backup the practice databases
#################################################################
cd /var/lib/mysql_01/$PRACTICE
tar -czvf /backups/mumoves/$PRACTICE.{server name was here}.tar.gz .
cd /backups/mumoves
#################################################################
# FTP Section
#################################################################
# Assign variable values
#################################################################
SERVER={ip address was here}
USER={user name was here}
PASSW={password was here}
FILETYPE=binary
#################################################################
# Log into FTP server
#################################################################
ftp -i -v -n $SERVER <<END_OF_SESSION
user $USER $PASSW
#################################################################
# Binary mode transfer
#################################################################
$FILETYPE
#################################################################
# Send backup files and exit
#################################################################
mput $PRACTICE.{server name was here}.tar.gz
bye
END_OF_SESSION
#################################################################
# Cleanup Section
#################################################################
# Cleanup backup files after moving to FTP server
#################################################################
# rm /backups/mumoves/$PRACTICE.{server name was here}.tar.gz
#################################################################
# End loop
#################################################################
done
#################################################################
# Practices to be added later
#################################################################
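Since the FTP section above uses a heredoc with no error checking, here is a sketch of the same upload step with lftp, whose exit status reflects transfer success. The host, credentials, and practice name below are placeholders, not values from the original script.

```shell
#!/bin/sh
# Sketch: upload one backup with lftp instead of the ftp heredoc.
# All values below are placeholders for illustration.
SERVER=ftp.example.com
USER=backupuser
PASSW=secret
PRACTICE=practice01
FILE="/backups/mumoves/$PRACTICE.servername.tar.gz"

if command -v lftp >/dev/null 2>&1; then
    # Short timeout/retry settings so a dead server fails fast.
    lftp -u "$USER,$PASSW" "$SERVER" -e "
        set net:timeout 5
        set net:max-retries 1
        put $FILE
        bye
    " || echo "lftp transfer failed for $FILE" >&2
else
    echo "lftp not installed; skipping transfer" >&2
fi
```

Unlike the plain ftp client, a failed put here makes lftp exit non-zero, so the script can log or retry instead of silently continuing.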
tar -czvf /backups/dirname.servername.tar.gz * does not include dot files (the shell glob * skips names beginning with a dot).
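A quick throwaway demonstration of the difference (the /tmp paths are arbitrary):

```shell
# Show that the glob '*' skips dot files while '.' archives them.
mkdir -p /tmp/globdemo
cd /tmp/globdemo
touch visible .hidden
tar -czf /tmp/star.tar.gz *      # '*' expands to 'visible' only
tar -czf /tmp/dot.tar.gz .       # '.' picks up '.hidden' as well
tar -tzf /tmp/star.tar.gz        # 'visible' only; no .hidden
tar -tzf /tmp/dot.tar.gz         # includes ./.hidden
```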
Your script really doesn't do anything weird, but this is how I would do it (my backup script with tar):
- check which tar you're using:
Code:
tar --version
tar (GNU tar) 1.22
Copyright (C) 2009 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>.
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
Written by John Gilmore and Jay Fenlason.
That's the GNU tar I have.
- get rid of (delete) the loop and the excessive comment hashes (###):
Code:
for PRACTICE in {practice name was here}
do
...
#################################################################
# End loop
#################################################################
done
and replace the long ################################################## separators
with # ---------------
- rewrite the tar code:
Code:
# Backup the practice databases
#################################################################
cd /var/lib/mysql_01/$PRACTICE
tar -czvf /backups/mumoves/$PRACTICE.{server name was here}.tar.gz .
cd /backups/mumoves
with:
Code:
DATABASE_NAME="database name to be archived"
PRACTICE=/var/lib/mysql_01/$DATABASE_NAME/
cd /backups/mumoves/
tar -pczvf /backups/mumoves/$DATABASE_NAME.{server name was here}.tar.gz "$PRACTICE"
# --- tar -p : preserve file permissions
# --- no recursion flag needed: tar descends into directories by default
I don't have any problems archiving this way (the files total around 50GB).
Then I would run the script manually first to see if it works, and then add it to cron to test.
I've never seen cron end a job before it finished; once my FTP server was down for a few days and all the jobs waited three days until the FTP transfer of all the backups completed.
Good luck and good night for now (since it's afternoon where you are and almost midnight where I am).
I expected adding logging to the crontab entry would help me troubleshoot the issue. I didn't expect it to actually resolve the issue, but that appears to be the case. I logged in last night expecting to find a log, a failed tar.gz, and the need to run it manually (which always worked). I was surprised to find a 7+GB tar.gz file (it typically died around 600MB) that was still growing. There was also nothing in the log to indicate any type of problem (too bad; I would have liked to know what had been happening and have a solid feeling for what was wrong and how it was resolved). Regardless, the only change made was to the crontab entry.
With that line all the output was going to the cron log, and since you have a gazillion files, the log probably hit a limit and the script died with it (log files are probably size-limited on your server too, like the other default CentOS settings).
It's always nice to see that something is solved :-)
Actually, the cron log only showed the start times of cron entries and wasn't that big. Whatever the reason, I'm happy it's resolved and thankful for all the helpful suggestions.