Old 03-16-2012, 02:44 AM   #1
ASTRAPI
Member
 
Registered: Feb 2007
Posts: 210

Rep: Reputation: 16
Cronjob backup commands on CentOS 6.2 (64-bit)


Hello

On my dedicated web server (CentOS, 64-bit) I was trying to add two cron jobs to back up my files and database into my /home/ folder using this:


Code:
* 5 * * * cd /home; mysqldump -u root -h localhost -pmypasshere dbname | gzip > mydatabase-$(date +%d-%m-%Y).sql.gzip >/dev/null 2>&1
Code:
20 5 * * * cd /home; tar -zcf backup_files.tgz /home/nginx/domains/mydomain.com/public/ >/dev/null 2>&1
But it doesn't work and is not creating any files in /home/.

When I run the commands manually, everything works great!

Any ideas?

I also tested cron to verify that it is working and can write files to /home using this:

Code:
* * * * * /bin/echo "foobar" >> /home/testfile.txt
And after a minute a new file named testfile.txt appeared in the /home/ folder.

So cron jobs are working...

Thank you

Last edited by ASTRAPI; 03-16-2012 at 02:46 AM.
 
Old 03-16-2012, 06:44 AM   #2
colucix
LQ Guru
 
Registered: Sep 2003
Location: Bologna
Distribution: CentOS 6.5 OpenSuSE 12.3
Posts: 10,509

Rep: Reputation: 1976
The problem in the first job could be the presence of the % signs in the date format. The % sign has a special meaning in crontab and you need to escape it to guarantee the expected behaviour. From man 5 crontab:
Quote:
Percent-signs (%) in the command, unless escaped with backslash (\),
will be changed into newline characters, and all data after the first %
will be sent to the command as standard input.
Not sure about the second job, but you can start by removing the redirection to /dev/null in order to see if the job sends any error messages to your system mailbox (this is the default behaviour). As an aside: are you aware that in the first job the time specification
Code:
* 5 * * *
means every minute from 5:00 to 5:59? Is this what you want?
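As a quick illustration (a throwaway test entry; /tmp/date_test.txt is just an example file), compare these two lines:
Code:
# unescaped: cron cuts the line at the first %, so only "date +" runs and everything after the %, including the redirection, is fed to it as stdin
* * * * * date +%d-%m-%Y >> /tmp/date_test.txt
# escaped: the whole command reaches the shell and the dated output lands in the file
* * * * * date +\%d-\%m-\%Y >> /tmp/date_test.txt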
 
Old 03-16-2012, 10:38 PM   #3
ASTRAPI
Member
 
Registered: Feb 2007
Posts: 210

Original Poster
Rep: Reputation: 16
Hello

Thanks for your help

How can I adjust the first command so it handles the % signs correctly?

I just want to run the first command at 5 AM and the second one 20 minutes later...

Thank you
 
Old 03-16-2012, 11:43 PM   #4
suicidaleggroll
LQ Guru
 
Registered: Nov 2010
Location: Colorado
Distribution: OpenSUSE, CentOS
Posts: 5,258

Rep: Reputation: 1947
Quote:
Originally Posted by ASTRAPI
I just want to run the first command at 5 AM and the second one 20 minutes later...
Change your first command to
Code:
0 5 * * *
Quote:
Originally Posted by ASTRAPI
How can I adjust the first command so it handles the % signs correctly?
You should be able to escape each % with a backslash, i.e.

Code:
* 5 * * * cd /home; mysqldump -u root -h localhost -pmypasshere dbname | gzip > mydatabase-$(date +\%d-\%m-\%Y).sql.gzip >/dev/null 2>&1
Keep in mind that this is untested, though.
 
Old 03-17-2012, 12:04 AM   #5
ASTRAPI
Member
 
Registered: Feb 2007
Posts: 210

Original Poster
Rep: Reputation: 16
OK, so I should try this:

Code:
0 5 * * * cd /home; mysqldump -u root -h localhost -pmypasshere dbname | gzip > mydatabase-$(date +\%d-\%m-\%Y).sql.gzip >/dev/null 2>&1
And how can I adjust this one to run 20 minutes after 5 AM?

Code:
20 5 * * * cd /home; tar -zcf backup_files.tgz /home/nginx/domains/mydomain.com/public/ >/dev/null 2>&1
Is it also possible to add the date +\%d-\%m-\%Y there as well?

Thank you
 
Old 03-17-2012, 02:19 AM   #6
suicidaleggroll
LQ Guru
 
Registered: Nov 2010
Location: Colorado
Distribution: OpenSUSE, CentOS
Posts: 5,258

Rep: Reputation: 1947
Quote:
Originally Posted by ASTRAPI
OK, so I should try this:

Code:
0 5 * * * cd /home; mysqldump -u root -h localhost -pmypasshere dbname | gzip > mydatabase-$(date +\%d-\%m-\%Y).sql.gzip >/dev/null 2>&1
And how can I adjust this one to run 20 minutes after 5 AM?

Code:
20 5 * * * cd /home; tar -zcf backup_files.tgz /home/nginx/domains/mydomain.com/public/ >/dev/null 2>&1
It already will: 20 5 * * * means 5:20 AM.

Quote:
Originally Posted by ASTRAPI
Is it also possible to add the date +\%d-\%m-\%Y there as well?
I'm not sure what you mean, like:
Code:
20 5 * * * cd /home; tar -zcf backup_files_$(date +\%d-\%m-\%Y).tgz /home/nginx/domains/mydomain.com/public/ >/dev/null 2>&1
?
 
Old 03-18-2012, 09:48 AM   #7
ASTRAPI
Member
 
Registered: Feb 2007
Posts: 210

Original Poster
Rep: Reputation: 16
OK, I just tried this:

Code:
0 5 * * * cd /home; mysqldump -u root -h localhost -pmypasshere dbname | gzip > mydatabase-$(date +\%d-\%m-\%Y).sql.gzip >/dev/null 2>&1
and this:

Code:
20 5 * * * cd /home; tar -zcf backup_files_$(date +\%d-\%m-\%Y).tgz /home/nginx/domains/mydomain.com/public/ >/dev/null 2>&1
And I got the two files there, but both are 0 bytes in size.

Any other ideas?
 
Old 03-18-2012, 09:56 AM   #8
colucix
LQ Guru
 
Registered: Sep 2003
Location: Bologna
Distribution: CentOS 6.5 OpenSuSE 12.3
Posts: 10,509

Rep: Reputation: 1976
While testing I'd remove the redirection of both standard output and standard error to the Linux black hole (/dev/null). Redirect them to a file instead, or don't redirect them at all and check your mailbox later (that is where the cron daemon sends stdout and stderr by default). You may get a clue that helps you find the answer to the problem.
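For example, while testing, both jobs could keep their output in a log file instead (the /home/backup_cron.log name below is only a suggestion). As an aside, in the first job the trailing >/dev/null redirects gzip's standard output a second time, overriding the > mydatabase-... redirection, which by itself would leave an empty .sql.gzip file:
Code:
0 5 * * * cd /home; mysqldump -u root -h localhost -pmypasshere dbname 2>> /home/backup_cron.log | gzip > mydatabase-$(date +\%d-\%m-\%Y).sql.gzip
20 5 * * * cd /home; tar -zcf backup_files_$(date +\%d-\%m-\%Y).tgz /home/nginx/domains/mydomain.com/public/ >> /home/backup_cron.log 2>&1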

Regarding the first job: is the mysqldump command in the crontab PATH (/bin:/usr/bin by default)? Keep in mind that cron runs with a very limited environment, and it's always a good habit to specify the absolute path of the commands.

Last edited by colucix; 03-18-2012 at 09:58 AM.
 
Old 03-18-2012, 11:23 AM   #9
ASTRAPI
Member
 
Registered: Feb 2007
Posts: 210

Original Poster
Rep: Reputation: 16
When I run them manually both commands work great, so I think everything is in the correct place.

Do you mean to run them like:

Code:
0 5 * * * cd /home; mysqldump -u root -h localhost -pmypasshere dbname | gzip > mydatabase-$(date +\%d-\%m-\%Y).sql.gzip
and
Code:
20 5 * * * cd /home; tar -zcf backup_files_$(date +\%d-\%m-\%Y).tgz /home/nginx/domains/mydomain.com/public/
?
 
Old 03-18-2012, 11:31 AM   #10
repo
LQ 5k Club
 
Registered: May 2001
Location: Belgium
Distribution: Arch
Posts: 8,527

Rep: Reputation: 898
Cron uses a limited PATH.
Use the full path for all commands and files.
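As an alternative sketch (crontab also accepts environment settings at the top of the file; the PATH value below is only an example), you can set the search path once instead of spelling out every command:
Code:
PATH=/usr/local/bin:/usr/bin:/bin
20 5 * * * cd /home; tar -zcf backup_files_$(date +\%d-\%m-\%Y).tgz /home/nginx/domains/mydomain.com/public/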
And as colucix said:
Quote:
remove the redirection of both standard output and standard error to the Linux black hole (/dev/null).
Kind regards
 
Old 03-18-2012, 12:00 PM   #11
ASTRAPI
Member
 
Registered: Feb 2007
Posts: 210

Original Poster
Rep: Reputation: 16
Can you please adjust my commands?

I am already using full paths...
 
Old 03-18-2012, 12:05 PM   #12
repo
LQ 5k Club
 
Registered: May 2001
Location: Belgium
Distribution: Arch
Posts: 8,527

Rep: Reputation: 898
Use the
Code:
whereis
command to find the path
Code:
whereis mysqldump
etc

Kind regards
 
Old 03-18-2012, 12:19 PM   #13
ASTRAPI
Member
 
Registered: Feb 2007
Posts: 210

Original Poster
Rep: Reputation: 16
This is what I got:

Code:
# whereis mysqldump
mysqldump: /usr/bin/mysqldump /usr/share/man/man1/mysqldump.1.gz
Now how can I use this?
 
Old 03-18-2012, 12:23 PM   #14
repo
LQ 5k Club
 
Registered: May 2001
Location: Belgium
Distribution: Arch
Posts: 8,527

Rep: Reputation: 898
Use
Code:
/usr/bin/mysqldump
in your command.
The same goes for the other commands.
Did you remove the redirection to see the errors?
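Put together, the entries might then look like this (the gzip and tar locations are assumptions; check them with whereis just like mysqldump):
Code:
0 5 * * * cd /home; /usr/bin/mysqldump -u root -h localhost -pmypasshere dbname | /bin/gzip > mydatabase-$(date +\%d-\%m-\%Y).sql.gzip
20 5 * * * cd /home; /bin/tar -zcf backup_files_$(date +\%d-\%m-\%Y).tgz /home/nginx/domains/mydomain.com/public/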

Kind regards

Last edited by repo; 03-18-2012 at 12:32 PM.
 
Old 03-18-2012, 01:06 PM   #15
ac_kumar
Member
 
Registered: Aug 2011
Distribution: Ubuntu, Fedora
Posts: 175

Rep: Reputation: 9
Quote:
Originally Posted by ASTRAPI
When I run them manually both commands work great, so I think everything is in the correct place.

Do you mean to run them like:

Code:
0 5 * * * cd /home; mysqldump -u root -h localhost -pmypasshere dbname | gzip > mydatabase-$(date +\%d-\%m-\%Y).sql.gzip
and
Code:
20 5 * * * cd /home; tar -zcf backup_files_$(date +\%d-\%m-\%Y).tgz /home/nginx/domains/mydomain.com/public/
?
I am new to cron jobs. I have tried the cron job as:
Code:
$ crontab -e
0 21 * * * /path/to/script

and I have placed the commands in the script and made it executable.
It worked for me.
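For reference, such a script might look like this minimal sketch (the name /home/backup.sh and the paths are assumptions; make it executable with chmod +x and point a single crontab entry at it, e.g. 0 5 * * * /home/backup.sh):
Code:
#!/bin/sh
# /home/backup.sh - sketch of a daily backup script (paths are assumptions)
# inside a script the % signs in the date format do not need to be escaped
cd /home || exit 1
/usr/bin/mysqldump -u root -h localhost -pmypasshere dbname | gzip > "mydatabase-$(date +%d-%m-%Y).sql.gzip"
tar -zcf "backup_files_$(date +%d-%m-%Y).tgz" /home/nginx/domains/mydomain.com/public/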
 
  

