backup script
Hi all,
I'm trying to create a backup script to back up my httpd.conf file.
However, when the cron job runs it completes OK, but the archive ends up being 350 MB in size.
Can anyone tell me where I went wrong, please?
My bash script:
Code:
#!/bin/bash
# this file is an automated backup script, httpd_backup.sh.
# this backs up my httpd config file.
# cron is ( 45 * * * * /Downloads/scripts/httpd_backup.sh )
# what directory or file to back up
cd /etc/httpd/conf/httpd.conf
# path to back up folder
tar -zcf /Downloads/httpd_backup.sh.tar.gz .
And my Mutt mail reads:
/Downloads/scripts/httpd_backup.sh: line 9: cd: /etc/httpd/conf/httpd_config: No such file or directory
TT
Last edited by tommytomato; 09-21-2006 at 07:59 AM.
cd /etc/httpd/conf/httpd.conf fails because httpd.conf is a file, not a folder; you can't cd into it, so you remain in the current directory.
tar -zcf /Downloads/httpd_backup.sh.tar.gz . then runs from that directory instead of /etc/httpd/conf, so it doesn't back up the file at all. What it actually does is archive the entire current directory (probably root's home directory), which is why the file comes out at 350 MB.
So change it to:
#!/bin/bash
# this file is an automated backup script, httpd_backup.sh.
# this backs up my httpd config file.
# cron is ( 45 * * * * /Downloads/scripts/httpd_backup.sh )
# what directory or file to back up
cd /etc/httpd/conf
# path to back up folder
tar -zcf /Downloads/httpd_backup.sh.tar.gz .
Actually, it's a better practice to check if the "cd" works.
So, do something like:
Code:
#!/bin/bash
# this file is an automated backup script, httpd_backup.sh.
# this backs up my httpd config file.
# cron is ( 45 * * * * /Downloads/scripts/httpd_backup.sh )
dir="/etc/httpd/conf";
# what directory or file to back up
cd ${dir}
if [[ $? != 0 ]]; then
#cd failed, we'll exit with an error message
echo "Error: could not cd to ${dir}. No backup was created."
exit 1;
fi;
# path to back up folder
tar -zcf /Downloads/httpd_backup.sh.tar.gz .
In this case, the tar won't get executed if the "cd" fails.
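For what it's worth, the same check can be written more compactly with bash's || operator. A sketch (wrapping it in a function is my own choice; the paths are the ones from the thread):

```shell
#!/bin/bash
# Compact form of the cd check above: the "|| { ...; exit 1; }" branch
# runs only when cd fails, so tar never runs in the wrong directory.
# The subshell "( ... )" keeps the cd from leaking into the caller.
backup_conf() {
    local dir="$1" archive="$2"
    (
        cd "${dir}" || { echo "Error: could not cd to ${dir}. No backup was created." >&2; exit 1; }
        tar -zcf "${archive}" .
    )
}

# The call the cron job would make:
# backup_conf /etc/httpd/conf /Downloads/httpd_backup.tar.gz
```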
Alternatively, you could just list the file(s) to be tarred at the end of the tar command. That way you don't have to do a "cd" at all.
Tar also has an option (-T, or --files-from) to read the list of files to archive from a file. That might be useful for you too.
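A sketch of both alternatives, using the paths from the thread (the list file name is hypothetical, and the block only runs tar if the config file actually exists):

```shell
#!/bin/bash
# Back up the single file without any cd: -C tells tar itself to
# change into the directory before adding the named file, so the
# archive contains just "httpd.conf" rather than the whole path.
conf_dir="/etc/httpd/conf"
backup="/Downloads/httpd_backup.tar.gz"

if [ -f "${conf_dir}/httpd.conf" ]; then
    tar -zcf "${backup}" -C "${conf_dir}" httpd.conf
fi

# Or keep the paths to back up in a list file (one per line,
# hypothetical name) and hand it to tar with -T / --files-from:
#   tar -zcf "${backup}" -T /Downloads/scripts/backup_list.txt
```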
Change your script to something like this, which will timestamp the backup:
Code:
#!/bin/bash
#
# Place a date in the name of the backed up file
date=$(date +%F)
# Filename format + date of backup
filename=httpd_backup.$date.tar
# File to backup
file=/etc/httpd/conf/httpd.conf
# Backup directory to place file in
bakdir=/Downloads
cd $bakdir
tar zcf $filename $file
exit 0
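One side note on this version, in case the restore ever surprises you: GNU tar strips the leading "/" from absolute paths (it prints "Removing leading `/' from member names"), so the archive stores etc/httpd/conf/httpd.conf and extracts under whatever directory you unpack in. Also, the z flag gzips the archive even though the name ends in plain .tar. A quick way to check what actually got stored:

```shell
#!/bin/bash
# List the member names inside the dated backup to see the stored
# (relative) paths. Guarded so it only runs if an archive exists.
if ls /Downloads/httpd_backup.*.tar >/dev/null 2>&1; then
    tar -ztf /Downloads/httpd_backup.*.tar
fi
```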
My only question would be, does your httpd.conf file change daily? If it doesn't, why would you need to back it up daily?
Actually, if you're going to go as far as directory checks, you might as well check that the file itself is there, since he only wants to back up the one file.
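A minimal sketch of that file check (the function name and the -C trick are my own additions; the paths are the ones from the thread):

```shell
#!/bin/bash
# backup_file: refuse to run tar unless the source file exists.
backup_file() {
    local file="$1" archive="$2"
    if [ ! -f "${file}" ]; then
        echo "Error: ${file} not found. No backup was created." >&2
        return 1
    fi
    # -C keeps only the bare filename in the archive.
    tar -zcf "${archive}" -C "$(dirname "${file}")" "$(basename "${file}")"
}

# backup_file /etc/httpd/conf/httpd.conf /Downloads/httpd_backup.tar.gz
```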
Original Poster
Quote:
Originally Posted by trickykid
My only question would be, does your httpd.conf file change daily? If it doesn't, why would you need to back it up daily?
Cheers guys.
So what you're saying is there's no point in backing up one file?
There is a point in backing up one file, if it's an important one, but only if that file changes regularly (i.e. daily, weekly, ...).
If it doesn't, you can create just one backup copy (for recovery later on) and install a file integrity tool like samhain, AIDE or Tripwire. They are designed to detect (and prevent) unauthorized changes to critical system files.
If it does change regularly, you should adapt your backup frequency to match the change frequency. There's no point in taking daily backups if the file only changes once a month, right?
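Along those lines, you can make the cron job itself skip runs where nothing changed, by comparing the file against a kept copy of the last backup. A sketch (the ".last" copy and its path are my own convention; the other paths follow the thread):

```shell
#!/bin/bash
# Only create a new dated backup when httpd.conf actually differs
# from the copy saved by the previous run (cmp -s is silent and
# returns non-zero when the files differ or the copy is missing).
file="/etc/httpd/conf/httpd.conf"
bakdir="/Downloads"
last="${bakdir}/httpd.conf.last"

if [ -f "${file}" ] && ! cmp -s "${file}" "${last}" 2>/dev/null; then
    cp "${file}" "${last}"
    tar -zcf "${bakdir}/httpd_backup.$(date +%F).tar.gz" \
        -C "$(dirname "${file}")" "$(basename "${file}")"
fi
```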