LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Newbie (https://www.linuxquestions.org/questions/linux-newbie-8/)
-   -   help with linux server backup bash script. (https://www.linuxquestions.org/questions/linux-newbie-8/help-with-linux-server-backup-bash-script-943693/)

krogerssolar 05-07-2012 09:14 AM

help with linux server backup bash script.
 
hi all, I'm new to the forum. Recently I set up two Fedora 15 systems running Apache, MySQL and PHP. These servers are hosting a Moodle education site and student websites from their home folders. I have one set up as a master and one as a backup server.

I have the websites working, but I'm having issues trying to back up the folder locations when I run my backup.sh script, which contains:

#!/bin/bash
echo "Backup Started $(date)" >> /var/log/backuplog
/etc/init.d/httpd stop
mysqldump --all-databases -u myusername --password=mypass > /media/backups1/websites/$(date +%Y%m%d)alldatabase.sql
cp -r -f /home/ /media/backups1/home/backups/$(date +%Y%m%d)home/
cp -r -f /var/www/moodledata /media/backups1/websites/$(date +%Y%m%d)moodledata/
cp -r -f /var/www/html2 /media/backups1/websites/$(date +%Y%m%d)html2/
/etc/init.d/mysqld restart
/etc/init.d/httpd start
echo "Backup Completed $(date)" >> /var/log/backuplog


When I check the backups, the home folder backup comes out bigger than the original /home/ location. My current home folder is around 5 GB right now, but when it goes to my backup location it is about double, around 9 GB. When I first set up my backup.sh file I had it do a tar.gz backup instead of the cp command, but that would start too many tar and gzip processes and bog down my server.

So my question is: am I missing something in my cp commands as to why it's doubling the folder size, or is there a better way to do a tar.gz backup?
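If the tar.gz route is still of interest, one option (a sketch, not something from the thread; the demo paths are throwaway directories standing in for /home and /media/backups1) is to run a single tar|gzip pipeline per directory at the lowest CPU priority, rather than launching many tar/gzip processes at once:

```shell
#!/bin/bash
set -e
# Demo on throwaway directories -- swap SRC/DEST for the real source
# directory and the backup mount point.
SRC=$(mktemp -d)
DEST=$(mktemp -d)
echo "site data" > "$SRC/index.html"

STAMP=$(date +%Y%m%d)
name=$(basename "$SRC")
# One tar|gzip pipeline per directory, run under nice so it yields the
# CPU to Apache/MySQL instead of bogging the server down.
nice -n 19 tar -czf "$DEST/${STAMP}${name}.tar.gz" -C "$(dirname "$SRC")" "$name"

ls -lh "$DEST"
```

On a box where util-linux is installed, prefixing the tar command with `ionice -c3` as well puts its disk I/O in the idle class, which helps even more than `nice` for a backup job.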

Thanks Kevin.

kyrunner 05-07-2012 09:29 AM

This is how I back up all my websites plus MySQL databases:


#!/bin/bash
# bash script to backup website
BACKUP_LOG=/home/mike/mysql/website.log

date +"%Y-%m-%d %X" > $BACKUP_LOG

# bash script to backup mysql
MYSQL_USER=
MYSQL_PASSWORD=
MYSQL_BACKUP_DIR=/home/mike/mysql

# backup mysql databases

/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --all-databases --lock-all-tables --flush-logs --master-data=2 | bzip2 -c > $MYSQL_BACKUP_DIR/all-$(date -I).sql.bz2


# backup file system
rsync -avh --progress --delete /etc/postfix /srv/www /home/mike/mysql kyrunner@192.168.1.10:/Users/kyrunner/k

# remove old MySQL database backups (quote the glob so find expands it, not the shell)
find $MYSQL_BACKUP_DIR -maxdepth 1 -type f -name '*.sql.bz2' -mtime +30 -exec rm -f {} \;

# append the completion time rather than overwriting the start time
date +"%Y-%m-%d %X" >> $BACKUP_LOG

# send email
mailx -s "Micro: website log" gmail.com < $BACKUP_LOG

ac_kumar 05-07-2012 09:43 AM

Have you compared the date-wise folders against the original? E.g. home compared to homedate1,
home compared to homedate2.

krogerssolar 05-07-2012 09:53 AM

I just did: one from last week was showing about 3.1 GB, and the one from this morning is 8.9G 20120507home/ compared to 4.5G home/ in the main location. The small increase to 4.5 is OK, as some websites were updated, but the 8.9G backup from today is what has me stumped. It looks like some of my other locations are doing the same thing: my current Moodle data location shows 1.5G /var/www/moodledata/ compared to 2.9G 20120507moodledata/ in this morning's backup folder.

em31amit 05-07-2012 11:58 AM

Are you checking the original size with du or df? Can you paste the df output too? Another thing: you are taking the MySQL backup without stopping it, so what is the use of restarting the MySQL service?
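As background for the du-vs-df question: du sums what is under a path, while df reports usage for the whole filesystem the path lives on, so the two can legitimately disagree. A quick way to see both side by side (a sketch on a scratch directory):

```shell
set -e
dir=$(mktemp -d)
# Create a 5 MB file so the directory has a measurable size.
dd if=/dev/zero of="$dir/blob" bs=1M count=5 2>/dev/null

du -sh "$dir"   # sums the sizes of the files under the directory
df -h "$dir"    # reports usage of the filesystem containing it
```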

krogerssolar 05-07-2012 12:16 PM

I'm not sure what i was thinking when i set mysql to restart just figured since i restart Apache i would also restart mysql.

as for the du -h home/ out put i get 4.5G home/

and for the backup i get 8.9G 20120507home/

for my moodle site its data folder is 1.5G moodledata/ backup is 2.9G 20120507moodledata/ and web folder is 1.3G html2/ backup is 2.5G 20120507html2/

em31amit 05-08-2012 05:47 AM

I would suggest you add the "-v" switch; I mean, change your copy command to show verbose output while copying all the files, and then check on the screen what it is actually copying. That can help you figure out what else is filling up the space.

Another thing I highly suspect for the increasing space while copying: linked files.
As far as I know, copying with "-r" (recursively) will also follow soft links and copy the files they point to, while du only counts the link itself, which is a few KB, not the actual file size.

So that could be what is making it larger.

Thanks,
em31amit
Amit M.

em31amit 05-08-2012 05:50 AM

So technically, while copying with "cp -r -f", you are also copying all the actual files, not just the symbolic links.
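Whether a recursive cp follows symlinks depends on the flags and implementation (GNU cp's -P, the default with -r, keeps them as links; -L dereferences them), but the doubling effect described here is easy to reproduce in a scratch directory. du counts a symlink as a few bytes, while a dereferencing copy turns it into a full second copy of the target:

```shell
set -e
src=$(mktemp -d)
dd if=/dev/zero of="$src/big.dat" bs=1M count=10 2>/dev/null
ln -s big.dat "$src/link.dat"   # du counts this link as a few bytes

deref=$(mktemp -d)
cp -rL "$src/." "$deref"        # -L dereferences: link.dat becomes a full copy

aslink=$(mktemp -d)
cp -rP "$src/." "$aslink"       # -P copies the symlink itself

du -sh "$src" "$deref" "$aslink"   # the dereferenced copy is roughly double
```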

krogerssolar 05-08-2012 09:18 AM

em31amit, thanks for your reply. I did cp -vrf and was able to see the folder size drop to 4.5G, which is correct. I swapped my script over to cp -rf, and all the folders being backed up have the correct size now; unless there is a better way, I'll be using cp -rf. Is there also a way to copy the folder permissions with cp?

chrism01 05-08-2012 07:26 PM

See the -p switch: http://linux.die.net/man/1/cp
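For completeness: -p preserves mode, ownership and timestamps, and -a (archive mode) combines -p with recursion and symlink preservation, so it is a common choice for exactly this kind of backup copy. A quick check on a scratch file (the paths are throwaway):

```shell
set -e
work=$(mktemp -d)
src="$work/src.txt"; echo data > "$src"
chmod 640 "$src"
touch -d '2012-05-07 09:14' "$src"   # give the source an old mtime

cp "$src" "$work/plain.txt"          # plain copy: mtime becomes "now"
cp -p "$src" "$work/preserved.txt"   # -p keeps mode, timestamps, ownership

stat -c '%y  %a  %n' "$work"/*.txt
```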

