LinuxQuestions.org
Welcome to the most active Linux Forum on the web.
Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!

Old 05-07-2012, 09:14 AM   #1
krogerssolar
LQ Newbie
 
Registered: May 2012
Location: Colorado
Distribution: Centos 6
Posts: 9

Rep: Reputation: Disabled
Help with Linux server backup bash script


Hi all, I'm new to the forum. Recently I set up two Fedora Core 15 systems running Apache, MySQL and PHP. These servers host a Moodle education site, plus student websites served from their home folders. I have one set up as a master and one as a backup server.

I have the websites working, but I'm having issues backing up the folder locations. When I run my backup.sh script, which contains:

#!/bin/bash
echo "Backup started $(date)" >> /var/log/backuplog
/etc/init.d/httpd stop
mysqldump --all-databases -u myusername --password=mypass > /media/backups1/websites/$(date +%Y%m%d)alldatabase.sql
cp -r -f /home/ /media/backups1/home/backups/$(date +%Y%m%d)home/
cp -r -f /var/www/moodledata /media/backups1/websites/$(date +%Y%m%d)moodledata/
cp -r -f /var/www/html2 /media/backups1/websites/$(date +%Y%m%d)html2/
/etc/init.d/mysqld restart
/etc/init.d/httpd start
echo "Backup completed $(date)" >> /var/log/backuplog


When I check the backups, the home folder backup comes out bigger than the original /home location. My current home folder is around 5 GB right now, but when it goes to my backup location it's about double, around 9 GB. When I first set up my backup.sh file I had it doing a tar.gz backup instead of using cp, but that would start too many tar and gzip processes and bog down the server.

So my question is: am I missing something in my cp commands that would explain why the folder size doubles, or is there a better way to do a tar.gz backup?

Thanks, Kevin.
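On the tar.gz question: the "too many tar and gzip processes" problem usually comes from launching one job per subdirectory (or from overlapping cron runs), not from tar itself. A single tar invocation with built-in gzip (-z) is one tar plus one gzip per directory tree. A minimal sketch; the function name is hypothetical and the paths are only examples:

```shell
#!/bin/bash
# Sketch: archive one directory tree into a single dated tar.gz.
# One tar|gzip pipeline per tree; nice keeps the compression from
# starving Apache/MySQL while the backup runs.
backup_tree() {
    local src=$1 dest=$2
    local stamp
    stamp=$(date +%Y%m%d)
    mkdir -p "$dest"
    # -C enters the parent directory so the archive stores paths
    # relative to it (e.g. "home/..." rather than "/home/...").
    nice -n 19 tar -czf "$dest/${stamp}$(basename "$src").tar.gz" \
        -C "$(dirname "$src")" "$(basename "$src")"
}

# e.g. backup_tree /home /media/backups1/home/backups
```

Run once per location (home, moodledata, html2) from the same script, so there is never more than one tar/gzip pair alive at a time.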
 
Old 05-07-2012, 09:29 AM   #2
kyrunner
LQ Newbie
 
Registered: Apr 2012
Location: New York
Distribution: Centos,Debian
Posts: 29

Rep: Reputation: 1
This is how I back up all my websites plus MySQL databases:


#!/bin/bash
# bash script to backup website
BACKUP_LOG=/home/mike/mysql/website.log

date +"%Y-%m-%d %X" > $BACKUP_LOG

# bash script to backup mysql
MYSQL_USER=
MYSQL_PASSWORD=
MYSQL_BACKUP_DIR=/home/mike/mysql

# back up MySQL databases

/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --all-databases --lock-all-tables --flush-logs --master-data=2 | bzip2 -c > $MYSQL_BACKUP_DIR/all-$(date -I).sql.bz2


# backup file system
rsync -avh --progress --delete /etc/postfix /srv/www /home/mike/mysql kyrunner@192.168.1.10:/Users/kyrunner/k

# remove old MySQL database backups
find $MYSQL_BACKUP_DIR -maxdepth 1 -type f -name '*.sql.bz2' -mtime +30 -exec rm -f {} \;

date +"%Y-%m-%d %X" >> $BACKUP_LOG

# send email
mailx -s "Micro: website log" gmail.com < $BACKUP_LOG
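One detail in that cleanup step is worth calling out: the `*.sql.bz2` pattern has to be quoted, otherwise the shell expands it against the current directory before find ever sees it. A small self-contained demo of the age-based pruning (GNU find and touch assumed):

```shell
#!/bin/bash
# Demo of the cleanup step: -mtime +30 matches files last modified
# more than 30 days ago. The pattern is quoted so find, not the
# shell, does the matching. GNU touch/find assumed.
set -eu
dir=$(mktemp -d)
touch "$dir/new.sql.bz2"
touch -d '40 days ago' "$dir/old.sql.bz2"
find "$dir" -maxdepth 1 -type f -name '*.sql.bz2' -mtime +30 -delete
ls "$dir"    # only new.sql.bz2 is left
```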
 
Old 05-07-2012, 09:43 AM   #3
ac_kumar
Member
 
Registered: Aug 2011
Distribution: Ubuntu, Fedora
Posts: 175

Rep: Reputation: 9
Have you compared the folders date-wise? E.g. home compared to the homedate1 backup, and home compared to homedate2?
 
Old 05-07-2012, 09:53 AM   #4
krogerssolar
LQ Newbie
 
Registered: May 2012
Location: Colorado
Distribution: Centos 6
Posts: 9

Original Poster
Rep: Reputation: Disabled
I just checked: one from last week was about 3.1 GB, and the one from this morning is 8.9G 20120507home/ compared to 4.5G home/ at the main location. The small increase to 4.5 GB is fine; some websites were updated. But the 8.9 GB backup from today is what has me stumped. It looks like some of my other locations are doing the same thing: my current Moodle data location shows 1.5G /var/www/moodledata/ compared to this morning's backup folder at 2.9G 20120507moodledata/.
 
Old 05-07-2012, 11:58 AM   #5
em31amit
Member
 
Registered: Apr 2012
Location: /root
Distribution: Ubuntu, Redhat, Fedora, CentOS
Posts: 190

Rep: Reputation: 55
Are you checking the original size with du or df? Can you paste the df output too? Another thing: you are taking the MySQL backup without stopping it, so what is the use of restarting the mysql service?
 
Old 05-07-2012, 12:16 PM   #6
krogerssolar
LQ Newbie
 
Registered: May 2012
Location: Colorado
Distribution: Centos 6
Posts: 9

Original Poster
Rep: Reputation: Disabled
I'm not sure what I was thinking when I set MySQL to restart; I just figured that since I restart Apache I would also restart MySQL.

As for du -h home/, the output I get is 4.5G home/.

For the backup I get 8.9G 20120507home/.

For my Moodle site, its data folder is 1.5G moodledata/ and its backup is 2.9G 20120507moodledata/; the web folder is 1.3G html2/ and its backup is 2.5G 20120507html2/.
 
Old 05-08-2012, 05:47 AM   #7
em31amit
Member
 
Registered: Apr 2012
Location: /root
Distribution: Ubuntu, Redhat, Fedora, CentOS
Posts: 190

Rep: Reputation: 55
I would suggest adding the "-v" switch, i.e. change your copy command to show verbose output while copying all the files, and then watch on screen what files/directories it is actually copying. That can help you figure out what else is filling up the space.

The other thing I strongly suspect for the increased size is the copying of linked files. Copying recursively with "-r" may dereference symbolic links and copy the target files, while du only counts the link itself, which is a few bytes, not the size of the target.

So that could be what is making it larger.

Thanks,
em31amit
Amit M.
 
Old 05-08-2012, 05:50 AM   #8
em31amit
Member
 
Registered: Apr 2012
Location: /root
Distribution: Ubuntu, Redhat, Fedora, CentOS
Posts: 190

Rep: Reputation: 55
So technically, while copying with "cp -r -f", you may be copying the actual target files as well, not just the symbolic links.
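That hypothesis is easy to test in a scratch directory: du counts a symlink as a few bytes, while a copy that dereferences links duplicates the target. A minimal demo, using cp's -L flag to force dereferencing (whether a plain cp -r follows symlinks can vary between cp implementations, so -L makes the effect explicit):

```shell
#!/bin/bash
# du sees one 1 MiB file plus a tiny symlink; a dereferencing copy
# (-L) turns the link into a second full copy of the file.
set -eu
tmp=$(mktemp -d)
mkdir "$tmp/orig"
head -c 1048576 /dev/zero > "$tmp/orig/big.dat"   # 1 MiB of data
ln -s big.dat "$tmp/orig/link"                    # nearly free in du
cp -rL "$tmp/orig" "$tmp/copy"                    # link becomes a real file
du -sk "$tmp/orig" "$tmp/copy"   # the copy is roughly twice the size
```

Running `find /home -type l` lists the symlinks in the tree, which is a quick way to check whether this is what happened to the backups.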
 
Old 05-08-2012, 09:18 AM   #9
krogerssolar
LQ Newbie
 
Registered: May 2012
Location: Colorado
Distribution: Centos 6
Posts: 9

Original Poster
Rep: Reputation: Disabled
em31amit, thanks for your reply. I did cp -vrf and was able to see the folder size drop to 4.5G, which is correct now. I swapped my script to cp -rf, and all the folders being backed up have the correct size now; unless there is a better way, I'll be using cp -rf. Is there also a way to copy the folder permissions with cp?
 
Old 05-08-2012, 07:26 PM   #10
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Rocky 9.2
Posts: 18,359

Rep: Reputation: 2751
See the -p switch http://linux.die.net/man/1/cp
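A quick illustration of what -p preserves, namely mode, ownership and timestamps (GNU stat syntax assumed here):

```shell
#!/bin/bash
# -p preserves mode, ownership and timestamps; without it the copy
# gets a fresh mtime and umask-filtered permissions.
set -eu
tmp=$(mktemp -d)
echo data > "$tmp/src.txt"
chmod 751 "$tmp/src.txt"
touch -d '2012-05-07 09:14' "$tmp/src.txt"
cp -p "$tmp/src.txt" "$tmp/copy.txt"
stat -c '%a %Y' "$tmp/src.txt" "$tmp/copy.txt"   # identical mode and mtime
```

For whole trees, cp -a (archive mode) combines recursion with --preserve and also keeps symlinks as symlinks rather than copying their targets, which ties back to the size problem earlier in this thread.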
 
  

