05-07-2012, 09:14 AM | #1
LQ Newbie | Registered: May 2012 | Location: Colorado | Distribution: CentOS 6 | Posts: 9
Help with Linux server backup bash script
Hi all, I'm new to the forum. I recently set up two Fedora Core 15 systems running Apache, MySQL, and PHP. These servers host a Moodle education site plus student websites served from their home folders; one is set up as the master and one as a backup server.
I have the websites working, but I'm having trouble backing up the folder locations. When I run my backup.sh script, which contains:
#!/bin/bash
echo Backup Started `date` >> /var/log/backuplog
# stop the web server so files are not changed mid-copy
/etc/init.d/httpd stop
# dump all MySQL databases to a dated .sql file
mysqldump --all-databases -u myusername --password=mypass > /media/backups1/websites/`date +%Y%m%d`alldatabase.sql
cp -r -f /home/ /media/backups1/home/backups/`date +%Y%m%d`home/
cp -r -f /var/www/moodledata /media/backups1/websites/`date +%Y%m%d`moodledata/
cp -r -f /var/www/html2 /media/backups1/websites/`date +%Y%m%d`html2/
/etc/init.d/mysqld restart
/etc/init.d/httpd start
echo Backup Completed `date` >> /var/log/backuplog
When I check the backups, the home folder backup comes out bigger than the original /home location. My current home folder is around 5 GB right now, but when it lands in the backup location it is about double that, around 9 GB. When I first set up my backup.sh file I had it doing a tar.gz backup instead of the cp commands, but that started too many tar and gzip processes and bogged down the server.
So my question is: am I missing something in my cp commands that explains why the folder size doubles, or is there a better way to do a tar.gz backup?
Thanks, Kevin.
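A tar.gz variant of the script above can avoid the runaway-process problem by archiving the directories sequentially, so only one tar/gzip pair runs at a time. This is only a sketch, not the poster's script; it assumes the same paths and credentials as above:

#!/bin/bash
# Sketch: sequential tar.gz backups, one archive at a time.
STAMP=$(date +%Y%m%d)
echo "Backup Started $(date)" >> /var/log/backuplog
/etc/init.d/httpd stop
mysqldump --all-databases -u myusername --password=mypass \
    > /media/backups1/websites/${STAMP}alldatabase.sql
# -z compresses through gzip inside tar; each command finishes
# before the next one starts, so processes never pile up.
tar -czf /media/backups1/home/backups/${STAMP}home.tar.gz -C / home
tar -czf /media/backups1/websites/${STAMP}moodledata.tar.gz -C /var/www moodledata
tar -czf /media/backups1/websites/${STAMP}html2.tar.gz -C /var/www html2
/etc/init.d/httpd start
echo "Backup Completed $(date)" >> /var/log/backuplog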
05-07-2012, 09:29 AM | #2
LQ Newbie | Registered: Apr 2012 | Location: New York | Distribution: CentOS, Debian | Posts: 29
This is how I back up all my websites plus MySQL databases:
#!/bin/bash
# bash script to back up website files and MySQL databases
BACKUP_LOG=/home/mike/mysql/website.log
date +"%Y-%m-%d %X" > $BACKUP_LOG
MYSQL_USER=
MYSQL_PASSWORD=
MYSQL_BACKUP_DIR=/home/mike/mysql
# back up MySQL databases
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --all-databases --lock-all-tables --flush-logs --master-data=2 | bzip2 -c > $MYSQL_BACKUP_DIR/all-$(date -I).sql.bz2
# back up the file system
rsync -avh --progress --delete /etc/postfix /srv/www /home/mike/mysql kyrunner@192.168.1.10:/Users/kyrunner/k
# remove MySQL database backups older than 30 days
# (the glob is quoted so the shell does not expand it first)
find $MYSQL_BACKUP_DIR -maxdepth 1 -type f -name '*.sql.bz2' -mtime +30 -exec rm -f {} \;
# append the finish time (>> so the start time above is kept)
date +"%Y-%m-%d %X" >> $BACKUP_LOG
# send email
mailx -s "Micro: website log" gmail.com < $BACKUP_LOG
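A script like this is typically driven from cron for a nightly run; the schedule and script path below are assumptions for illustration:

# crontab entry: run the backup script every night at 02:30
# (the path to the script is hypothetical)
30 2 * * * /home/mike/bin/backup.sh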
05-07-2012, 09:43 AM | #3
Member | Registered: Aug 2011 | Distribution: Ubuntu, Fedora | Posts: 175
Have you compared the folders date-wise? For example, home compared to homedate1, and home compared to homedate2.
05-07-2012, 09:53 AM | #4
LQ Newbie (Original Poster) | Registered: May 2012 | Location: Colorado | Distribution: CentOS 6 | Posts: 9
I just did that. One from last week was showing about 3.1 GB, while the one from this morning is 8.9G 20120507home/ compared to 4.5G home/ in the main location. The small increase to 4.5 GB is OK, since some websites were updated, but the 8.9 GB backup from today is what has me stumped. It looks like some of my other locations are doing the same thing: my current Moodle data location shows 1.5G /var/www/moodledata/ compared to the backup folder from this morning at 2.9G 20120507moodledata/.
05-07-2012, 11:58 AM | #5
Member | Registered: Apr 2012 | Location: /root | Distribution: Ubuntu, Red Hat, Fedora, CentOS | Posts: 190
Are you checking the original size with du or df? Can you paste the df output too? Also, another thing: you are taking the MySQL backup without stopping the service, so what is the point of restarting mysqld afterwards?
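For reference, the two commands measure different things: du walks the directory tree, while df reports usage for the whole filesystem. Comparing both for the paths in this thread would look something like this (a sketch; adjust the paths to the actual mount points):

# per-directory totals as du sees them
du -sh /home /media/backups1/home/backups/20120507home
# filesystem-level usage for the partitions involved
df -h /home /media/backups1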
05-07-2012, 12:16 PM | #6
LQ Newbie (Original Poster) | Registered: May 2012 | Location: Colorado | Distribution: CentOS 6 | Posts: 9
I'm not sure what I was thinking when I set MySQL to restart; I just figured since I restart Apache, I would also restart MySQL.
As for the du -h output: for home/ I get 4.5G home/, and for the backup I get 8.9G 20120507home/.
For my Moodle site, the data folder is 1.5G moodledata/ and its backup is 2.9G 20120507moodledata/; the web folder is 1.3G html2/ and its backup is 2.5G 20120507html2/.
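One way to narrow down where the extra space is going is to compare the first level of each tree; a sketch using standard GNU du and sort options:

# per-subdirectory totals for the source and the backup
du -h --max-depth=1 /home | sort -h
du -h --max-depth=1 /media/backups1/home/backups/20120507home | sort -h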
05-08-2012, 05:47 AM | #7
Member | Registered: Apr 2012 | Location: /root | Distribution: Ubuntu, Red Hat, Fedora, CentOS | Posts: 190
I would suggest using the -v switch; that is, change your copy command to show verbose output while copying, and then watch on screen which files and directories it is actually copying. That can help you figure out what else is filling up the space.
The other thing I strongly suspect for the increase in size while copying is how linked files are handled.
As far as I know, copying recursively with -r will also copy soft links, and du only counts the link itself, which is a few KB, rather than the size of the file it points to.
So that could be what is making the copy larger.
Thanks,
em31amit
Amit M.
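A quick way to test this theory with standard tools is to list the symlinks in the source tree and compare du with and without dereferencing (du's -L flag follows links):

# list symbolic links in the source tree
find /home -type l -ls
# size with symlinks counted as the links themselves (a few bytes each)
du -sh /home
# size with -L: link targets are counted instead, which approximates
# what a copy that follows symlinks would occupy
du -shL /home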
05-08-2012, 05:50 AM | #8
Member | Registered: Apr 2012 | Location: /root | Distribution: Ubuntu, Red Hat, Fedora, CentOS | Posts: 190
So technically, while copying with "cp -r -f", you are also copying all the actual files, not just the symbolic links.
05-08-2012, 09:18 AM | #9
LQ Newbie (Original Poster) | Registered: May 2012 | Location: Colorado | Distribution: CentOS 6 | Posts: 9
em31amit, thanks for your reply. I did cp -vrf and was able to see the folder size drop to 4.5G, which is correct now. I swapped my script to cp -rf, and all the folders being backed up have the correct size now. Unless there is a better way, I'll be using cp -rf. Is there also a way to copy the folder permissions with cp?
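On the permissions question: cp's -p flag preserves mode, ownership, and timestamps, and -a (archive mode) implies recursion, preserves those attributes, and copies symlinks as symlinks rather than following them. Either of these, reusing the paths from the script above, would be a reasonable sketch:

# preserve permissions, ownership and timestamps on the copy
cp -rfp /home/ /media/backups1/home/backups/$(date +%Y%m%d)home/
# or archive mode: recursive, preserves attributes, keeps symlinks
# as symlinks instead of copying their target files
cp -af /home/ /media/backups1/home/backups/$(date +%Y%m%d)home/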
05-08-2012, 07:26 PM | #10
LQ Guru | Registered: Aug 2004 | Location: Sydney | Distribution: Rocky 9.x | Posts: 18,434