Old 02-20-2012, 11:29 AM   #1
z01krh
Member
 
Registered: May 2009
Posts: 34

Rep: Reputation: 0
Incremental backup script


I am trying to write a script that loops through a set of folders and picks up any files modified in the last 24 hours, so that my daily backups are smaller. The issue I have is that the loop takes 5 hours to run. Does anyone have a faster way to do this? Note that I only want to loop through certain folders, not the whole drive.


<code>
BACKUP_DIRS="/etc /boot /root /home /var/www /var/lib/mysql"


for DIR in ${BACKUP_DIRS}
do

echo -e "\t$(date "+ %T") backing up ${DIR}" >> ${LOG_FILE}

# The first sed removes the leading "/" from the dir; the second sed replaces any remaining "/" with a "."
# so instead of /var/www as a file name we get var.www.tar
TARBALL_FILE_NAME=`echo ${DIR} | sed 's/.\(.*\)/\1/' | sed -e 's/\//./g'`
find ${DIR} -daystart -ctime 0 -type f -exec tar -czpf ${BACKUP_FOLDER}/${TARBALL_FILE_NAME}.tar.gz ${DIR} {} \;
sleep 1

echo -e "\t$(date "+ %T") DONE backing up ${DIR}" >> ${LOG_FILE}

done

echo -e "\t$(date "+ %T") COMPRESSING BACKUP FILE" >> ${LOG_FILE}

<\code>

Last edited by z01krh; 02-20-2012 at 11:32 AM.
 
Old 02-20-2012, 11:45 AM   #2
uhelp
Member
 
Registered: Nov 2011
Location: Germany, Bavaria, Nueremberg area
Distribution: openSUSE, Debian, LFS
Posts: 205

Rep: Reputation: 43
That is the wrong usage of find.

-ctime does NOT select files by modification time. It matches the change time of the file's metadata, i.e. changes of permissions and the like.

Use -mtime or -mmin for that.

-mtime n means n days, so -mtime -1 (or -mmin -1440) matches files modified within the last 24 hours.

You can also make it faster by not letting find do the whole job, since your -exec runs tar once for every single file it finds.

Try this:
Code:
# build the list of files modified in the last 24 hours, then call tar once
files=( $(find "${DIR}" -type f -mtime -1) )
tar -czf your.tar.gz "${files[@]}"
And the code tags work like [ C O D E ] [ / C O D E ]
without the spaces.
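
For what it's worth, here is a minimal sketch of how the whole loop could look with that approach. The BACKUP_FOLDER and LOG_FILE values below are just placeholders (your script defines them elsewhere), and it assumes GNU find and GNU tar; piping find -print0 into tar --null -T - means tar runs once per directory and filenames with spaces survive intact.

Code:
#!/bin/bash
# Sketch only -- BACKUP_FOLDER and LOG_FILE are placeholder values, adjust to your setup
BACKUP_DIRS="/etc /boot /root /home /var/www /var/lib/mysql"
BACKUP_FOLDER="/backup/daily"
LOG_FILE="/var/log/daily-backup.log"

for DIR in ${BACKUP_DIRS}
do
    echo -e "\t$(date "+ %T") backing up ${DIR}" >> "${LOG_FILE}"

    # /var/www -> var.www, same idea as the two seds in the original script
    TARBALL_FILE_NAME=$(echo "${DIR}" | sed 's|^/||; s|/|.|g')

    # One tar run per directory; find hands tar a NUL-separated file list,
    # so names containing spaces or newlines are handled correctly
    find "${DIR}" -type f -mtime -1 -print0 \
        | tar --null -czpf "${BACKUP_FOLDER}/${TARBALL_FILE_NAME}.tar.gz" -T -

    echo -e "\t$(date "+ %T") DONE backing up ${DIR}" >> "${LOG_FILE}"
done

Note that if a directory has nothing modified in the last 24 hours, GNU tar will refuse to create an empty archive and print a warning, which is usually harmless here.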

Last edited by uhelp; 02-20-2012 at 11:46 AM.
 
Old 02-20-2012, 11:46 AM   #3
MensaWater
LQ Guru
 
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,064
Blog Entries: 14

Rep: Reputation: 1248
GNU tar (the one shipped with Linux) has options for incremental backups (see the -g/--listed-incremental option, for example). You might want to type "man tar" for more details.
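
For reference, a minimal sketch of a listed-incremental backup with GNU tar (the archive paths and snapshot file name below are made up): the first run against a new snapshot file is effectively a full backup, and later runs against the same snapshot file only archive what changed since the previous run.

Code:
# First run: the snapshot file does not exist yet, so this is a full (level 0) backup
tar -czpf /backup/www-full.tar.gz -g /backup/www.snar /var/www

# Later runs with the same snapshot file only pick up changes since the last run
tar -czpf /backup/www-incr-$(date +%F).tar.gz -g /backup/www.snar /var/www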

Also, if it is ext2/ext3 you might want to look at the dump/restore commands instead of tar for backups. (Type "man dump" or "man restore" for details on those.)
 
Old 02-20-2012, 12:15 PM   #4
z01krh
Member
 
Registered: May 2009
Posts: 34

Original Poster
Rep: Reputation: 0
uhelp, thanks a lot. That was exactly what I needed. It sped up the script a lot.
MensaWater, I actually do a full dump of the system once a week. The problem I have is space. The full dump is stored on a USB drive,
while the daily dumps are archived on an FTP server.
 
Old 02-20-2012, 01:02 PM   #5
lithos
Senior Member
 
Registered: Jan 2010
Location: SI : 45.9531, 15.4894
Distribution: CentOS, OpenNA/Trustix, testing desktop openSuse 12.1 /Cinnamon/KDE4.8
Posts: 1,144

Rep: Reputation: 217
Hi,

I wrote an example of a tar incremental backup;
hope it helps.
 
Old 02-20-2012, 01:22 PM   #6
MensaWater
LQ Guru
 
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,064
Blog Entries: 14

Rep: Reputation: 1248
Quote:
Originally Posted by z01krh
uhelp, thanks a lot. That was exactly what I needed. It sped up the script a lot.
MensaWater, I actually do a full dump of the system once a week. The problem I have is space. The full dump is stored on a USB drive,
while the daily dumps are archived on an FTP server.
Dump allows for incremental backups.
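
To illustrate, a rough sketch of an incremental dump schedule (the device name and output paths here are hypothetical; the -u flag records each dump in /etc/dumpdates so higher-level dumps know what an earlier level already covered):

Code:
# Weekly level-0 (full) dump of the filesystem on /dev/sda3
dump -0u -f /backup/weekly-full.dump /dev/sda3

# Daily level-1 dump: only what changed since the most recent lower-level dump
dump -1u -f /backup/daily-$(date +%F).dump /dev/sda3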
 
  

