LinuxQuestions.org
Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!

Old 05-18-2011, 11:46 AM   #1
nooby210
LQ Newbie
 
Registered: May 2011
Posts: 8

Rep: Reputation: 0
Backup Purge Script Help Please!!!


Hello, I have successfully got backups going every night of my server to a 1TB LaCie network drive. Each night's backup takes up about 87.2 GB of space on the drive, so I can keep about a week's worth of backups on it. I am trying to figure out how I can configure this script to remove the first backup of the week once a week. The backups are tar.gz files. Can anyone help, please? Here is what my backupPurge.sh script looks like. The path leading to my LaCie drive, /mnt/backup1, is correct.

#!/bin/bash
## Configure FOLDERS_TO_PURGE to point the script to the right directories and WEEKS_TO_KEEP to tell it how far back to purge.
##
FOLDERS_TO_PURGE=('/mnt/backup1')
WEEKS_TO_KEEP=1
#
###########################
## Backup Purge Function ##
###########################

function purge {
FOLDER=$1
DAY=$(date +%u)
DAYS_BACK=$(($WEEKS_TO_KEEP*1))
DAYS_BACK=$(($DAYS_BACK+$DAY+1))

#echo "cd $FOLDER"
cd "$FOLDER"

#echo $DAYS_BACK
#echo "find *.gz -atime +$DAYS_BACK -exec rm -v '{}' \;"
find *.gz* -atime +$DAYS_BACK -exec rm -v '{}' \;
#find *.contents.txt -atime +$DAYS_BACK -exec rm -v '{}' \;
}

for FOLDER in ${FOLDERS_TO_PURGE[@]}
do
purge $FOLDER
done

###########################
## End of Purge Function ##
###########################

df -h



Thanks
 
Old 05-18-2011, 02:20 PM   #2
SL00b
Member
 
Registered: Feb 2011
Location: LA, US
Distribution: SLES
Posts: 375

Rep: Reputation: 112
There seems to be a lot of convoluted and broken logic in calculating the $DAYS_BACK value, and I'm not sure why you've created a function here for what should be a one-step operation.

The -atime parameter accepts a number of 24-hour periods since the last access time, so if you want to delete the backup that's over 7 days old, set -atime to 7.

Here's a much simpler version that accomplishes what you're after:

Code:
#!/bin/bash
## Configure FOLDER_TO_PURGE to point the script to the right directory and DAYS_TO_KEEP to tell it how far back to purge.
##
FOLDER_TO_PURGE=('/mnt/backup1')
DAYS_TO_KEEP=7
#
###########################
## Backup Purge Function ##
###########################
find $FOLDER_TO_PURGE/*.gz* -atime +$DAYS_TO_KEEP -exec rm -v '{}' \;
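One habit worth adopting before letting any rm-based cleanup loose (my suggestion, not part of the script above): do a dry run with -print first, so you see exactly what would be deleted. Here's a self-contained demo; /tmp/backup-demo is a stand-in for /mnt/backup1.

```shell
#!/bin/bash
# Dry run: list what the purge WOULD delete, without removing anything.
# /tmp/backup-demo stands in for /mnt/backup1.
FOLDER_TO_PURGE=/tmp/backup-demo
DAYS_TO_KEEP=7

mkdir -p "$FOLDER_TO_PURGE"
# Fake one old backup (10 days) and one fresh backup for the demo.
touch -d "10 days ago" "$FOLDER_TO_PURGE/old-backup.gz"
touch "$FOLDER_TO_PURGE/fresh-backup.gz"

# Same test as the purge script, but -print instead of -exec rm.
find "$FOLDER_TO_PURGE"/*.gz* -atime +"$DAYS_TO_KEEP" -print
```

Only the old file should be listed; once the output looks right, swap -print back for the -exec rm.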

Last edited by SL00b; 05-19-2011 at 09:44 AM. Reason: Syntax error
 
1 members found this post helpful.
Old 05-18-2011, 03:38 PM   #3
nooby210
LQ Newbie
 
Registered: May 2011
Posts: 8

Original Poster
Rep: Reputation: 0
Ok, hey, thanks a million, that gives me a much better idea. So pretty much this version that you posted will keep 7 days of backups and then delete day 8, the oldest modified file? And in the script you posted, do I need to change anything, or just keep it how you have it?
 
Old 05-19-2011, 08:53 AM   #4
SL00b
Member
 
Registered: Feb 2011
Location: LA, US
Distribution: SLES
Posts: 375

Rep: Reputation: 112
I can't say what modifications you may or may not need to make. You have to take what's here and apply it to your particular situation.

For instance... are there any subfolders of /mnt/backup1 you need to exclude? This script, as written, will eliminate every file down the tree.

Is -atime the right switch? That's access time... if you touch the file, you've reset the counter, which may or may not be the best idea. You might be better served with -mtime.

Also, I can't guarantee it'll keep 7 backups and delete the 8th, because it's somewhat dependent on what time the backup was taken and what time this script is executed. From the man page:

Code:
-atime n 
File was last accessed n*24 hours ago. When find figures out how many 24-hour periods ago the file was last accessed, any fractional part is ignored, so to match -atime +1, a file has to have been accessed at least two days ago.
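That rounding can be seen directly. In this demo (a throwaway directory, not anyone's real backups), a file from 30 hours ago counts as only 1 full 24-hour period (the fraction is dropped), so -mtime +1 skips it, while a file from 50 hours ago counts as 2 periods and matches:

```shell
#!/bin/bash
# Demonstrate find's whole-24-hour-period rounding for -mtime.
dir=/tmp/mtime-demo
mkdir -p "$dir"
touch -d "30 hours ago" "$dir/thirty-hours.tgz"   # 1 period -> not matched
touch -d "50 hours ago" "$dir/fifty-hours.tgz"    # 2 periods -> matched

find "$dir" -name '*.tgz' -mtime +1 -print
```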
 
Old 05-19-2011, 09:39 AM   #5
nooby210
LQ Newbie
 
Registered: May 2011
Posts: 8

Original Poster
Rep: Reputation: 0
Ok. There are no folders in /mnt/backup1. This is the error I am getting when I run it:


find: invalid argument `-exec' to `-atime'




Thanks so much for your help, man.
 
Old 05-19-2011, 09:45 AM   #6
SL00b
Member
 
Registered: Feb 2011
Location: LA, US
Distribution: SLES
Posts: 375

Rep: Reputation: 112
I had an extra underscore in the $DAYS_TO_KEEP variable. I've fixed it in the sample script.
 
Old 05-19-2011, 09:52 AM   #7
nooby210
LQ Newbie
 
Registered: May 2011
Posts: 8

Original Poster
Rep: Reputation: 0
Ok, that seems to have fixed that part of it. Now I get this:


find: `/mnt/backup1/*.gz*': No such file or directory
 
Old 05-19-2011, 09:59 AM   #8
nooby210
LQ Newbie
 
Registered: May 2011
Posts: 8

Original Poster
Rep: Reputation: 0
Ok, never mind, I had to change the file extension to .tgz instead of .gz and it seems to run now. But it hasn't done anything yet, so I guess I will just have to schedule a crontab and wait to see what happens on the 7th day, right?
 
Old 05-19-2011, 10:00 AM   #9
SL00b
Member
 
Registered: Feb 2011
Location: LA, US
Distribution: SLES
Posts: 375

Rep: Reputation: 112
Simulpost.

Last edited by SL00b; 05-19-2011 at 10:02 AM. Reason: nevermind
 
Old 05-31-2011, 09:12 AM   #10
nooby210
LQ Newbie
 
Registered: May 2011
Posts: 8

Original Poster
Rep: Reputation: 0
Hey SL00b,

The script you helped me out with is working great!! I have a full 7 days of backups. I was wondering if there is any way you could help me change the script up just a bit. Now that I have 7 days of full backups, could I do incremental backups against those full ones instead of doing a full one every night, just writing to the backups the files that have changed? Is there any way we could change that up a bit? Your help would be greatly appreciated, thanks!


-Brady
 
Old 05-31-2011, 09:43 AM   #11
SL00b
Member
 
Registered: Feb 2011
Location: LA, US
Distribution: SLES
Posts: 375

Rep: Reputation: 112
To do an incremental backup, you can use the following switch on your tar command:

Quote:
-N, --after-date DATE, --newer DATE
only store files newer than DATE
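For example, here's a minimal sketch with made-up demo paths (not your real backup job). Note that GNU tar also offers --newer-mtime, which compares only the file's modification time; plain --newer/-N additionally considers the inode-change time (ctime), which can pull in files whose permissions changed but whose contents did not, so the sketch uses --newer-mtime:

```shell
#!/bin/bash
# Sketch of an incremental tar run. /tmp/inc-demo is a demo stand-in.
SRC=/tmp/inc-demo/data
DEST=/tmp/inc-demo/backups
mkdir -p "$SRC" "$DEST"

echo "old" > "$SRC/stale.txt"
touch -d "3 days ago" "$SRC/stale.txt"   # unchanged since the full backup
echo "new" > "$SRC/modified.txt"         # changed today

# Archive only files modified since the cutoff (yesterday, for the demo).
SINCE=$(date -d "yesterday" "+%Y-%m-%d")
tar -czf "$DEST/incremental.tgz" --newer-mtime "$SINCE" -C "$SRC" .

# List what actually went into the incremental archive.
tar -tzf "$DEST/incremental.tgz"
```

Only modified.txt (plus the directory entry) should show up in the listing; stale.txt is skipped.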
 
Old 05-31-2011, 09:53 AM   #12
nooby210
LQ Newbie
 
Registered: May 2011
Posts: 8

Original Poster
Rep: Reputation: 0
#/bin/bash
## Configure FOLDERS_TO_PURGE to point the script to the right directory and the DAYS_TO_KEEP in order to tell it how far back to purge.
##
FOLDER_TO_PURGE=('/mnt/backup1')
DAYS_TO_KEEP=7
###########################
## Backup Purge Function ##
###########################
find $FOLDER_TO_PURGE/*.tgz* -N, --after-date DATE, --newer DATE -exec rm -v {} \;



Does that look correct??


Thanks
 
Old 05-31-2011, 10:06 AM   #13
SL00b
Member
 
Registered: Feb 2011
Location: LA, US
Distribution: SLES
Posts: 375

Rep: Reputation: 112
No. -N, --after-date, and --newer are all different switches meaning the same thing. You don't use all three... just pick one.

And you have to replace DATE with the date of your last full backup, or last incremental, depending on how you want to handle your incrementals.

See the tar man page.
http://linux.die.net/man/1/tar
 
Old 05-31-2011, 11:10 AM   #14
nooby210
LQ Newbie
 
Registered: May 2011
Posts: 8

Original Poster
Rep: Reputation: 0
Ok, thanks. I see what you mean, I'm just not sure where I put that in my script. I am so new to this that I kind of need to be baby-stepped. I want it to just do incremental backups every day. These backups run Monday through Sunday, and I already have full backups; I just want to be able to write to those files that are already there whenever files change, instead of doing a full backup every day. It takes about 8 hours just to run, so it would be great if I could make it a lot faster. Sorry for being a pain in the neck.
 
Old 05-31-2011, 11:20 AM   #15
SL00b
Member
 
Registered: Feb 2011
Location: LA, US
Distribution: SLES
Posts: 375

Rep: Reputation: 112
Also, you were adding those switches to your find command, but your backup is being performed with the tar command. It's a whole other script.
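That separate incremental script might look something like this sketch (all paths here are demo stand-ins, and deriving the cutoff from the newest existing archive is my suggestion, not the poster's actual job; --newer-mtime is the GNU tar variant of the quoted switch that compares only modification time):

```shell
#!/bin/bash
# Hypothetical standalone incremental-backup script: find the newest
# archive already on the drive, then back up only files modified since.
SRC=/tmp/incr-demo/data
DEST=/tmp/incr-demo/backups
mkdir -p "$SRC" "$DEST"

# Demo data: a full backup "from two days ago", plus one new file today.
echo "seed" > "$SRC/report.txt"
tar -czf "$DEST/full.tgz" -C "$SRC" .
touch -d "3 days ago" "$SRC/report.txt"
touch -d "2 days ago" "$DEST/full.tgz"
echo "update" > "$SRC/notes.txt"

# Cutoff = modification time of the newest existing archive (GNU date -r).
LAST=$(ls -t "$DEST"/*.tgz | head -n 1)
SINCE=$(date -r "$LAST" "+%Y-%m-%d %H:%M:%S")

tar -czf "$DEST/incremental.tgz" --newer-mtime "$SINCE" -C "$SRC" .
tar -tzf "$DEST/incremental.tgz"
```

Run from cron after the full-backup rotation, this would pick up only what changed since the last archive was written.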
 
  


