Linux - Newbie
This Linux forum is for members that are new to Linux. Just starting out and have a question? If it is not in the man pages or the how-to's, this is the place!
Hello, I have successfully got nightly backups of my server going to a 1TB LaCie network drive. Each night's backup takes up about 87.2 GB of space, so I can keep about a week's worth of backups on the drive. I am trying to figure out how I can configure this script to remove the oldest backup of the week once a week. The backups are tar.gz files. Can anyone help, please? Here is what my backupPurge.sh script looks like. The path leading to my LaCie drive, /mnt/backup1, is correct.
#!/bin/bash
## Configure FOLDERS_TO_PURGE to point the script to the right directories and the WEEKS_TO_KEEP in order to tell it how far back to purge.
##
FOLDERS_TO_PURGE=('/mnt/backup1')
WEEKS_TO_KEEP=1
#
###########################
## Backup Purge Function ##
###########################
function purge {
FOLDER=$0
DAY=$(date +%u)
DAYS_BACK=$(($WEEKS_TO_KEEP*1))
DAYS_BACK=$(($DAYS_BACK+$DAY+1))
There seems to be a lot of convoluted and broken logic in calculating the $DAYS_BACK value, and I'm not sure why you've created a function here for what should be a one-step operation.
The -atime parameter accepts a number of 24-hour periods since the last access time, so if you want to delete the backup that's over 7 days old, set -atime to 7.
Here's a much simpler version that accomplishes what you're after:
Code:
#!/bin/bash
## Configure FOLDER_TO_PURGE to point the script to the right directory and DAYS_TO_KEEP to tell it how far back to purge.
##
FOLDER_TO_PURGE='/mnt/backup1'
DAYS_TO_KEEP=7
#
###########################
##     Backup Purge      ##
###########################
find "$FOLDER_TO_PURGE" -name '*.gz' -atime +$DAYS_TO_KEEP -exec rm -v '{}' \;
Last edited by SL00b; 05-19-2011 at 09:44 AM.
Reason: Syntax error
Ok, hey, thanks a million, that gives me a much better idea. So pretty much this version that you posted will keep 7 days of backups and then delete day 8, or the oldest modified file? And in the script you posted, do I need to change anything, or just keep it how you have it?
I can't say what modifications you may or may not need to make. You have to take what's here and apply it to your particular situation.
For instance... are there any subfolders to /mnt/backup1 you need to exclude? This script, as written, will eliminate every file down the tree.
Is -atime the right switch? That's access time... if you touch the file, you've reset the counter, which may or may not be the best idea. You might be better served with -mtime.
Also, I can't guarantee it'll keep 7 backups and delete the 8th, because it's somewhat dependent on what time the backup was taken and what time this script is executed. From the man page:
Code:
-atime n
File was last accessed n*24 hours ago. When find figures out how many 24-hour periods ago the file was last accessed, any fractional part is ignored, so to match -atime +1, a file has to have been accessed at least two days ago.
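To illustrate the -mtime variant (and restricting the purge to the top-level directory with -maxdepth 1, in case /mnt/backup1 has subfolders you want to exclude), here's a small self-contained sketch you can try against a throwaway directory first. The paths and file names below are demo placeholders, not your real layout:

```shell
# Sketch of the purge keyed on modification time (-mtime) rather than
# access time, restricted to the top level with -maxdepth 1 so files in
# subfolders are left alone. All paths below are throwaway demo values.
FOLDER_TO_PURGE=$(mktemp -d)
DAYS_TO_KEEP=7

mkdir "$FOLDER_TO_PURGE/keep"
touch -d "10 days ago" "$FOLDER_TO_PURGE/old.tgz"        # older than 7 days: purged
touch -d "10 days ago" "$FOLDER_TO_PURGE/keep/sub.tgz"   # in a subfolder: untouched
touch "$FOLDER_TO_PURGE/new.tgz"                         # recent: kept

find "$FOLDER_TO_PURGE" -maxdepth 1 -name '*.tgz' -mtime +$DAYS_TO_KEEP -exec rm -v '{}' \;

ls "$FOLDER_TO_PURGE"
```

Once it behaves the way you want on the demo directory, point FOLDER_TO_PURGE back at /mnt/backup1.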
Ok, never mind, I had to change the file extension to .tgz instead of .gz, and it seems to run now. But it hasn't done anything yet, so I guess I will just have to schedule a crontab and wait to see what happens on the 7th day, right?
The script you helped me out with is working great!! I have a full 7 days of backups. I was wondering if there is any way you could help me change the script up just a bit. Now that I have 7 days of full backups, is there any way I could do incremental backups instead of full ones every night, just writing the files that have changed to the backups that are already there? Is there any way we could change that up a bit? Your help would be greatly appreciated, thanks!
#!/bin/bash
## Configure FOLDERS_TO_PURGE to point the script to the right directory and the DAYS_TO_KEEP in order to tell it how far back to purge.
##
FOLDER_TO_PURGE=('/mnt/backup1')
DAYS_TO_KEEP=7
###########################
## Backup Purge Function ##
###########################
find $FOLDER_TO_PURGE/*.tgz* -N, --after-date DATE, --newer DATE -exec rm -v {} \;
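Those -N / --after-date / --newer switches you pasted are tar options, not find options, so they don't belong on that find line. For incrementals, the usual approach with GNU tar is its --listed-incremental (-g) snapshot file: tar records what it has already archived, so the first run against a fresh snapshot is a full backup, and every later run archives only what changed. Here's a hedged, self-contained sketch, with throwaway directories standing in for your real data and /mnt/backup1:

```shell
# Sketch of GNU tar incremental backups via --listed-incremental.
# SOURCE and DEST are throwaway stand-ins for your data and /mnt/backup1.
SOURCE=$(mktemp -d)
DEST=$(mktemp -d)
SNAPSHOT="$DEST/backup.snar"

echo "day one data" > "$SOURCE/a.txt"

# First run with a brand-new snapshot file: a full (level-0) backup.
tar --create --gzip --listed-incremental="$SNAPSHOT" \
    --file="$DEST/full.tgz" -C "$SOURCE" .

# Simulate the next night: one new file appears in the source tree.
echo "day two data" > "$SOURCE/b.txt"

# Second run reuses the snapshot: only the new/changed files are archived.
tar --create --gzip --listed-incremental="$SNAPSHOT" \
    --file="$DEST/incr.tgz" -C "$SOURCE" .

tar -tzf "$DEST/incr.tgz"   # should list ./ and ./b.txt, but not ./a.txt
```

To restore, you extract the full archive first and then each incremental in order. Note the snapshot file is precious: if you lose it, the next run becomes a full backup again.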
Ok thanks. I see what you mean; I'm just not sure where to put that in my script. I am so new to this, I need to be baby-stepped through it. I want it to do incremental backups every day. These backups run Monday through Sunday, and I already have full backups; I just want to write to the files that are already there whenever files change, instead of doing a full backup every day. It takes about 8 hours just to run, so it would be great if I could make it a lot faster. Sorry for being a pain in the neck.
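To run the nightly backup and the weekly-style purge without babysitting them, crontab is the standard tool. A minimal sketch, assuming the incremental backup script is saved at /usr/local/bin/backupIncr.sh and the purge at /usr/local/bin/backupPurge.sh (both paths hypothetical; adjust to wherever your scripts actually live):

```shell
# Hypothetical crontab entries (install with `crontab -e`):
# run the incremental backup at 1:00 AM and the purge at 6:00 AM, every day.
# Fields: minute hour day-of-month month day-of-week command
0 1 * * * /usr/local/bin/backupIncr.sh
0 6 * * * /usr/local/bin/backupPurge.sh
```

Staggering the two jobs like this keeps the purge from running while a backup is still being written.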