LinuxQuestions.org > Forums > Linux Forums > Linux - Server
Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
Old 02-09-2016, 05:10 PM   #1
islammanjurul
LQ Newbie
 
Registered: Dec 2015
Posts: 11

Rep: Reputation: Disabled
Cron task guide and help to backup a directory


Hi,

Apologies in advance if this is not the right place to ask, or if the title is off.

I have a DB server running PostgreSQL, and I periodically back up the database using pg_dump. It takes a dump every hour, so 24 dumps a day, scheduled in crontab. The backup script is configured so that before taking a new dump, it deletes the previous day's dump for that hour.
For example, before dumping at 2 AM today, it deletes yesterday's 2 AM folder. The backup folders are named "dump-%H", where %H is the hour (00, 01, 02 ... 23). So every day it deletes the previous day's "dump-%H" directory, recreates it empty, and dumps the DB into it. That way I always have dumps covering the last 24 hours in case of a crisis.

Now a situation has arisen: I need at least one dump directory to survive for a month.
My plan is to set aside one dump per week. For example, the dump generated on Monday at 3 AM will be moved to a different folder, with the 'dump-%H' directory renamed to '%date-dump-%H', using that Monday's date.
In the same way, every Monday the 3 AM dump will be kept aside. So for a given month I will have 4 weekly dumps readily available, in addition to the rolling 24-hour dumps I already have.
The next month, the first Monday's dump from the previous month will be removed, and the new Monday 3 AM dump will take its place under a new '%date-dump-%H' name.

I want to automate the whole thing with a script run from crontab. Can anyone help here?

I am also pasting my present backup script to give some idea of how the 24-hour backup works.
Code:
#!/bin/bash

export PGPASSWORD=password
backup_date=`date +%H`    # current hour, 00-23
echo "Dumping postgresql database [$backup_date]..."
# remove yesterday's dump for this hour; pg_dump -Fd then recreates the directory
rm -Rf "/home/backup/pg/dump-$backup_date"
pg_dump -U w3matter -h localhost -d example -Fd -j3 -f "/home/backup/pg/dump-$backup_date"
As the script shows, it does rm -Rf on the previous day's dump for the same hour, then creates the new dump for that hour of the present day.
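For reference, the hourly schedule described above corresponds to a crontab entry like the following (the script path and log location are assumptions; adjust them to your setup):

```shell
# Run the hourly pg_dump script at minute 0 of every hour.
# m  h  dom mon dow  command
0 * * * * /home/backup/pg/backup.sh >> /var/log/pg_backup.log 2>&1
```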
 
Old 02-10-2016, 03:52 AM   #2
sag47
Senior Member
 
Registered: Sep 2009
Location: Orange County, CA
Distribution: Kubuntu x64, Raspbian, CentOS
Posts: 1,851
Blog Entries: 36

Rep: Reputation: 455
Before I go to bed, check out "man 5 crontab" in the terminal.
 
Old 02-10-2016, 07:05 AM   #3
fmattheus
Member
 
Registered: Nov 2015
Posts: 104

Rep: Reputation: 38
Yes, you've got it. Add a cron entry that runs Mondays at 3 AM (3:01 would be better, so you avoid problems during the daylight-saving shift). All it needs to do is copy the 2:00 dump to a separate directory.
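A minimal sketch of that weekly job (untested; the paths and naming are assumptions based on the dump-%H layout from the original script):

```shell
#!/bin/bash
# Sketch: copy one hour's dump to a weekly archive, stamped with the date.
# Suggested crontab entry:  1 3 * * 1 /path/to/weekly_copy.sh

weekly_copy() {
    local src="$1"            # where the hourly dumps live, e.g. /home/backup/pg
    local dest="$2"           # weekly archive directory
    local hour="${3:-02}"     # which hour's dump to keep
    mkdir -p "$dest"
    # cp -a preserves timestamps, so a later "find -mtime" prune still works
    cp -a "$src/dump-$hour" "$dest/$(date +%Y%m%d)-dump-$hour"
}

# Example: weekly_copy /home/backup/pg /home/backup/pg/weekly 02
```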
 
1 members found this post helpful.
Old 02-10-2016, 07:44 AM   #4
michaelk
Moderator
 
Registered: Aug 2002
Posts: 16,427

Rep: Reputation: 1938
You could modify your original script with an if/else conditional to save the Mon 3 AM files under the date-dump-%H name. To remove the month-old files you can use the find command and delete files older than 27 days.

Alternatively, copy your original script into a second one that runs only on Mon at 3 AM and saves to the date-dump-%H format. You might want it to start after the regular script completes, if that time is known. It will create two copies, but only on Mondays. Again, use find to delete files older than 27 days.
 
1 members found this post helpful.
Old 02-10-2016, 07:45 AM   #5
islammanjurul
LQ Newbie
 
Registered: Dec 2015
Posts: 11

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by fmattheus View Post
Yes, you've got it. Do a cron entry that runs Mondays at 3 am. (3:01 would be better so you avoid problems during daylight savings shift) All it needs to do is copy the 2:00 file to a separate directory.
I understand the copying part and have already done it. But how will I delete the oldest directory once there are 4 of them? A given month has 4 Mondays, so when the next month starts, the oldest directory should be removed. How do I determine which directory is the oldest? I want to keep exactly 4 Monday backups, no more, to save storage space.
 
Old 02-10-2016, 08:03 AM   #6
fmattheus
Member
 
Registered: Nov 2015
Posts: 104

Rep: Reputation: 38
There are many ways around that.

As michaelk said, you can run a find command before copying that deletes everything in a directory older than x days
Code:
find dir -mtime +x -delete
Or you can rotate: rename each file before creating the new one, like this
Code:
rm file.4
mv file.3 file.4
mv file.2 file.3
mv file.1 file.2
cp backupfile file.1
This is a hideous hack; I leave it to you to make a nice loop. I purposely wrote it out simply so you'd understand the concept.
 
1 members found this post helpful.
Old 02-10-2016, 09:04 AM   #7
michaelk
Moderator
 
Registered: Aug 2002
Posts: 16,427

Rep: Reputation: 1938
Untested but basically
Code:
...
day=$( date +%a ) # day of week
backup_date=$( date +%H ) # hour
mon_backup=$( date +%m%d )-dump-$backup_date # Mon file name or whatever format as desired
... 
if ( $day == "Mon" && $backup_date == "03" ); then
  find /path/to/Mon_files/* -mtime +27 -exec rm {} \; # delete last months file.
  pg_dump ... -f /path/to/Mon_files/$mon_backup
else
  .... # rm and pg_dump same as original script
fi
 
1 members found this post helpful.
Old 02-10-2016, 09:16 AM   #8
islammanjurul
LQ Newbie
 
Registered: Dec 2015
Posts: 11

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by fmattheus View Post
There are many ways around that.

As michaelk said, you can run a find command before copying that deletes everything in a directory older than x days
Code:
find dir -mtime +x -delete
Or you can rotate: rename each file before creating the new one, like this
Code:
rm file.4
mv file.3 file.4
mv file.2 file.3
mv file.1 file.2
cp backupfile file.1
This is a hideous hack; I leave it to you to make a nice loop. I purposely wrote it out simply so you'd understand the concept.
This one goes over my head. Can you please elaborate a little on this trick? PM me if you cannot share it here.
 
Old 02-10-2016, 09:17 AM   #9
islammanjurul
LQ Newbie
 
Registered: Dec 2015
Posts: 11

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by michaelk View Post
Untested but basically
Code:
...
day=$( date +%a ) # day of week
backup_date=$( date +%H ) # hour
mon_backup=$( date +%m%d )-dump-$backup_date # Mon file name or whatever format as desired
... 
if ( $day == "Mon" && $backup_date == "03" ); then
  find /path/to/Mon_files/* -mtime +27 -exec rm {} \; # delete last months file.
  pg_dump ... -f /path/to/Mon_files/$mon_backup
else
  .... # rm and pg_dump same as original script
fi
I'll test this and repost here with the result. Some modifications may be needed; I'll make them and share the script if all goes well.
 
Old 02-10-2016, 09:22 AM   #10
fmattheus
Member
 
Registered: Nov 2015
Posts: 104

Rep: Reputation: 38
Quote:
This one looks beyond my head, can you please elaborate little about this trick, maybe PM if you cannot share here.
Then I would leave it as is, since you need to be able to maintain and understand it.

Something like this would be the loop, but I wouldn't recommend putting concepts you don't understand into production. The advantage is that you just change the value of files_to_keep if you need to keep more or fewer versions.
Code:
files_to_keep=4
rm -f file.$files_to_keep
# brace expansion doesn't take variables, so count down with a C-style loop
for (( num = files_to_keep - 1; num >= 1; num-- ))
do
    mv file.$num file.$((num + 1))
done

Last edited by fmattheus; 02-10-2016 at 09:25 AM.
 
1 members found this post helpful.
Old 02-10-2016, 09:38 AM   #11
michaelk
Moderator
 
Registered: Aug 2002
Posts: 16,427

Rep: Reputation: 1938
This if statement should work better.

Code:
if [ "$day" == "Mon" ] && [ "$backup_date" == "03" ]; then
 ...
else
  ...
fi
I don't think the OP wants to rotate files, just save a particular day/hour backup to another location. I will note that there will not then be a normal 3 AM backup file.
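One way to avoid losing the normal 3 AM backup (an untested sketch, not from the scripts above; paths, names, and the 27-day window are assumptions): let the hourly dump run unchanged, then copy the finished Monday 03 dump aside. The day and hour are parameters here only so the logic can be exercised on any day:

```shell
#!/bin/bash
# Sketch: archive Monday's 03:00 dump without disturbing the hourly set.
archive_monday() {
    local base="$1"                    # hourly dump dir, e.g. /home/backup/pg
    local keep="$2"                    # monthly archive dir
    local day="${3:-$(date +%a)}"      # injectable for testing
    local hour="${4:-$(date +%H)}"
    if [ "$day" = "Mon" ] && [ "$hour" = "03" ]; then
        mkdir -p "$keep"
        # drop copies older than 27 days, then store this week's copy
        find "$keep" -mindepth 1 -maxdepth 1 -mtime +27 -exec rm -rf {} \;
        cp -a "$base/dump-03" "$keep/$(date +%m%d)-dump-03"
    fi
}
```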

Last edited by michaelk; 02-10-2016 at 09:42 AM.
 
1 members found this post helpful.
Old 02-10-2016, 10:01 AM   #12
fmattheus
Member
 
Registered: Nov 2015
Posts: 104

Rep: Reputation: 38
Quote:
Originally Posted by michaelk View Post
I don't think the OP wants to rotate files just save a particular day/hour backup to another location. I will note that there will not be a normal 3 AM backup file.
The user wants to keep a backup from each of the last 4 Mondays. This can be achieved by deleting old files with find, or by rotating ...
 
1 members found this post helpful.
Old 02-10-2016, 10:50 AM   #13
islammanjurul
LQ Newbie
 
Registered: Dec 2015
Posts: 11

Original Poster
Rep: Reputation: Disabled
I slightly misunderstood the requirement from my senior. Here it is:

We will keep one particular hour's backup every day, Monday to Sunday; say the 3 AM backup directory gets moved. So in total we will have 7 backup directories, renamed to date and day, like '10022016_Wed_example-03' (10022016 is the date, Wed the weekday, 'example' the directory name, and -03 the backup hour). The next day, the newly created 3 AM backup is moved and renamed to '11022016_Thu_example-03', and so on for the seven days of the week. After taking and moving Sunday's backup, I would like to keep one random or specific directory, say Friday's, and delete all the others. The next week produces another 7 directories with new dates, plus last week's surviving (Friday) directory, for a total of 8. At the end of that week we again keep one of the current week's directories, random or specific, but not last week's Friday one.

It continues like this. By the end of the 5th week's Sunday we will have 11 directories in total:
1. First week's one
2. Second week's one
3. Third week's one
4. Fourth week's one
5. Fifth week's seven (the new month's first week)
After selecting one from the fifth week (by the same method as before, to be defined in the script), we are down to 5 directories, and now the oldest one (the first week's directory) should be deleted automatically, leaving 4 again. At the end of the sixth week the loop continues: 11 directories, remove 6 of the sixth week's, plus remove the oldest one (the second week's directory).

I hope I have described my senior's requirement clearly, but I am stuck on how to implement it properly: so many loops and if/elses.

@michaelk and @fmattheus, please help a little. Just give me some ideas and I will be able to prepare the script from there.
 
Old 02-11-2016, 07:49 AM   #14
michaelk
Moderator
 
Registered: Aug 2002
Posts: 16,427

Rep: Reputation: 1938
You can delete everything except the desired backup, based on the file name, each week. It gets a bit more complicated if you really want to keep a random file each week; that would require bash's RANDOM variable.

pseudo code
Code:
rm old hourly file
pg_dump new hourly file
if ( hour == "03" ); then
   mv hourly file /path/to/saved/weekly/files/
   if ( day_of_week == "Sun" ); then
      rm all except saved day
      mv save_day /path/to/month/files/
      delete files older than x days from /path/to/month/files
   fi
fi
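In bash, that pseudocode might look roughly like this (an untested sketch; the paths, the 'example' name from the OP's scheme, and the kept weekday are all assumptions to adapt; day and hour are parameters so the rotation can be exercised on any day):

```shell
#!/bin/bash
# Sketch of the weekly/monthly rotation described above.
rotate_weekly() {
    local src="$1" weekly="$2" monthly="$3"   # hourly, weekly, monthly dirs
    local day="${4:-$(date +%a)}"             # injectable for testing
    local hour="${5:-03}"
    local keep_day="${6:-Fri}"                # the weekday that survives the week
    mkdir -p "$weekly" "$monthly"
    # move today's dump into the weekly set, renamed ddmmyyyy_Day_example-HH
    mv "$src/dump-$hour" "$weekly/$(date +%d%m%Y)_${day}_example-$hour"
    if [ "$day" = "Sun" ]; then
        # keep only the chosen weekday's dump; promote it to the monthly set
        local d
        for d in "$weekly"/*; do
            case "$(basename "$d")" in
                *"_${keep_day}_"*) mv "$d" "$monthly/" ;;
                *)                 rm -rf "$d" ;;
            esac
        done
        # drop monthly copies older than roughly four weeks
        find "$monthly" -mindepth 1 -maxdepth 1 -mtime +27 -exec rm -rf {} \;
    fi
}
```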

Last edited by michaelk; 02-11-2016 at 07:52 AM.
 
1 members found this post helpful.
  


Tags: bash, cron, schedule, script

