Old 09-29-2015, 07:08 PM   #1
FredGSanford
Senior Member
 
Registered: Nov 2005
Location: USA
Distribution: Mageia 7 - Debian 10 - Artix Linux
Posts: 1,142
Blog Entries: 5

Rep: Reputation: 207
Backup Script


Is this the correct format if I want a backup script to run weekly? I will put the script, which uses rsync, in the cron.weekly directory.

#Today's date:
WEEK0=`date -I`

#Last week's date:
WEEK1=`date -I -d "1 week ago"`

I will run the initial backup myself and then want it to run weekly thereafter.
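For reference, a script dropped into /etc/cron.weekly is executed by run-parts, so on most distributions it has to be executable and, on Debian-style systems, its filename must not contain a dot or run-parts will silently skip it. A rough sketch of the setup (the filename here is just an example):
Code:
# hypothetical filename; note: no ".sh" extension, or run-parts would skip it
sudo cp backup-weekly /etc/cron.weekly/backup-weekly
sudo chmod 755 /etc/cron.weekly/backup-weekly

# list what run-parts would actually execute, without running anything
run-parts --test /etc/cron.weekly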

Any suggestions or corrections appreciated.

Thanks.
 
Old 09-29-2015, 10:04 PM   #2
Beryllos
Member
 
Registered: Apr 2013
Location: Massachusetts
Distribution: Debian
Posts: 529

Rep: Reputation: 319
I assume you need the variable WEEK1 in order to access the previous backup. That's fine as long as the backups run every week without fail, but what if something goes wrong and the backup misses a week? That could happen if, for example, the system was down, or the rsync destination was offline or full. You must design your script to work even if the last backup was not 7 days ago.

A more robust approach would be to identify the previous backup regardless of its date, and use it if its date is within an acceptable range. If your backup sets are in directories named according to $(date -I), which yields YYYY-MM-DD, it is simple to identify the most recent set:
Code:
last_backup=$(ls -d [0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9] | tail -1)
You can determine if its date is acceptable by bash string comparison, for example,
Code:
if [[ $last_backup > $(date -I -d "3 week ago") ]]
then
    echo "last backup set is less than 3 weeks old"
else
    echo "last backup set is 3 weeks or more old, or non-existent"
fi
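Putting those two pieces together, a sketch of how the --link-dest argument might be chosen on each run (the backup root is just a placeholder here, and the 3-week cutoff is arbitrary):
Code:
BACKUP_ROOT=/path/to/backups        # placeholder for the real destination
cd "$BACKUP_ROOT" || exit 1

# most recent YYYY-MM-DD directory, if any
last_backup=$(ls -d [0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9] 2>/dev/null | tail -1)

if [[ -n $last_backup && $last_backup > $(date -I -d "3 weeks ago") ]]
then
    link_opt="--link-dest=$BACKUP_ROOT/$last_backup"   # recent enough: hard-link against it
else
    link_opt=""                                        # too old or missing: do a full copy
fi
As long as $link_opt is left unquoted on the rsync command line, the empty case simply disappears after word splitting, so the same rsync invocation works for both branches.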
 
1 member found this post helpful.
Old 09-29-2015, 11:31 PM   #3
FredGSanford
Senior Member
 
Registered: Nov 2005
Location: USA
Distribution: Mageia 7 - Debian 10 - Artix Linux
Posts: 1,142

Original Poster
Blog Entries: 5

Rep: Reputation: 207
Beryllos, you are correct that it will not run every week; it may run for weeks and then be down for a while. Thanks for the input. I'll see about adjusting my script and will post the full script when I'm at home.

Thanks again.
 
Old 09-30-2015, 09:55 AM   #4
FredGSanford
Senior Member
 
Registered: Nov 2005
Location: USA
Distribution: Mageia 7 - Debian 10 - Artix Linux
Posts: 1,142

Original Poster
Blog Entries: 5

Rep: Reputation: 207
This is my complete backup script.

Code:
#!/bin/bash

#Today's date in ISO-8601 format:
WEEK0=`date -I`

#Last week's date in ISO-8601 format:
WEEK1=`date -I -d "1 week ago"`

#The source directory:
SRC="/home/saptech/"

#The target directory:
TRG="/mnt/usb/Funtoo-Bkup/sappy-user/$WEEK0"

#The link destination directory:
LNK="/mnt/usb/Funtoo-Bkup/sappy-user/$WEEK1"

#The rsync options:
OPT="-avh --delete --link-dest=$LNK"

#Execute the backup
rsync $OPT $SRC $TRG
Any suggestions to make it better are appreciated.
 
Old 09-30-2015, 12:01 PM   #5
Beryllos
Member
 
Registered: Apr 2013
Location: Massachusetts
Distribution: Debian
Posts: 529

Rep: Reputation: 319
Looks good, short and sweet.

Since the script creates a new destination directory every time, the --delete option does nothing and you could omit it. (If you were syncing an existing destination directory, --delete would delete files on the destination which had been deleted on the source.)

The -v option may result in a lengthy output, which I think cron will e-mail to you. If you prefer, you could redirect the output to a log file.
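For example, reusing the variables from the script above (the log location is only an illustration):
Code:
# append rsync's output to a dated log instead of letting cron mail it
LOG="/var/log/backup-$WEEK0.log"
rsync $OPT "$SRC" "$TRG" >>"$LOG" 2>&1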

If you want to identify the most recent backup even if it is not from 1 week ago, you could cd to the destination parent directory and then use a command like the one I posted previously:
Code:
cd "/mnt/usb/Funtoo-Bkup/sappy-user"
LNK="$(pwd)"/"$(ls -d [0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9] | tail -1)"
In this example, the quotation marks could be omitted, but they are necessary when there are spaces in the directory names. Edit: ...but then $OPT would also have to be constructed differently so that $LNK stays properly quoted.
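One way around that quoting issue, just as a sketch rather than a change to the script above, is to build the options as a bash array instead of a single string, so each option stays one word even if $LNK contains spaces:
Code:
# each array element is passed to rsync as a single argument
OPT=(-avh --link-dest="$LNK")
rsync "${OPT[@]}" "$SRC" "$TRG"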

Last edited by Beryllos; 09-30-2015 at 12:15 PM. Reason: Blue text about quoting quotation marks when they are needed
 
1 member found this post helpful.
Old 09-30-2015, 06:54 PM   #6
jmgibson1981
Senior Member
 
Registered: Jun 2015
Location: Tucson, AZ USA
Distribution: Debian
Posts: 1,140

Rep: Reputation: 392
You could make it more universal, maybe? This is what I'm using for incrementals. I put it together after googling "time machine on linux". This one is clearly for a remote destination, but it's easily modified for local use.

Code:
#!/bin/bash
#mybackups
#tadaen sylvermane | jason gibson
# notes - $1 is name of folder being backed up | $2 is /path/to/folder being backed up

##### variables #####

DATABACKUP=/spinner/users/"$USER"/"$HOSTNAME" # path to backup to on server
DATADAYS=10 # days to keep backups
NOW=$(date +%Y.%m.%d.%H.%M) # time when script is run
RSYNCOPT=auz # rsync options
SERVER=10.0.1.250 # server ip

##### begin script #####

if ping -c 1 "$SERVER" ; then
	if [[ -e "$2" ]] ; then
		if ssh "$USER"@"$SERVER" "[[ ! -e ${DATABACKUP}/${1} ]]" ; then
			ssh "$USER"@"$SERVER" "mkdir ${DATABACKUP}/${1}"
		fi
		if ssh "$USER"@"$SERVER" "[[ -e ${DATABACKUP}/${1}/current ]]" ; then
			# incremental backups, this is the one normally run
			rsync -"$RSYNCOPT" --link-dest="$DATABACKUP"/"$1"/current "$2" "$USER"@"$SERVER":"$DATABACKUP"/"$1"/"$1"."$NOW"
			ssh "$USER"@"$SERVER" "rm -f ${DATABACKUP}/${1}/current"
		else
			# initial backup, typically runs once
			rsync -"$RSYNCOPT" "$2" "$USER"@"$SERVER":"$DATABACKUP"/"$1"/"$1"."$NOW"
		fi
		ssh "$USER"@"$SERVER" "ln -s ${DATABACKUP}/${1}/${1}.${NOW} ${DATABACKUP}/${1}/current"
		ssh "$USER"@"$SERVER" "find ${DATABACKUP}/${1}/ -mindepth 1 -maxdepth 1 -type d -atime +${DATADAYS} -exec rm -rf {} \;"
	fi
fi

##### end script #####
I think you want some tests to make sure your backup destination is mounted (a check like the sketch below would do); otherwise it will just back up into the empty mount point and eat up space you won't even know is missing, depending on how often your backup drive is unplugged or removed. I also see no reason to run this as root. Unless you start backing up things outside of /home/$USER, you should probably just run it from your personal crontab.
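Something along these lines at the top of the script would cover the mount test (the path is the one from the earlier script):
Code:
# abort if the USB drive is not actually mounted, so we never fill up
# the empty mount point by mistake
if ! mountpoint -q /mnt/usb ; then
    echo "backup target not mounted, skipping backup" >&2
    exit 1
fi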

Last edited by jmgibson1981; 09-30-2015 at 07:14 PM.
 
1 member found this post helpful.
Old 10-10-2015, 01:13 PM   #7
FredGSanford
Senior Member
 
Registered: Nov 2005
Location: USA
Distribution: Mageia 7 - Debian 10 - Artix Linux
Posts: 1,142

Original Poster
Blog Entries: 5

Rep: Reputation: 207
Thanks all. I have it sorted out!
 
Old 10-19-2015, 01:08 PM   #8
debguy
Member
 
Registered: Oct 2014
Location: U.S.A.
Distribution: mixed, mostly Debian and Slackware today
Posts: 207

Rep: Reputation: 19
Yes, a suggestion:

The Unix date can be unreliable for any of several reasons: locale problems (or, in fact, bugs in hacks to the locale rules), the hardware clock being set wrong at boot time, incorrect time updates, power issues, a dead clock battery, and more. Yet another issue is that, without care, the timestamps on files can be incorrect. And finally there are timezone changes: the local rules versus what your current locality is actually using, which is set by edict rather than by any scientific system.

Never rely on the time of day to decide whether a backup has been done or whether files can be removed. And if you do compare times, be very cautious: an old file may look newer if the system was ever, by mistake, running with the wrong time.

TIP: be careful with time

(Microsoft is infamous for clobbering time, but given enough mistakes Linux will do it too. Lost timestamps are a sure sign that either someone unknowledgeable had access or that the admin did not care about the timestamps on those files. If you see a mirror with an incorrect time on source_code.tar.gz, that is a BIG problem; the people running the mirror should be dismissed, because it makes life 100 times harder for everyone who needs access to the library.)
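To make that concrete: one clock-independent approach (only a sketch; the "latest" symlink is not part of the scripts above) is to record the last successful backup explicitly instead of computing it from today's date:
Code:
# after a successful rsync, mark the newest set
ln -sfn "$TRG" /mnt/usb/Funtoo-Bkup/sappy-user/latest

# on the next run, link against whatever "latest" points to,
# regardless of what the clock claims
LNK=$(readlink -f /mnt/usb/Funtoo-Bkup/sappy-user/latest)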
 
  

