Old 05-17-2017, 04:44 PM   #1
JamesM83
LQ Newbie
 
Registered: May 2017
Posts: 2

Rep: Reputation: Disabled
more complex rsync cron job script


Hi,

I am having trouble setting up a bash script and thought I would ask here for some help. I have the following setup:

(1) two servers, A and B, running Ubuntu
(2) server A constantly downloads data into a deep subdirectory structure under ~/data/; each download finishes within at most 15 minutes
(3) once per day I want to move data that is older than 15 minutes from ~/data/ to ~/transferdata/, preserving the subdirectory structure
(4) this should then be transferred via rsync to server B:~/data/
(5) if that succeeds, the directory A:~/transferdata/ should be tarred and gzipped into A:~/transfers/[currentDate.tar.gz], and if that is successful, the directory should be cleared
(6) if all of this is successful, another script called notifySuccessB.sh should be executed; on any failure along the way, notifyErrorB.sh should be executed instead

Could you give me some hints, especially on (3), since I couldn't find an rsync flag that filters on modification date, and on (6), since I don't really know how to react to errors within a script?

Thanks, James
 
Old 05-17-2017, 04:51 PM   #2
suicidaleggroll
LQ Guru
 
Registered: Nov 2010
Location: Colorado
Distribution: OpenSUSE, CentOS
Posts: 5,573

Rep: Reputation: 2142
For (3), you could use find's -cmin (or -mmin) flag to find the files older than 15 minutes, then use -exec to create the necessary directory structure at the destination and copy the file over.

Checking for errors in a script is straightforward, but you have to do it yourself; the script won't do it for you. The variable "$?" always contains the exit code of the most recently executed command. 99.99999% of the time an exit code of 0 means the command succeeded and anything else means something went wrong. You'd have to check the program's documentation to see what each exit code means, but it's generally a safe bet to treat 0 as good and anything non-zero as bad. For example:

Code:
# $? holds the exit status of whichever command ran last, so grab it right away
rsync -av "$dir1" "$dir2"
stat=$?
if [[ $stat -ne 0 ]]; then
   echo "rsync failed with exit code $stat"
fi

tar -zcf "$(date +%Y%m%d).tgz" "$dir1"
stat=$?
if [[ $stat -ne 0 ]]; then
   echo "tar failed with exit code $stat"
fi
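A minimal sketch of how that pattern could be wired up to the notify scripts from the first post (the run_or_abort helper, the paths, and the serverB host name are hypothetical; only notifySuccessB.sh and notifyErrorB.sh come from the OP's description):

Code:
#!/bin/bash
# Hypothetical daily transfer script: stop and notify on the first failure.

notify_error() {
   ~/notifyErrorB.sh        # OP's error-notification script (location assumed)
   exit 1
}

# Run a command; if its exit code is non-zero, report it and abort.
run_or_abort() {
   "$@"
   local stat=$?
   if [[ $stat -ne 0 ]]; then
      echo "'$*' failed with exit code $stat" >&2
      notify_error
   fi
}

run_or_abort rsync -av ~/transferdata/ serverB:~/data/
run_or_abort tar -zcf ~/transfers/"$(date +%Y%m%d)".tar.gz -C ~/transferdata .
run_or_abort find ~/transferdata/ -mindepth 1 -delete    # clear only after the tar succeeded

~/notifySuccessB.sh          # OP's success-notification script (location assumed)

You would then schedule it once a day from cron with something like 0 2 * * * /home/user/bin/daily_transfer.sh (the time and path are placeholders).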
 
Old 05-17-2017, 08:35 PM   #3
syg00
LQ Veteran
 
Registered: Aug 2003
Location: Australia
Distribution: Lots ...
Posts: 21,119

Rep: Reputation: 4120
I would be (extremely) concerned about the complexity of this, and the fragility that is likely to result.
I'm a big believer in snapshots - have been for over a decade. I would snapshot once a day, ignoring the 15-minute requirement. No data movement, and it's instantaneous. Then rsync, tar, do whatever you want with the snapshot as the source, knowing the data can't change.
When it's done, delete the snapshot to recover any space allocated in the meantime. LVM offers this functionality.

At the target end, worry about the 15-minute problem - it's easier to delete the last 15 minutes of files than to copy/move 23 hours 45 minutes' worth ...
KISS.
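A rough outline of that approach, assuming ~/data lives on an LVM logical volume (the volume group name, snapshot size, and mount point below are placeholders):

Code:
#!/bin/bash
# Take a point-in-time snapshot of the data volume (names and size are placeholders).
lvcreate --snapshot --size 5G --name data_snap /dev/vg0/data

# Mount the snapshot read-only and use it as a stable source.
mkdir -p /mnt/data_snap
mount -o ro /dev/vg0/data_snap /mnt/data_snap

rsync -av /mnt/data_snap/ serverB:~/data/
tar -zcf ~/transfers/"$(date +%Y%m%d)".tar.gz -C /mnt/data_snap .

# Release the snapshot to recover the copy-on-write space it accumulated.
umount /mnt/data_snap
lvremove -f /dev/vg0/data_snap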
 
Old 05-18-2017, 06:04 AM   #4
JeremyBoden
Senior Member
 
Registered: Nov 2011
Location: London, UK
Distribution: Debian
Posts: 1,947

Rep: Reputation: 511
The OP's suggested procedure is very fragile for data - how would a restore of data work?
 
Old 05-18-2017, 07:54 AM   #5
JamesM83
LQ Newbie
 
Registered: May 2017
Posts: 2

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by suicidaleggroll View Post
For (3), you could use find's -cmin (or -mmin) flag to find the files older than 15 minutes, then use -exec to create the necessary directory structure at the destination and copy the file over.
How would I make the necessary directory structure?


Unfortunately I am not able to create snapshots on that machine, since I am not allowed to make changes to the LVM.

In what sense is it very fragile?

Regarding restoring data: this is actually not a concern. As soon as the data is on server B it no longer matters; server A is purely a server that receives the data and should forward it once a day. No local backup is really needed (step 5 is only there as a sanity check in the beginning).

Last edited by JamesM83; 05-18-2017 at 07:59 AM.
 
Old 05-18-2017, 08:26 AM   #6
Turbocapitalist
LQ Guru
 
Registered: Apr 2005
Distribution: Linux Mint, Devuan, OpenBSD
Posts: 7,289
Blog Entries: 3

Rep: Reputation: 3718
Directory structure aside, one option is to have incron monitor the directory for completed files using the IN_CLOSE_WRITE trigger. That could be used to call rsync on the file or the newest files.
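A rough sketch of what such an incrontab entry could look like (the watched path and host are placeholders; note that an inotify/incron watch covers a single directory, so deep subdirectories would each need their own entry or a generated table):

Code:
# incrontab -e  --  format: <watched path> <event mask> <command>
# $@ expands to the watched directory and $# to the file that triggered the event.
/home/user/data IN_CLOSE_WRITE /usr/bin/rsync -a $@/$# serverB:~/data/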
 
Old 05-18-2017, 08:59 AM   #7
JeremyBoden
Senior Member
 
Registered: Nov 2011
Location: London, UK
Distribution: Debian
Posts: 1,947

Rep: Reputation: 511
Provided that all files are forced to disk on server A, that there are no incomplete transactions at the fifteen-minute boundary, and that the rsync doesn't fail mid-flight, there is no problem.

I would suggest you probably need to close any databases before the start of the rsync and re-open them on successful completion.
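As a hedged illustration of that suggestion (the service name is hypothetical):

Code:
systemctl stop exampledb.service       # close the database before the transfer
if rsync -av ~/transferdata/ serverB:~/data/; then
   systemctl start exampledb.service   # re-open it only on successful completion
fi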
 
Old 05-18-2017, 10:20 AM   #8
suicidaleggroll
LQ Guru
 
Registered: Nov 2010
Location: Colorado
Distribution: OpenSUSE, CentOS
Posts: 5,573

Rep: Reputation: 2142
Quote:
Originally Posted by JamesM83 View Post
How would I make the necessary directory structure?
A combination of dirname and mkdir -p. It may make the most sense to write your own little script to do it and call that from -exec (make it executable and call it by its full path or put it in your PATH), e.g.:
Code:
find . -mmin +15 -exec mycopy "{}" "$dir2" \;
mycopy:
Code:
#!/bin/bash
# mycopy <source file> <destination root>
if [[ $# -eq 2 ]]; then
   if [[ -f "$1" ]]; then
      # Recreate the file's relative directory under the destination, then copy.
      path=$(dirname "$1")
      if [[ ! -d "$2/$path" ]]; then
         mkdir -p "$2/$path"
      fi
      cp -a "$1" "$2/$path/"
   fi
fi
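For what it's worth, on GNU coreutils a similar structure-preserving copy can be done in one line with cp's --parents flag (a sketch; run it from inside ~/data so the relative paths get recreated under the destination):

Code:
cd ~/data && find . -type f -mmin +15 -exec cp -a --parents {} ~/transferdata/ \;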

Last edited by suicidaleggroll; 05-18-2017 at 10:22 AM.
 
  

