LinuxQuestions.org
Linux - General
Old 08-28-2007, 04:38 AM   #1
tajamari
Member
 
Registered: Jul 2007
Distribution: Red Hat CentOS Ubuntu FreeBSD OpenSuSe
Posts: 252

Rep: Reputation: 32
rsync only 60-day old files


I want to rsync only files that are at least 60 days old. What parameters do I need to transfer only those files?
 
Old 08-28-2007, 07:01 AM   #2
IBall
Senior Member
 
Registered: Nov 2003
Location: Perth, Western Australia
Distribution: Ubuntu, Debian, Various using VMWare
Posts: 2,088

Rep: Reputation: 62
As far as I can see, there is no option to rsync that allows this.

Would it be possible to use find to locate files that meet your criteria, copy them to a temporary location, rsync that to the remote location, and then remove the temporary copy?

--Ian
 
Old 08-29-2007, 12:00 AM   #3
tajamari
Member
 
Registered: Jul 2007
Distribution: Red Hat CentOS Ubuntu FreeBSD OpenSuSe
Posts: 252

Original Poster
Rep: Reputation: 32
Quote:
Originally Posted by IBall View Post
As far as I can see, there is no option to rsync that allows this. [...]
Ian,

I used find and then executed my rsync script, but it does not recursively transfer the directories and their subdirectories; it only transfers the located files into the directory I defined.
 
Old 08-29-2007, 12:18 AM   #4
tajamari
Member
 
Registered: Jul 2007
Distribution: Red Hat CentOS Ubuntu FreeBSD OpenSuSe
Posts: 252

Original Poster
Rep: Reputation: 32
Quote:
Originally Posted by tajamari View Post
I used find and then executed my rsync script, but it does not recursively transfer the directories [...]
Below are my scripts running on the primary data server. My problem is that they do not transfer the subfolders under /home/data/accounts/state to /home/data/accounts/state on the backup machine; only files are transferred. Any help, please?

#FIND SCRIPT
#!/bin/bash
DAYS=60
DIR="/home/data/accounts/state"

date
# Run the transfer script once for each file older than $DAYS days
find "$DIR" -depth -mindepth 1 -mtime +$DAYS -exec sh transfertobackup.sh \;
date

#RSYNC SCRIPT - transfertobackup.sh
#!/bin/bash

# Transfer all subfolders under /home/data/accounts
#---------------------------------------------------------------

rsync --stats -auvz -e "ssh -p10022" --delete --recursive --times -og /home/data/accounts/state/ testaccount@192.168.10.15:/home/data/accounts/state/
 
Old 08-29-2007, 12:03 PM   #5
choogendyk
Senior Member
 
Registered: Aug 2007
Location: Massachusetts, USA
Distribution: Solaris 9 & 10, Mac OS X, Ubuntu Server
Posts: 1,197

Rep: Reputation: 105
I'm having a little trouble with your conceptual logic. You find anything that was last modified 60 days or more ago. Then you want to recursively copy stuff below that. But it seems the find would already include things below that. In addition, some of the items in subfolders may have been modified more recently. They could affect the mtime of their subfolder but not higher up folders. So . . .

I've done somewhat similar things using cpio. Mine are incrementals, but you could change it to oldamentals ;-) Anyway, here is a segment of my script:

Code:
case "$DAY" in
Mon)
  # delete previous contents and then do incrementals from Friday
  rm -r ${ADIR}; mkdir ${ADIR};
  find . -mtime -3 | cpio -oa 2>/dev/null | ( cd ${ADIR} && cpio -imd );;
Tue|Wed|Thu)
  # delete previous contents and then do incrementals
  rm -r ${ADIR}; mkdir ${ADIR};
  find . -mtime -1 | cpio -oa 2>/dev/null | ( cd ${ADIR} && cpio -imd );;
Fri)
  # delete previous contents and then do full copies
  rm -r ${ADIR}; mkdir ${ADIR};
  find . | cpio -oa 2>/dev/null | ( cd ${ADIR} && cpio -imd );;
esac
Of course, ADIR is the daily directory inside the archive directory. Then you can rsync from there to a remote system if you want. Or, what I do, have the script just tar up the daily directory, gzip it, and scp it.

Just another alternative.
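The final tar-and-scp step described above might look like this sketch; ARCHIVE, the day directory, and the remote host and path are assumptions, not taken from the post:

```shell
#!/bin/bash
# Tar up the daily directory, gzip it, and copy it off-host.
# ARCHIVE and backupuser@backuphost:/backups/ are illustrative placeholders.
ARCHIVE=/archive
DAY=$(date +%a)                      # e.g. Mon, matching the case labels above
tar -czf "$ARCHIVE/$DAY.tar.gz" -C "$ARCHIVE" "$DAY"
scp "$ARCHIVE/$DAY.tar.gz" backupuser@backuphost:/backups/
```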
 
Old 08-29-2007, 01:40 PM   #6
colucix
LQ Guru
 
Registered: Sep 2003
Location: Bologna
Distribution: CentOS 6.5 OpenSuSE 12.3
Posts: 10,509

Rep: Reputation: 1983
Quote:
Below are my scripts running on the primary data server. My problem is that they do not transfer the subfolders under /home/data/accounts/state to /home/data/accounts/state on the backup machine; only files are transferred.
This is true because the find command outputs filenames only, and you give rsync no information about where to put the files other than /home/data/accounts/state.
First, I suggest creating on the remote machine the same directory structure as on the local machine. This can be done with rsync like this:
Code:
rsync -av --include '*/' --exclude '*'
Second, you can parse the output of the find command to retrieve each file's directory name (path) and tell rsync to put every file into its own directory. Using your specifications, I would do something like:
Code:
#!/bin/bash
DAYS=60
DIR=/home/data/accounts/state
cd $DIR
rsync -av --include '*/' --exclude '*' -e "ssh -p10022" . testaccount@192.168.10.15:$DIR
for file in `find . -depth -mindepth 1 -mtime +$DAYS`
do
  backup_dir=$(dirname $file)
  rsync --stats -auvz -e "ssh -p10022" --delete --times -og $file testaccount@192.168.10.15:$DIR/$backup_dir
done
If you don't want to copy the whole directory tree, but only the relevant directories (that is, only the locations of the files you want to back up), you can simply create them via ssh, something like:
Code:
#!/bin/bash
DAYS=60
DIR=/home/data/accounts/state
cd $DIR
for file in `find . -depth -mindepth 1 -mtime +$DAYS`
do
  backup_dir=$(dirname $file)
  ssh -p 10022 testaccount@192.168.10.15 mkdir -p $DIR/$backup_dir
  rsync --stats -auvz -e "ssh -p10022" --delete --times -og $file testaccount@192.168.10.15:$DIR/$backup_dir
done
Hope this helps. For obvious reasons I have not tested all the options to the find and rsync commands, so - please - double check and test with dummy files!

Last edited by colucix; 08-29-2007 at 01:42 PM.
 
Old 08-29-2007, 10:09 PM   #7
tajamari
Member
 
Registered: Jul 2007
Distribution: Red Hat CentOS Ubuntu FreeBSD OpenSuSe
Posts: 252

Original Poster
Rep: Reputation: 32
Quote:
Originally Posted by colucix View Post
This is true because the find command outputs filenames only [...] please double check and test with dummy files!

Thanks, guys, for your help. I'll give you feedback once I get results.
 
Old 03-10-2010, 02:50 PM   #8
enfran
LQ Newbie
 
Registered: Mar 2010
Posts: 1

Rep: Reputation: 0
For those still looking for a way to implement an rsync restricted to files newer or older than a specific date, as I was, I summarize here the three simple steps I finally used.

1) find ./ -daystart -mtime +N >myfile
i.e., write down everything that is older than N days

2) sed 's|^\./||' myfile >EXCL_list
i.e., strip off the unneeded "./" prefix

3) rsync -Pav --exclude-from=EXCL_list --delete-excluded source/ dest
i.e., apply the restricted rsync, as requested

Hope this is helpful to someone...
 
  

