Linux - General
This Linux forum is for general Linux questions and discussion. If it is Linux related and doesn't seem to fit in any other forum, then this is the place.
Distribution: Ubuntu, Debian, Various using VMWare
Posts: 2,088
Rep:
As far as I can see, there is no option to rsync that allows this.
Would it be possible to use find to locate the files that meet your criteria, copy them to a temporary location, rsync that to the remote location, and then remove the temporary copy?
Distribution: Red Hat CentOS Ubuntu FreeBSD OpenSuSe
Posts: 252
Original Poster
Rep:
Quote:
Originally Posted by IBall
As far as I can see, there is no option to rsync that allows this.
Would it be possible to use find to locate the files that meet your criteria, copy them to a temporary location, rsync that to the remote location, and then remove the temporary copy?
--Ian
Ian,
I used find and then executed my rsync script, but it does not recursively transfer the directories and their subdirectories; it only transfers the located files into the directory I defined.
Distribution: Red Hat CentOS Ubuntu FreeBSD OpenSuSe
Posts: 252
Original Poster
Rep:
Quote:
Originally Posted by tajamari
Ian,
I used find and then executed my rsync script, but it does not recursively transfer the directories and their subdirectories; it only transfers the located files into the directory I defined.
Below are my scripts running on the primary data server. My problem is that it does not transfer the subfolders under /home/data/accounts/state to /home/data/accounts/state on the backup machine; only files are transferred. Any help, please?
Distribution: Solaris 9 & 10, Mac OS X, Ubuntu Server
Posts: 1,197
Rep:
I'm having a little trouble with your conceptual logic. You find anything that was last modified 60 days or more ago, and then you want to recursively copy the stuff below that; but the find would already include everything below it. In addition, some of the items in subfolders may have been modified more recently. They would affect the mtime of their own subfolder, but not of the folders higher up. So . . .
I've done somewhat similar things using cpio. Mine are incrementals, but you could change it to oldamentals ;-) Anyway, here is a segment of my script:
Code:
case "$DAY" in
    Mon)
        # delete previous contents and then do incrementals back to Friday
        rm -r "${ADIR}"; mkdir "${ADIR}"
        find . -mtime -3 | cpio -oa 2>/dev/null | ( cd "${ADIR}" && cpio -imd );;
    Tue|Wed|Thu)
        # delete previous contents and then do incrementals
        rm -r "${ADIR}"; mkdir "${ADIR}"
        find . -mtime -1 | cpio -oa 2>/dev/null | ( cd "${ADIR}" && cpio -imd );;
    Fri)
        # delete previous contents and then do full copies
        rm -r "${ADIR}"; mkdir "${ADIR}"
        find . | cpio -oa 2>/dev/null | ( cd "${ADIR}" && cpio -imd );;
esac
Of course, ADIR is the daily directory inside the archive directory. Then you can rsync from there to a remote system if you want. Or, what I do, have the script just tar up the daily directory, gzip it, and scp it.
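The "tar up the daily directory, gzip it, and scp it" step could look something like this (the paths, function name, and remote host are placeholders, not from the thread; the scp line is left commented because it needs a real host):

```shell
#!/bin/sh
# Sketch: bundle and compress one daily directory, ready for shipping off-site.
archive_daily() {
    adir=$1  # parent archive directory
    day=$2   # name of the daily subdirectory, e.g. Mon
    # Bundle the daily directory and compress it into one file
    tar -C "$adir" -cf - "$day" | gzip > "$adir/$day.tar.gz"
    # Then ship it off-site, e.g. (hypothetical host and path):
    # scp "$adir/$day.tar.gz" backup@remote.example.com:/backups/
}
```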
Below are my scripts running on the primary data server. My problem is that it does not transfer the subfolders under /home/data/accounts/state to /home/data/accounts/state on the backup machine; only files are transferred.
This happens because the find command outputs filenames only, and you give rsync no information about where to put the files other than /home/data/accounts/state.
First, I suggest creating on the remote machine the same directory structure as on the local machine. This can be done with rsync in this way:
Code:
rsync -av --include '*/' --exclude '*' -e "ssh -p10022" /home/data/accounts/state/ testaccount@192.168.10.15:/home/data/accounts/state
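A quick local demonstration of the trick, using temporary paths (for illustration only): with `--include '*/' --exclude '*'`, rsync recreates the directory skeleton but copies no files.

```shell
#!/bin/sh
# Demonstrate that --include '*/' --exclude '*' copies only directories.
src=$(mktemp -d); dst=$(mktemp -d)
mkdir -p "$src/a/b"
echo x > "$src/a/file.txt"
rsync -av --include '*/' --exclude '*' "$src"/ "$dst"/
# "$dst/a/b" now exists; "$dst/a/file.txt" does not
```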
Second, you can parse the output of the find command to retrieve the directory name (path) and tell rsync to put every single file into its own directory.
If you don't want to copy the whole directory tree, but only the relevant directories (that is, only the locations of the files you want to back up), you can simply create them via ssh first. Using your specifications, I would do something like:
Code:
#!/bin/bash
DAYS=60
DIR=/home/data/accounts/state

cd "$DIR" || exit 1
find . -depth -mindepth 1 -mtime +"$DAYS" | while IFS= read -r file
do
    backup_dir=$(dirname "$file")
    # create the matching directory on the remote side, then copy into it
    ssh -p10022 testaccount@192.168.10.15 mkdir -p "$DIR/$backup_dir"
    rsync --stats -auvz -e "ssh -p10022" --delete --times -og "$file" testaccount@192.168.10.15:"$DIR/$backup_dir"
done
Hope this helps. For obvious reasons I have not tested all the options to the find and rsync commands, so - please - double check and test with dummy files!
Distribution: Red Hat CentOS Ubuntu FreeBSD OpenSuSe
Posts: 252
Original Poster
Rep:
Quote:
Originally Posted by colucix
Thanks guys for your help... I'll give you feedback once I get results.
For those people who are still looking for a way to implement an rsync restricted to files newer/older than a specific date, as I was, I summarize here the 3 simple steps I finally used.
1) find ./ -daystart -mtime +N > myfile
i.e., write down everything that is older than N days
2) sed 's|^\./||' myfile > EXCL_list
i.e., strip off the unneeded ./ prefix
3) rsync -Pav --exclude-from=EXCL_list --delete-excluded source/ dest
i.e., apply the restricted rsync, as requested