Linux - Newbie
This Linux forum is for members that are new to Linux. Just starting out and have a question? If it is not in the man pages or the how-tos, this is the place!
Hi
Can anyone help? I'm stuck trying to get rsync to delete files in my destination location when using a find command to select files from the last 90 days.
I am using find /source -mtime -90 to find the files I want, then trying to get rsync to sync them, including deletions in the destination folder.
I have tried many rsync commands: the --files-from switch, piping into rsync, redirecting into rsync, find -exec rsync {}, and so on.
I don't fully understand what your need is.
Do you want to copy files to the destination which are older than 90 days?
Or do you want to delete files in the destination older than 90 days?
When you want to delete, are those files still on the source?
Do you want to keep them on the source?
What I think you want to do is this:
All files older than 90 days which are on the destination must be deleted, regardless of whether they still exist on the source.
In that case I would recommend not using rsync. Rsync is for syncing, and what you want to do here is not syncing; rsync is not good at that.
A better approach might be to run a remote find command on the destination, deleting files older than 90 days. When you ssh into the destination you can pass a command to ssh; in this case it would be a find command.
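A sketch of that approach (user, desthost and /destination are placeholders; the -delete action and the 'N days ago' date syntax assume GNU findutils and coreutils):

```shell
# Over ssh the prune would look like this (placeholders: user, desthost, /destination):
#   ssh user@desthost 'find /destination -type f -mtime +90 -delete'

# The find itself can be tried locally first on a scratch directory:
demo=/tmp/lq_find_demo
rm -rf "$demo" && mkdir -p "$demo"
touch -d '100 days ago' "$demo/old.txt"   # GNU touch: back-date the mtime
touch "$demo/new.txt"
find "$demo" -type f -mtime +90 -delete   # removes only files older than 90 days
ls "$demo"
```

Dropping -delete from the find command prints the matching files instead, which is a safe way to check what would be removed before running it for real.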
Hi
Thanks for replying, let me try to explain better.
I have a source location on which files will be created, modified and deleted. I want to sync the last 90 days of this location to another location. I can pick up changes and additions, but I cannot get rsync to delete a file in the destination when it is deleted on the source.
The reason for this work is to keep the last 90 days of users' data in sync in another location.
I'm also looking into doing what he is describing. Did anyone find a solution for this? I have a source that I want to rsync to a destination, but I want the files on the destination to stay around for 90 days after they have been removed from the source. Additionally, I would accept an alternative solution which would accomplish the same goal.
This could be done with a daily backup script which performs an incremental rsync backup by using the --link-dest option, and deletes any backups older than 90 days. You don't need the find command or the rsync --delete option.
I am not sure this is exactly what you asked for, but here is what it will do for you: for each day in the desired 90-day history, you will be able to access the files as they existed at the time of that day's backup. The backups could be saved in date-coded directories, for example a directory called 2014-12-13 for today's backup, 2014-12-12 for yesterday's, and so on. Because the script uses rsync's --link-dest option, files which did not change are hardlinked, which saves disk space: the same file is stored only once, but it appears in multiple daily backup directories through the hardlinks.
If this sounds like what you need, let me know, and I could give you more details about how to write the script.