02-28-2017, 03:21 AM | #16
Member | Registered: May 2016 | Posts: 222
Quote:
Originally Posted by Entropy1024
Then each month I will burn a DVD of this data from the backup folders.
However, over time the backup folders will start to bloat with copies of old files that have been deleted from the master folders. What I would like to do is set up some command to do the following:
Delete any files in the backup folders that have not existed in the master folders for the last 3 months.

I guess I would use the easiest solution and add the --delete option to rsync *after* I have written the backup to the DVD.
That is not what you asked for, but I think you will always have backups of the corresponding files/folders on the DVD.
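For illustration, a minimal sketch of that approach (paths are hypothetical): once the monthly DVD has been burned from the backup folder, let rsync mirror the master into it and prune anything deleted upstream.
Code:
# hypothetical paths; run only after the monthly DVD has been burned.
# add -n (dry run) first to preview what would be removed from the backup folder.
rsync -a --delete /srv/master/ /srv/backup/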
My wild idea was to use something like
Code:
diff_list=$(diff -r source backup)
and then "find -mtime -31 -exec rm {} +" with the diff_list variable somehow, but that seems way too complicated; there must be an easier solution.
Pretty sure that ain't a good idea.
02-28-2017, 09:44 AM | #17
LQ Guru | Registered: Nov 2010 | Location: Colorado | Distribution: OpenSUSE, CentOS | Posts: 5,573
Quote:
Originally Posted by daleb
Thanks for your response. This is the output I get:
# du -sh 2017-02-14_18-26
19G 2017-02-14_18-26
# du -sh 2017-02-19_15-02
19G 2017-02-19_15-02
# du -sh 2017-02-27_08-41
19G 2017-02-27_08-41
# du -sh 2017-02-14_18-26 2017-02-19_15-02 2017-02-27_08-41
19G 2017-02-14_18-26
19G 2017-02-19_15-02
19G 2017-02-27_08-41
From your reply, am I to understand that these backups really are 19G x 3 = 57G total? Or am I misreading it? Again, thanks for your time.
It looks like your backups really are independent, which means the --link-dest flag isn't working as expected. I'd take a look inside the script to see what things are being set to and where it's going wrong.
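One quick way to confirm whether two snapshots really share data (a sketch; the file path is hypothetical, the snapshot names follow the du output above) is to compare the inode numbers of the same unchanged file in both snapshots: identical inodes mean --link-dest created hard links, different inodes mean independent full copies.
Code:
# first column is the inode number; identical numbers = hard-linked,
# different numbers = separate full copies (file path is hypothetical)
ls -li 2017-02-19_15-02/some/unchanged.file 2017-02-27_08-41/some/unchanged.file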
03-03-2017, 11:31 AM | #18
Member | Registered: Aug 2015 | Distribution: Ubuntu 22.04 LTS | Posts: 240
Quote:
Originally Posted by daleb
Thanks for your response. This is the output I get:
# du -sh 2017-02-14_18-26
19G 2017-02-14_18-26
# du -sh 2017-02-19_15-02
19G 2017-02-19_15-02
# du -sh 2017-02-27_08-41
19G 2017-02-27_08-41
# du -sh 2017-02-14_18-26 2017-02-19_15-02 2017-02-27_08-41
19G 2017-02-14_18-26
19G 2017-02-19_15-02
19G 2017-02-27_08-41
From your reply, am I to understand that these backups really are 19G x 3 = 57G total? Or am I misreading it? Again, thanks for your time.
Can you post your complete rsync command here? If the "--link-dest" path is relative, it's interpreted relative to the destination directory, IIRC.
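For comparison, this is the bare-bones shape the rsync call should end up with (paths hypothetical); using an absolute path for --link-dest avoids the relative-path ambiguity entirely.
Code:
# hard-link unchanged files against the previous snapshot (hypothetical paths)
rsync -a --link-dest=/backups/2017-02-19_15-02 /data/ /backups/2017-02-27_08-41/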
03-04-2017, 12:44 PM | #19
LQ Newbie | Registered: Dec 2005 | Posts: 5
Solved
Thanks for the responses. I verified that all the "SOURCE" and "TARGET" locations were correct. I enabled all the on-screen comments and added a few of my own, formatted my external TARGET drive, and made an initial backup. Then I opened a short file on my computer, made a minor change, and backed up the system again. The comments showed that the whole script was running, including rsync, and the rsync output showed that only the one file I changed was modified in the new backup. Using du -sh, the size of both backups showed 18.9G, but the second backup now shows a size of 31M, so you were right, suicidaleggroll, it seems to be working now. Unfortunately, I'm not sure what changed, since all I did was add some comments.
Anyway, I can verify that the script works for me now.
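That size pattern is what hard-linked snapshots look like: du counts hard-linked data only once per invocation, so when several snapshots are listed together the first is charged the full size and later ones show only new or changed data. A quick way to see it (the first directory name follows the earlier examples; the second is hypothetical):
Code:
# the first snapshot listed absorbs the shared data; later ones show only what changed
du -sh 2017-02-27_08-41 2017-03-04_12-40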
03-07-2017, 07:25 AM | #20
Senior Member | Registered: Jun 2015 | Location: Tucson, AZ USA | Distribution: Fedora Kinoite | Posts: 1,197
I can't understand why some here say rsync is not a backup. It works fine for me if you use it correctly. This is what I'm using; I found it on Google and modified it for my purposes. Currently I'm backing up to a server and planning to add a NAS in the near future. In my case a cron job runs it every 12 hours on my laptop, and it keeps however many days of backups I choose. Currently most are 15 days, some are 30, depending on the folders I want to back up.
Code:
#!/bin/bash
# tadaen sylvermane | jason gibson
# last edit date - 2016/9/20
# version 1.0
# usage: script.sh <source folder> <remote base path> <days to keep>
##### variables #####
NOW=$(date +%Y.%m.%d.%H.%M)              # timestamp for this snapshot
RSYNCOPT=auz                             # /usr/bin/rsync options (archive, update, compress)
SERVER=failbox                           # server hostname or ip
FOLDER=$(basename "$1")                  # name of the folder being backed up
BACKUPPATH="$USER"/"$HOSTNAME"/"$FOLDER"
##### begin script #####
if ping -c 1 "$SERVER" ; then                    # only run if the server is reachable
    if [[ -e "$1" ]] ; then                      # and the source actually exists
        # create the destination tree on the server if it isn't there yet
        if ssh "$USER"@"$SERVER" "[[ ! -e ${2}/${BACKUPPATH} ]]" ; then
            ssh "$USER"@"$SERVER" "mkdir -p ${2}/${BACKUPPATH}"
        fi
        if ssh "$USER"@"$SERVER" "[[ -e ${2}/${BACKUPPATH}/current ]]" ; then
            # incremental run: hard-link unchanged files against the previous snapshot
            rsync -"$RSYNCOPT" --link-dest="$2"/"$BACKUPPATH"/current "$1" "$USER"@"$SERVER":"$2"/"$BACKUPPATH"/"$FOLDER"."$NOW"
            ssh "$USER"@"$SERVER" "rm -f ${2}/${BACKUPPATH}/current"
        else
            # first run: full copy
            rsync -"$RSYNCOPT" "$1" "$USER"@"$SERVER":"$2"/"$BACKUPPATH"/"$FOLDER"."$NOW"
        fi
        # point "current" at the snapshot just written
        ssh "$USER"@"$SERVER" "ln -s ${2}/${BACKUPPATH}/${FOLDER}.${NOW} ${2}/${BACKUPPATH}/current"
        # prune snapshot directories older than $3 days
        ssh "$USER"@"$SERVER" "find ${2}/${BACKUPPATH}/ -maxdepth 1 -type d -mtime +${3} -exec rm -rf {} \;"
    fi
fi
##### end script #####
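As a sketch of how it might be scheduled (the script path and arguments here are hypothetical, but they match the positional parameters used above: source folder, remote base path, days to keep):
Code:
# run every 12 hours: back up ~/Documents to /backups on the server, keep 15 days
0 */12 * * * /home/user/bin/backup.sh /home/user/Documents /backups 15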
Last edited by jmgibson1981; 03-07-2017 at 07:27 AM.
03-07-2017, 10:40 AM | #21
LQ Newbie | Registered: Feb 2017 | Posts: 23
With respect to backups/deleted files, why not just use some kind of network recycling bin?
03-07-2017, 11:12 AM | #22
LQ Guru | Registered: Nov 2010 | Location: Colorado | Distribution: OpenSUSE, CentOS | Posts: 5,573
Quote:
Originally Posted by oguruma
With respect to backups/deleted files, why not just use some kind of network recycling bin?

Because that's just as complicated to set up, if not more so, and it only protects you from accidentally deleted files. An incremental backup system like this protects you from accidentally deleted files, intentionally deleted files that you later change your mind about (which you might have already "emptied" out of the recycling bin), changed files whose old versions you need to recover, corruption, viruses, and many other problems.
03-07-2017, 07:08 PM | #23
LQ Guru | Registered: Apr 2010 | Location: Continental USA | Distribution: Debian, Ubuntu, RedHat, DSL, Puppy, CentOS, Knoppix, Mint-DE, Sparky, VSIDO, tinycore, Q4OS, Manjaro | Posts: 6,008
Quote:
Originally Posted by jmgibson1981
I can't understand why some here say rsync is not a backup. It works fine for me if you use it correctly. This is what I'm using; I found it on Google and modified it for my purposes. Currently I'm backing up to a server and planning to add a NAS in the near future. In my case a cron job runs it every 12 hours on my laptop, and it keeps however many days of backups I choose. Currently most are 15 days, some are 30, depending on the folders I want to back up.

I can tell you why they say that. They say that rsync is not a backup program because rsync is not a backup program.
Rsync is a file/folder sync program. You can use it as a tool to build a backup system, but it is not by itself ideal for backups. Programs that ARE backup programs, and that CAN BE ideal for that purpose, include BURP, AMANDA, ADSM (TSM Backup), BACULA, Backup Ninja, BackupPC, BackupBox, etc.
Some of those use the rsync library's algorithms as a tool to make backups and network traffic more efficient, but they layer on family-tree scheduling and generations, point-in-time restore, retention rules, in some cases backup image duplication for offsite storage, and other features of TRUE backup programs.
This is not to say that you cannot take archive tools (such as tar) and file copy/sync tools (such as rsync) and build your own primitive backup system. You can. But you will not get as robust a backup solution as you would from something built and perfected for backup purposes by a team dedicated to that goal. I have built and used systems in such a way, and they have saved my bacon. When I have the resources and opportunity, I prefer to use better backup and recovery tools in production environments.
If your cobbled-together system works for you, that is all that needs to be said. If a business depends upon your backups, you might want to add another layer of protection using something that was created for that purpose. It all depends upon your risk level and risk analysis.
But ...
Don't be surprised if, when you mention that you do 'rsync backups', you get people telling you that 'rsync is not backup', as happened here. They are right. That does not mean that you are wrong.
03-07-2017, 09:00 PM | #24
Member | Registered: Apr 2013 | Location: Massachusetts | Distribution: Debian | Posts: 529
Quote:
Originally Posted by daleb
I've come late to this thread, but I have a question. I use the original B Burgess script, which works OK. I tried the revised script using --link-dest, but rsync always produces multiple full backups instead of incremental backups. Am I missing something?

Quote:
Originally Posted by daleb
Thanks for the responses. I verified that all the "SOURCE" and "TARGET" locations were correct. I enabled all the on-screen comments and added a few of my own, formatted my external TARGET drive, ... it seems to be working now. Unfortunately, I'm not sure what changed, since all I did was add some comments.

You also formatted the target drive, and that could make the difference. --link-dest only works when the target filesystem supports hard links. If the drive was originally formatted with some type of FAT filesystem, for example, hard links are not available and --link-dest would be ignored. I'm glad it's working now.
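If you want to confirm that, a quick check of the target's filesystem type (the mount point here is hypothetical) will tell you whether hard links are even possible:
Code:
# FAT-family filesystems (vfat/exfat) have no hard links; ext4/xfs/btrfs do
df -T /mnt/backup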
03-09-2017, 11:08 AM | #25
Member | Registered: Sep 2015 | Distribution: Debian | Posts: 297
Gotta Be An Easier Way
There's gotta be a better way to go about this. I searched the web for FOSS backup utilities a few months ago, and I found a few, including DejaDup.
Admittedly, Linux is just a secondary operating system for me (for now), so I haven't really done a whole lot of backups. And I've probably been spoiled by Apple's easy-to-use Time Machine backup utility.
Update: Looks like I was right: http://alternativeto.net/software/back-in-time/
Last edited by Mr. Macintosh; 03-09-2017 at 11:23 AM.
Reason: Added link to Back In Time
03-09-2017, 04:15 PM | #26
Moderator | Registered: Mar 2008 | Posts: 22,177
To clarify the thread: the OP posted on 10-25-15, 08:52 AM. At some point a similar discussion was attached to this thread. The issue is still an rsync script.
daleb reports his issue is solved.