Old 10-25-2015, 07:52 AM   #1
Entropy1024
Member
 
Registered: Dec 2012
Location: UK
Distribution: Ubuntu 16 & 17
Posts: 131

Rep: Reputation: Disabled
Backup using rsync


I have got rsync working to back up my Documents, Videos & Pictures folders to a NAS drive using the following:
Code:
rsync -av masterfolder backupfolder
Works very well. If something is deleted from one of the master folders then the backup copy is NOT deleted, which is a good thing in case of accidental deletion: I can always recover the file. Then each month I will burn a DVD of this data from the backup folders.

However over time the backup folders will start to bloat with copies of old files that have been deleted from the master folders. What I would like to do is set up some command to do the following:

Delete any files in the backup folders that no longer exist in the master folders and have been gone for at least 3 months.

Hope that makes sense.

That way it should keep the size of the backup folder to a sensible amount, and the files deleted will have been backed up to at least two DVDs for archiving.


Does anybody know how I might go about achieving this via the command line?

Many thanks
Tim
 
Old 10-25-2015, 08:17 AM   #2
wpeckham
LQ Guru
 
Registered: Apr 2010
Location: Continental USA
Distribution: Debian, Ubuntu, RedHat, DSL, Puppy, CentOS, Knoppix, Mint-DE, Sparky, VSIDO, tinycore, Q4OS, Manjaro
Posts: 6,006

Rep: Reputation: 2840
rsync is NOT a backup

rsync is a wonderful tool; I use it often. It is NOT a backup system.

That said, your use of it as a backup tool sounds valid enough, as long as you do not expect too much.

To clean up your area, wait until after you have burned a DVD and verified that DVD (or even written and validated TWO, just in case one goes bad), then re-run your rsync command with the --delete option. This will remove all files on the target that are missing from the source. Read the rsync man page for exact syntax and more detail and options.
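For example, using the same folder names as the OP's command, the cleanup pass would look something like this (a sketch only; check the man page before running it):
Code:
rsync -av --delete masterfolder backupfolder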

There are easy ways to remove all files older than 30 days, and easy ways to remove orphan files, but creating a process to do both at the same time would be challenging. Modifying your process and doing a single special rsync run with the new option is easy.
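As a sketch of just the age-based part (the backup path here is hypothetical; note this keys off modification time, not off when the file vanished from the master):
Code:
find /path/to/backupfolder -type f -mtime +30 -delete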
 
Old 10-25-2015, 08:36 AM   #3
JeremyBoden
Senior Member
 
Registered: Nov 2011
Location: London, UK
Distribution: Debian
Posts: 1,950

Rep: Reputation: 513
I use rsync in a similar way to the OP - with --delete; I only consider it as the equivalent of an unreliable copy.

However, I do use a reliable but simple backup tool,
Code:
backup2l
to back up to a local disk; rsync can then copy that backup reliably, at a time when it is not being written to.
 
Old 10-25-2015, 08:38 AM   #4
hortageno
Member
 
Registered: Aug 2015
Distribution: Ubuntu 22.04 LTS
Posts: 240

Rep: Reputation: 67
You could use rsync with the --delete and --link-dest options, hard linking the current backup to the previous backup. Then, before creating your backup DVD, delete all backup folders older than 30 days, loop through the remaining folders starting with the oldest, and copy each one into another folder, overwriting the previous version as you go. That folder is what you burn onto a DVD.
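A minimal sketch of the --link-dest idea, with hypothetical NAS paths (each run makes a dated snapshot that hard-links unchanged files to the previous one):
Code:
TODAY=$(date +%F)
# on the very first run "latest" will not exist yet; rsync just copies everything
rsync -a --delete --link-dest=/nas/backups/latest /home/user/Documents/ /nas/backups/$TODAY/
ln -sfn /nas/backups/$TODAY /nas/backups/latest   # repoint "latest" at the new snapshot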

BTW your backup strategy has a flaw. You catered for accidental file deletions, but not for files being accidentally overwritten. I also wouldn't trust DVDs for backups; they will degrade sooner or later.

Also, there are GUI tools available which use rsync under the hood. Personally I use backintime, which can do the things you want, like deleting backups older than 30 days, and more.
 
Old 10-25-2015, 12:23 PM   #5
bryanl
Member
 
Registered: Dec 2003
Posts: 97

Rep: Reputation: 35
There is a script by Brice Burgess that is useful for making snapshots for backup. I modified it a bit to use the rsync option --link-dest, so each new snapshot only has links to the previous snapshot for unchanged files. Here's what it looks like right now:

Code:
#!/bin/bash
# Author: Brice Burgess - bhb@iceburg.net
# multi_backup.sh -- backup to a local drive using rsync. 
#         Uses hard-link rotation to keep multiple backups.

# changed cp -al to rsync --link-dest (brl 12ja2010)

# Make sure only root can run our script 
# check internal effective user ID number
# re http://www.cyberciti.biz/tips/shell-root-user-check-script.html
if [[ $EUID -ne 0 ]]; then
   echo "This script must be run as root" 1>&2
   exit 1
fi

# Directories to back up. Separate with a space. Exclude trailing slash!
SOURCES="/home/user/Documents"

# Directory to backup to. This is where your backup(s) will be stored. No spaces in names!
# :: NOTICE :: -> Make sure this directory is empty or contains ONLY backups created by
#  this script and NOTHING else. Exclude trailing slash!
TARGET="/media/user/backup/snapshots"

# Set the number of backups to keep (greater than 1). Ensure you have adequate space.
ROTATIONS=8

# Your EXCLUDE_FILE tells rsync what NOT to backup. Leave it unchanged if you want
# to backup all files in your SOURCES. If performing a FULL SYSTEM BACKUP, ie.
# Your SOURCES is set to "/", you will need to make use of EXCLUDE_FILE.
# The file should contain directories and filenames, one per line.
# A good example would be:
# /proc
# /tmp
# *.SOME_KIND_OF_FILE
EXCLUDE_FILE="rsync-excludes.list"

# Uncomment the following line to enable verbose output
# VERBOSE="-v"

#######################################
########DO_NOT_EDIT_BELOW_THIS_POINT#########
#######################################

# Set name (date) of backup. 
BACKUP_DATE="$(date +%F_%H-%M)"
# echo "backup datestamp $BACKUP_DATE"

if [ ! -x $TARGET ]; then
  echo "Backup target does not exist or you don't have permission!"
  echo "Exiting..."
  exit 2
fi

if [ ! $ROTATIONS -gt 1 ]; then
  echo "You must set ROTATIONS to a number greater than 1!"
  echo "Exiting..."
  exit 2
fi

# find oldest and newest backup directories

BACKUP_NUMBER=1
NEWEST_BACKUP="/not/found"
OLDEST_BACKUP="/not/found"
for backup in `ls -dXr $TARGET/*/`; do
  if [ $BACKUP_NUMBER -eq 1 ]; then
   # echo "newest backup found $backup"
    NEWEST_BACKUP="$backup"
  fi

  if [ $BACKUP_NUMBER -eq $ROTATIONS ]; then
   # echo "oldest backup found $backup"
    OLDEST_BACKUP="$backup"
    # break removed: leave older remains alone (the cleanup loop below handles them)
  fi

  let "BACKUP_NUMBER+=1"
done

# create backup destination directory folder 
   mkdir $TARGET/$BACKUP_DATE

if [ ! -d $TARGET/$BACKUP_DATE ]; then
  echo "Backup destination not available. Make sure you have write permission in $TARGET!"
  echo "Exiting..."
  exit 2
fi 

echo "Verifying Sources..." 
for source in $SOURCES; do
  if [ ! -x $source ]; then
     echo "Error with $source!"
     echo "Directory either does not exist, or you do not have proper permissions."
     exit 2
   fi
done

if [ -f $EXCLUDE_FILE ]; then
  echo "using exclude file $EXCLUDE_FILE"
  EXCLUDE="--exclude-from=$EXCLUDE_FILE"
fi

for source in $SOURCES; do

  PREVIOUS=$NEWEST_BACKUP$source
  DEST=$TARGET/$BACKUP_DATE/$source

  if [ ! -d $DEST ]; then
    mkdir -p $DEST
  fi

  if [ -d $PREVIOUS ]; then
    echo "snapshot $source link $PREVIOUS to $DEST"
    rsync $VERBOSE --exclude=$TARGET/ $EXCLUDE -a --delete --link-dest=$PREVIOUS $source/ $DEST
  else
    echo "backup $source to $DEST"
    rsync $VERBOSE --exclude=$TARGET/ $EXCLUDE -a --delete $source/ $DEST
  fi

done

echo "deleting backups beyond rotations"

BACKUP_NUMBER=1
for backup in `ls -dXr $TARGET/*/`; do

  if [ $BACKUP_NUMBER -gt $ROTATIONS ]; then
    echo "removing $backup"
    rm -rf $backup
  fi

  let "BACKUP_NUMBER+=1"
done

exit 0
This works fairly well and can be easily modified to suit your particular needs.
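For unattended use, one possible approach (assuming the script is saved as, say, /usr/local/sbin/multi_backup.sh; the path and schedule are just placeholders) is a root cron entry:
Code:
# run nightly at 02:30 from root's crontab (crontab -e as root)
30 2 * * * /usr/local/sbin/multi_backup.sh >> /var/log/multi_backup.log 2>&1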
 
Old 10-25-2015, 09:31 PM   #6
frankbell
LQ Guru
 
Registered: Jan 2006
Location: Virginia, USA
Distribution: Slackware, Ubuntu MATE, Mageia, and whatever VMs I happen to be playing with
Posts: 19,665
Blog Entries: 28

Rep: Reputation: 6248
I use rsync in a way similar to the OP to back up my laptop, which is my primary computer for doing stuff, to my file server. I guess it would be more accurate to say that I do not "backup" the files; I "archive" them.

I specifically do not want old files deleted from the file server even if I have chosen to remove them from the laptop. From time to time, when the spirit moves, I go through the archive and delete those files I am certain I will no longer need.
 
Old 10-26-2015, 06:03 AM   #7
Entropy1024
Member
 
Registered: Dec 2012
Location: UK
Distribution: Ubuntu 16 & 17
Posts: 131

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by bryanl View Post
There is a script by Brice Burgess that is useful for making snapshots for backup. I modified it a bit to use the rsync option --link-dest, so each new snapshot only has links to the previous snapshot for unchanged files. Here's what it looks like right now:
This works fairly well and can be easily modified to suit your particular needs.
Wow, thanks. That's quite a bit of info to take in; however, I will certainly give it a go.
/me turns to bash script section of idiots guide to linux

Many thanks for the support. Much appreciated.
Tim
 
Old 10-26-2015, 06:16 AM   #8
Shadow_7
Senior Member
 
Registered: Feb 2003
Distribution: debian
Posts: 4,137
Blog Entries: 1

Rep: Reputation: 875
I tend to just "rsync -aRXv ./ /mnt/backupdrive" my entire system while booted to another install. And then delete all the cruft I don't want anymore. Although most times my backupdrive becomes my new boot drive and the old system becomes the archive. But I tend to run from SDHC cards or usb HDDs which I lose trust in after a certain period of usage. Plus the new shiny tends to be larger, faster, and cheaper.
 
Old 10-26-2015, 11:15 AM   #9
JeremyBoden
Senior Member
 
Registered: Nov 2011
Location: London, UK
Distribution: Debian
Posts: 1,950

Rep: Reputation: 513
If you were able to do a 3-way sync, i.e. take the current state, take the /mnt/backupdrive version, and output only the differences (without altering those first two inputs), then you could do differential backups (and save huge amounts of storage).

This is a possibility, since rsync can be run in a "pretend mode" where it just tells you what it would do, without actually doing it...
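That "pretend mode" is the -n / --dry-run flag; a quick example with hypothetical paths:
Code:
rsync -avn --delete /home/user/Documents/ /mnt/backupdrive/Documents/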
 
Old 10-26-2015, 03:17 PM   #10
bryanl
Member
 
Registered: Dec 2003
Posts: 97

Rep: Reputation: 35
re: "If you were able to do a 3-way sync" -- this is essentially what the --link-dest option does. It uses an existing copy to compare to the target to make a backup that contains links to the existing with only the changed files copied over.

It is this linking that allows the Burgess script to maintain successive backups (how many is defined by the ROTATIONS value) without making a full copy for each snapshot. That makes it feasible to keep a lot of snapshots without using a whole lot of space.

For what frankbell suggests, an archive, I run one set of snapshots on a quarterly basis or so and, on other media, the daily or weekly snapshots. For a storage archive, I collapse the snapshots by copying them to the archive as the 'spirit moves.'

As for files and such that I don't care to back up, the exclude file is for that, but looking at the script I am getting confused about how it ends up in the rsync command. I need to look into that again and see if I can figure out how it works. The rsync exclude option is quite potent.
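Used on its own, the option is simple enough; something like this (paths and patterns here are hypothetical):
Code:
# rsync-excludes.list holds one pattern per line, e.g. *.iso or /proc
rsync -a --exclude-from=rsync-excludes.list /home/user/Documents/ /nas/backups/current/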

The 'pretend mode' is a good debugging tool and it is needed with tools like rsync that can do so much. It is easy to trip over your shoelaces!
 
Old 10-27-2015, 06:49 AM   #11
wpeckham
LQ Guru
 
Registered: Apr 2010
Location: Continental USA
Distribution: Debian, Ubuntu, RedHat, DSL, Puppy, CentOS, Knoppix, Mint-DE, Sparky, VSIDO, tinycore, Q4OS, Manjaro
Posts: 6,006

Rep: Reputation: 2840
Consider other options that will do the job...

Check out this page https://wiki.archlinux.org/index.php...c-type_backups or others (Google is your friend) for simple, fast, easy, rsync-like true backup tools that support snapshots, incremental backups, syncing (a la rsync), de-duplication, or other useful features that you may value.

Your rsync clone system holds the most recent copy of those files, but you are not assured the BEST copy of any file. A true backup system can do so much more, with many of the advantages of rsync (or even calling rsync for part of the workload).

1. Ignore this advice if the most recent version of files is all you really need.
2. Try not to overcomplicate things. There are some wonderful backup tools out there that are terribly complex. A tool that is easier to use is also easier and faster to work with when restoring files after a catastrophic event!
 
Old 02-27-2017, 03:50 PM   #12
daleb
LQ Newbie
 
Registered: Dec 2005
Posts: 5

Rep: Reputation: 0
B Burgess rsync script

I've come late to this thread, but I have a question. I use the original B Burgess script, which works OK. I tried the revised script using --link-dest, but rsync always produces multiple full backups instead of incremental backups. Am I missing something?

Thanks for any help.

dale b
 
Old 02-27-2017, 04:28 PM   #13
suicidaleggroll
LQ Guru
 
Registered: Nov 2010
Location: Colorado
Distribution: OpenSUSE, CentOS
Posts: 5,573

Rep: Reputation: 2142
Are you sure? How do you know they're not incremental backups? The reason I ask is that at first glance they will each look complete; it's only once you start examining inodes or using du that you'll see they're incremental.

For example, see the following du output for an incremental backup I have running that uses rsync with --link-dest
Code:
# du -sh backup_20170221
6.6G    backup_20170221
#
# du -sh backup_20170222
6.6G    backup_20170222
#
# du -sh backup_20170221 backup_20170222
6.6G    backup_20170221
1.1M    backup_20170222
Run du separately on each one and they each appear to be 6.6G, but in reality all but 1.1M of that is shared through hard links; the total space taken up is basically the same 6.6G that each one claims to be.

If they truly are full backups, then you can use "set -xv" in the script or just start printing out variables to ensure that --link-dest is really pointing to the previous backup that rsync should link from.
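One quick way to confirm is to compare inode numbers (or hard-link counts) for the same file in two snapshots; a matching inode means there is only one copy on disk. The file name below is just a placeholder:
Code:
# identical inode numbers in the first column => the two paths share one copy
ls -li backup_20170221/somefile backup_20170222/somefile
stat -c '%h' backup_20170221/somefile    # number of hard links to that copy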
 
Old 02-27-2017, 08:53 PM   #14
daleb
LQ Newbie
 
Registered: Dec 2005
Posts: 5

Rep: Reputation: 0
Thanks for your response. This is the output I get:

Code:
# du -sh 2017-02-14_18-26
19G 2017-02-14_18-26
# du -sh 2017-02-19_15-02
19G 2017-02-19_15-02
# du -sh 2017-02-27_08-41
19G 2017-02-27_08-41
# du -sh 2017-02-14_18-26 2017-02-19_15-02 2017-02-27_08-41
19G 2017-02-14_18-26
19G 2017-02-19_15-02
19G 2017-02-27_08-41

From your reply, am I to understand that these backups are 19G x 3 = 57G total? Or am I misreading it? Again, thanks for your time.
 
Old 02-27-2017, 09:06 PM   #15
jailbait
LQ Guru
 
Registered: Feb 2003
Location: Virginia, USA
Distribution: Debian 12
Posts: 8,367

Rep: Reputation: 562
I use a slightly different approach to balancing getting rid of backup bloat with keeping deleted files long enough to recover them when the deletion was a mistake. I have a daily backup and a weekly backup using rsync and I use the rsync delete function. I keep three generations of both the daily backup and the weekly backup. So it takes three weeks for a deleted file to completely disappear from my backups and I keep less than three weeks of bloat.
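One way to sketch that kind of three-generation rotation in a script (a generic example rather than my exact setup; it uses hard-link copies to keep the generations cheap, and the paths are hypothetical):
Code:
#!/bin/bash
# keep three daily generations: daily.0 (newest) .. daily.2 (oldest)
BASE=/nas/backups/daily
rm -rf "$BASE.2"                                   # drop the oldest generation
[ -d "$BASE.1" ] && mv "$BASE.1" "$BASE.2"
[ -d "$BASE.0" ] && cp -al "$BASE.0" "$BASE.1"     # cheap hard-link copy
mkdir -p "$BASE.0"
rsync -a --delete /home/user/Documents/ "$BASE.0/" # refresh the newest generation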

--------------------------------
Steve Stites
 
  

