LinuxQuestions.org (/questions/)
-   Linux - Software (https://www.linuxquestions.org/questions/linux-software-2/)
-   -   rsync and cp -al for staggered backups (https://www.linuxquestions.org/questions/linux-software-2/rsync-and-cp-al-for-staggered-backups-332926/)

babyphil 06-12-2005 10:58 PM

rsync and cp -al for staggered backups
 
I have been given the task of setting up rsync to back up a server. Fortunately, all I have to do is set up rsync to back up one hard drive to another rather than actually 'remote' syncing.
I had to make a new backup every day (incremental) and keep the data for up to 3 days, so after reading up on cp -al and rsync I wrote a little script based on one I found online.

Mine reads as such:

#!/bin/sh
# Rotate the snapshots: the oldest one is removed for good
rm -fr /mnt/backups/backup.threedaysold
mv /mnt/backups/backup.twodaysold /mnt/backups/backup.threedaysold
mv /mnt/backups/backup.yesterday /mnt/backups/backup.twodaysold
# Hard-link today's snapshot into yesterday's (takes almost no extra space)
cp -al /mnt/backups/backup.today /mnt/backups/backup.yesterday
# Sync the live filesystem into today's snapshot, excluding the backup dir itself
rsync -av --delete --exclude "/mnt/backups" / /mnt/backups/backup.today

As you can see, the root (/) dir is backed up into backup.today, which is then hard-linked to backup.yesterday, which is moved to backup.twodaysold, and so on.
I ran it three times when I had the chance, and it 'worked'. The first time it took a very, very long time, and afterwards I saw a copied filesystem under /mnt/backups/backup.today. I ran it twice more; both runs were significantly faster than the first, but still took a while as they were backing up 12+ gigs of data. I checked the file sizes/space used under /mnt/ and it said 35 gigs.

I then realized that I was counting the links as if they were separate files, so I ran du, which said I had 13 gigs used. So it was working just as I hoped... except the fact that I was using links made me wonder whether the files I have backed up are actually there. What if I had a file two days ago, accidentally deleted it yesterday, and realized I needed it today? Will going to the backup folder from two days ago actually give me the file, or will I just find a hard link to nothing?
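(For what it's worth, the du-versus-added-up-sizes discrepancy is expected: du notices when two names point at the same inode and counts the space only once, while summing per-file sizes counts every name. A quick way to see this, using throwaway paths:)

```shell
tmp=$(mktemp -d)
dd if=/dev/zero of="$tmp/big" bs=1M count=10 2>/dev/null   # a 10 MB file
ln "$tmp/big" "$tmp/copy"                                  # second name, same inode
ls -l "$tmp"        # each name shows ~10 MB
du -sh "$tmp"       # du counts the shared blocks once: roughly 10M total
rm -r "$tmp"
```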
Excuse my ignorance; this is a completely CLI system, and though I am not a total n00b, I am not very adept at that interface (though I have made great strides lately). I am also quite ignorant when it comes to system administration.
Running the script enough times to test the system would take me hours, my headache at trying to completely understand what I am doing is killing me... plus I don't have access to the machine in question at the moment.
So could someone enlighten me as to what I am doing, and what (if anything) I can do to make the system better?

Kahless 06-13-2005 02:53 AM

Understanding the difference between a hard link and a soft link might help :)

A soft link is like a web link... if the file goes away, the link is "broken".

Hard links are a little different. You can't "break" them. If you have a hard link to a file and you delete the file, you don't actually lose that file unless you delete the link too. It's like having another copy of the file, without using any more disk space :)
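You can see this for yourself from a shell (the paths here are just throwaway examples):

```shell
tmp=$(mktemp -d)
echo "important data" > "$tmp/original"
ln "$tmp/original" "$tmp/backup"   # hard link: a second name for the same inode
rm "$tmp/original"                 # delete the original name
cat "$tmp/backup"                  # prints: important data
rm -r "$tmp"
```

The data on disk is only freed once every name pointing at it is gone.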


There may be a better way to do what you're doing, but at least if you have hard links, you won't be losing the file completely.

The only downside is that changing the file in place changes what both names see, so if you trash it in a text editor, you might have issues.
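That said, the snapshot scheme above should still be safe in practice: as far as I know, rsync by default writes an updated file to a temporary name and renames it into place rather than rewriting it in place, so the hard-linked copies in the older snapshots keep the old content. A quick sketch of the difference (throwaway paths):

```shell
tmp=$(mktemp -d)
echo "v1" > "$tmp/file"
ln "$tmp/file" "$tmp/snapshot"   # pretend this is yesterday's backup
echo "v2" > "$tmp/file"          # in-place write: both names change
cat "$tmp/snapshot"              # prints: v2
# Replacing the file (write new + rename) breaks the link instead:
echo "v3" > "$tmp/file.new"
mv "$tmp/file.new" "$tmp/file"
cat "$tmp/snapshot"              # still prints: v2
rm -r "$tmp"
```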

babyphil 06-14-2005 01:24 AM

Thank you, it all makes so much more sense now.
I updated the script and ran it several times on my laptop (much smaller filesystem), and it works perfectly; soon I will copy it over to the server.

