Backing up and Archiving within Server
I have been trying to find an application that allows the backing up and archiving of a directory (/server). I have been looking at rsync and tar, but I don't think they do what I would like to have.
Basically this directory is shared over Samba, and I would like to use cron to create hourly incremental backups and monthly archives. The /Backup and /Archive directories are also shared over Samba, but read-only.
I am looking at an hourly backup of any file that has changed in the past hour. Ideally the "old" versions would be moved into a dated directory and the new files copied into /backup/current, so that /backup/current is always a complete, most recent backup, with the older versions stored under directories like /backup/day/hour.
I would also like to run a process at the end of each day (say 11pm) whereby the last hourly backup is moved into the /backup/day folder, and all of the ./hour folders are removed.
Then at the end of each week the ./day folders are purged in the same way so that I end up with the following in the /backup folder
You get what I mean!
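Something along these lines is what I have in mind for the hourly part. This is a rough, untested sketch using rsync's --link-dest option, which hard-links files that have not changed since the last run, so each hourly directory is a full tree but costs almost no extra space; the snapshot() helper name and the exact paths are just placeholders:

```shell
#!/bin/sh
# Sketch only: hourly hard-link snapshots with rsync --link-dest.
# The helper name snapshot() and the day/hour layout are assumptions,
# not a finished tool.

snapshot() {                    # snapshot SRC DEST STAMP
    src=$1; dest=$2; stamp=$3
    mkdir -p "$dest/$stamp"
    # Copy only what changed since the last run; unchanged files become
    # hard links into the previous snapshot pointed at by "current".
    rsync -a --delete --link-dest="$dest/current" "$src/" "$dest/$stamp/"
    # Repoint "current" at the newest snapshot.
    ln -sfn "$dest/$stamp" "$dest/current"
}

# Example crontab entry, run at the top of every hour
# (% must be escaped inside crontab):
#   0 * * * *  /usr/local/bin/snapshot.sh /server /backup "$(date +\%d/\%H)"
```

On the first run rsync just warns that the --link-dest directory does not exist and does a plain copy; after that, /backup/current is always the latest complete tree.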
I would also like to run a monthly process that moves any directory within /server in which no file has been modified within the past 6 months to /archive. If a directory of the same name already exists in /archive it must not be overwritten; the moved directory should be renamed instead!
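For the monthly archive pass, something like this untested sketch is what I am picturing (the archive_stale() helper name is made up; it skips any directory containing a recently modified file, and renames with a numeric suffix instead of overwriting):

```shell
#!/bin/sh
# Sketch only: move top-level directories under SRC whose files are all
# older than AGE_DAYS into DEST, never overwriting an existing name.

archive_stale() {               # archive_stale SRC DEST AGE_DAYS
    src=$1; dest=$2; age=$3
    for dir in "$src"/*/; do
        [ -d "$dir" ] || continue
        # Skip the directory if any file in it was modified recently.
        if [ -n "$(find "$dir" -type f -mtime -"$age" -print -quit)" ]; then
            continue
        fi
        name=$(basename "$dir")
        target="$dest/$name"
        n=1
        # Never overwrite: append a suffix until the name is free.
        while [ -e "$target" ]; do
            target="$dest/$name.$n"
            n=$((n + 1))
        done
        mv "$dir" "$target"
    done
}

# Example crontab entry, 1st of each month at 2am:
#   0 2 1 * *  /usr/local/bin/archive_stale.sh /server /archive 180
```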
The /archive and /backup directory are the mount points of two separate removable drives. The Backup drive is to be removed from site each night, whilst the Archive drive is to be replaced when it is full.
Also, what program can I set-up to send notifications of drive capacity?
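Even something as simple as a df check in cron might do for this. A rough sketch (the check_capacity() helper name is made up; mailing the result assumes a working local MTA plus a mail client such as the mailx package):

```shell
#!/bin/sh
# Sketch only: report when a mount point's usage crosses a threshold.

check_capacity() {              # check_capacity MOUNTPOINT THRESHOLD_PCT
    mount=$1; limit=$2
    # -P forces one-line-per-filesystem output; column 5 is Use%.
    used=$(df -P "$mount" | awk 'NR==2 { sub(/%/, "", $5); print $5 }')
    if [ "$used" -ge "$limit" ]; then
        echo "$mount is ${used}% full (threshold ${limit}%)"
        return 1
    fi
    return 0
}

# Example daily crontab entry, mailing root when /backup passes 90%:
#   0 8 * * *  /usr/local/bin/check_capacity.sh /backup 90 \
#              || mail -s "disk alert" root < /dev/null
```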
I am running an Ubuntu Server 8.10 machine with no GUI. If I have to install one I will, but I'd prefer to leave it without the GUI as it is strictly a file server, mainly for photos.
Do you think somebody will write a script for such a big task?
Do you think your HDD is fast enough?
Have you ever tried to copy, say, 50GB of files ranging from 100KB to 10MB in size? I think your PC would need more than an hour for that.
Hourly backups of photos... ehhh hmmm, what happens when somebody tries to write a file at the same moment your backup script tries to zip it? You will have shitty files in your backup...
Next thing: what if at 9 o'clock you make a mistake in your photo work (the script will put it into the incremental backup) and the next day you want to restore it? Do you think you can restore the correct data then?
I don't think so...
You work with a lot of photos (I don't think it's just 2 ;)) and you want an incremental HOURLY backup? Please... :scratch:
Second thing: what kind of files do you want to back up?
Some picture files, such as JPGs, you cannot usefully compress - they are already compressed, so it's a waste of time. For faster transfer between disks you can create a tar without compression. For TIFF, RAW, PDF, PSD and TGA files you can use zip compression.
In my opinion the best solution for you is http://www.alienbrain.com/
Have you ever tried working with a multimedia repository?
Tar doesn't support multicore processors, so compressing TIFFs, TARs and RAWs will take hours... not MINUTES!
You can also try Subversion to keep binary files, but I don't think it is good for a multimedia database...
Think once again about this hourly backup ;) to me it's a funny idea :D WHY hourly? It's much better to keep a work history, for example:
than to work with one file and incremental backups :D
Sorry for any confusion - I want incremental backups every hour, not a full backup - I would like to have one directory that is the most recent version of the backup with the historical increments behind it! But if I can't then no problem, as long as I have incremental backups on a drive ready to take home!
I understand that I could just do as per your example below, but I am trying to avoid any accidental overwrites/deletions.
I had a look at Alienbrain as you suggested - $12,000 US for the starter pack!!!?!?!??!
I don't have that sort of cash, and I just need a simple backup and archive program.
Surely most people with a file server do this sort of archiving and backup? I can't be the first to look at it, can I?!?!
I am not sure if you have looked into LVM (Logical Volume Manager) in Linux. I think this would be better done by taking snapshots of the file system you want to back up. Essentially you are only backing up changed data blocks this way, and taking the snapshot is instant (no huge load spike when the backup runs).
Here is something to get started:
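A rough sketch of the idea, which also answers the earlier worry about files being written mid-backup, since you copy from a frozen snapshot. This needs root and assumes /server lives on an LVM logical volume; the volume group "vg0" and LV name "server" are made-up examples:

```shell
# Sketch only - adjust names to your own volume group / LV.
lvcreate --size 5G --snapshot --name server-snap /dev/vg0/server
mkdir -p /mnt/server-snap
mount -o ro /dev/vg0/server-snap /mnt/server-snap
# Back up from the frozen snapshot so open files can't change mid-copy,
# then throw the snapshot away:
rsync -a /mnt/server-snap/ /backup/current/
umount /mnt/server-snap
lvremove -f /dev/vg0/server-snap
```

The --size given to lvcreate is how much changed data the snapshot can absorb while it exists, not the size of the file system.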