how to create daily incremental backups easily?
I've had several HDD crashes on my personal server over the years and it's just gotten to be a real pain in the rear. Crashed again this morning.
Currently, I make monthly tarball backups of the entire filesystem using my script: Code:
#!/bin/sh
Any free, simple solutions out there to run such automated backups to my secondary HDD? |
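The original script was truncated in the post above; a hypothetical sketch of what a monthly full-filesystem tarball script might look like (paths, excludes, and naming are assumptions, not the poster's actual script):

```shell
#!/bin/sh
# Hypothetical monthly full backup sketch -- not the original poster's script.
DEST=/backup/data
STAMP=$(date +%Y-%m-%d)
mkdir -p "$DEST"
# Exclude pseudo-filesystems and the backup destination itself,
# otherwise tar would try to archive its own output.
tar --exclude=/proc --exclude=/sys --exclude=/dev \
    --exclude="$DEST" \
    -czf "$DEST/full-$STAMP.tar.gz" /
```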
See previous post
|
9 out of 10 cat owners who were asked said their cats prefer rsync. Me, I like Bacula with crunchy fish bits but it's not what any sane puss would call simple.
|
rsync...simple and really effective
|
+1 for rsync
Kind regards |
Any suggestions for the rsync thing?
I'd like to back up the following to my /backup/ hard drive every 24 hours:
1) /var/www/
2) /var/vmail/
3) /var/lib/mysql
Not sure how it'd work out... |
+1 for rsync. Also
Code:
#!/bin/sh
|
Quote:
Then create a script to back up your directories. Make sure the script works, then create a cronjob for it. Some info:
http://www.thegeekstuff.com/2010/09/...mand-examples/
http://www.cyberciti.biz/tips/tag/rsync-examples
Kind regards |
I like rsync too, and use it regularly. However, the original post did say incremental. That would be rdiff-backup, which uses librsync. See http://www.nongnu.org/rdiff-backup/, or, for the general concept and to roll your own with rsync, see http://www.mikerubel.org/computers/rsync_snapshots/.
I would also note that in the original configuration, rather than deleting all the old tarballs with the *, you could use `find /backup/data/ -name '*.tar.gz' -mtime +7 -exec rm {} \;`. That would remove any tarballs older than 7 days. Of course, make sure you have enough space for the extras, since they are all full backups.

You can do incrementals with GNU tar, but then you're getting more complicated (scripting which days to run fulls, incrementals on the others, removing the older ones, etc.), and you might as well use something off the shelf.

As far as cat owners go, I like Amanda ;) , and it's not too complicated to get running (there is a quick start with backup to disk on the wiki). But both Amanda and Bacula come into their own in networked environments with multiple machines being backed up. For a single computer, there are many simpler solutions. |
I've tried using the find command above from choogendyk.
It doesn't work. Code:
my:/backup# find /backup/databases/ -name '*.sql' -mtime +30 -exec rm {} \
Code:
my:/backup/databases# find /backup/databases/ -name '*.sql' -mtime +30 -exec rm {}
It'd be very useful to be able to remove backup files, whether tar.gz or SQL dumps (not incremental), after so many days... |
Try putting a ; on the end. :)
Code:
my:/backup# find /backup/databases/ -name '*.sql' -mtime +30 -exec rm {} \; |
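As an aside, GNU find also has a `-delete` action, which sidesteps the `\;` quoting pitfall entirely (assuming GNU findutils, as on most Linux distros):

```shell
# With GNU find, -delete removes matching files with no -exec quoting.
# Note: -delete must come AFTER the tests, or it deletes everything.
find /backup/databases/ -name '*.sql' -mtime +30 -delete
```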
Ah yes, correct you are! Thanks for catching that...
|