Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
11-29-2010, 07:57 AM | #1
Member | Registered: Jan 2004 | Posts: 59
how to create daily incremental backups easily?
I've had several HDD crashes on my personal server over the years and it's just gotten to be a real pain in the rear. Crashed again this morning.
Currently, I make monthly tarball backups of the entire filesystem using my script:
Code:
#!/bin/sh
# Removes the tarball from the previous execution.
rm -rf /backup/data/*.tar.gz
# Dates the new tarballs of current builds.
DATE=`date +%m_%d_%Y`
# Directory structure that is being tarballed.
# Have this be what is found in your "/" root
# excluding the /backup/ partition.
tar -pzcf /backup/data/bin.$DATE.tar.gz /bin/
tar -pzcf /backup/data/boot.$DATE.tar.gz /boot/
tar -pzcf /backup/data/dev.$DATE.tar.gz /dev/
tar -pzcf /backup/data/dist.$DATE.tar.gz /dist/
tar -pzcf /backup/data/etc.$DATE.tar.gz /etc/
tar -pzcf /backup/data/kernel.$DATE.tar.gz /kernel*
tar -pzcf /backup/data/lib.$DATE.tar.gz /lib
tar -pzcf /backup/data/root.$DATE.tar.gz /root/
tar -pzcf /backup/data/sbin.$DATE.tar.gz /sbin/
tar -pzcf /backup/data/stand.$DATE.tar.gz /stand/
tar -pzcf /backup/data/usr.$DATE.tar.gz /usr/
tar -pzcf /backup/data/var.$DATE.tar.gz /var/
tar -pzcf /backup/data/emul.$DATE.tar.gz /emul/
tar -pzcf /backup/data/home.$DATE.tar.gz /home/
tar -pzcf /backup/data/selinux.$DATE.tar.gz /selinux/
tar -pzcf /backup/data/srv.$DATE.tar.gz /srv/
tar -pzcf /backup/data/ssl.$DATE.tar.gz /ssl/
tar -pzcf /backup/data/sys.$DATE.tar.gz /sys/
ls -lah > /backup/data/rootmap
cp /quota* /backup/data/
I'd like to get daily backups of everything, but at least of /var/ since it contains my email, websites, and sql databases.
Any free, simple solutions out there to run such automated backups to my secondary hdd?
11-29-2010, 08:41 AM | #3
Member | Registered: Oct 2006 | Location: Utah | Posts: 520
See previous post
Last edited by fordeck; 11-29-2010 at 08:43 AM.
Reason: duplicate
11-29-2010, 09:25 AM | #4
LQ 5k Club | Registered: Dec 2008 | Location: Tamil Nadu, India | Distribution: Debian | Posts: 8,578
9 out of 10 cat owners who were asked said their cats prefer rsync. Me, I like Bacula with crunchy fish bits but it's not what any sane puss would call simple.
11-29-2010, 12:55 PM | #5
LQ Newbie | Registered: Oct 2010 | Location: Madrid - Spain | Distribution: RHEL | Posts: 26
rsync...simple and really effective
11-29-2010, 01:01 PM | #6
LQ 5k Club | Registered: May 2001 | Location: Belgium | Distribution: Arch | Posts: 8,529
+1 for rsync
Quote:
# Removes the tarball from the previous execution.
rm -rf /backup/data/*.tar.gz
BTW, I would remove the old backups AFTER the new backup is made.
Kind regards
Last edited by repo; 11-29-2010 at 01:03 PM.
11-29-2010, 10:25 PM | #7
Member | Registered: Jan 2004 | Posts: 59 | Original Poster
any suggestions for the rsync thing?
i'd like to do a backup every 24 hrs on my /backup/ harddrive.
1) /var/www/
2) /var/vmail/
3) /var/lib/mysql
not sure how it'd work out...
11-29-2010, 11:27 PM | #8
Member | Registered: Oct 2009 | Distribution: Hackintosh, SlackWare | Posts: 267
+1 for rsync. Also
Code:
#!/bin/sh
# Dates the new tarballs of current builds.
DATE=`date +%m_%d_%Y`
# Directory structure that is being tarballed.
# Have this be what is found in your "/" root
# excluding the /backup/ partition.
tar -pzcf /backup/data/bin.$DATE.tar.gz /bin/
tar -pzcf /backup/data/boot.$DATE.tar.gz /boot/
tar -pzcf /backup/data/dev.$DATE.tar.gz /dev/
tar -pzcf /backup/data/dist.$DATE.tar.gz /dist/
tar -pzcf /backup/data/etc.$DATE.tar.gz /etc/
tar -pzcf /backup/data/kernel.$DATE.tar.gz /kernel*
tar -pzcf /backup/data/lib.$DATE.tar.gz /lib
tar -pzcf /backup/data/root.$DATE.tar.gz /root/
tar -pzcf /backup/data/sbin.$DATE.tar.gz /sbin/
tar -pzcf /backup/data/stand.$DATE.tar.gz /stand/
tar -pzcf /backup/data/usr.$DATE.tar.gz /usr/
tar -pzcf /backup/data/var.$DATE.tar.gz /var/
tar -pzcf /backup/data/emul.$DATE.tar.gz /emul/
tar -pzcf /backup/data/home.$DATE.tar.gz /home/
tar -pzcf /backup/data/selinux.$DATE.tar.gz /selinux/
tar -pzcf /backup/data/srv.$DATE.tar.gz /srv/
tar -pzcf /backup/data/ssl.$DATE.tar.gz /ssl/
tar -pzcf /backup/data/sys.$DATE.tar.gz /sys/
ls -lah > /backup/data/rootmap
cp /quota* /backup/data/
# Removes tarballs from previous runs, keeping the ones just created.
find /backup/data/ -name '*.tar.gz' ! -name "*.$DATE.tar.gz" -exec rm -f {} \;
Works more safely than what you had.
11-30-2010, 02:18 AM | #9
LQ 5k Club | Registered: May 2001 | Location: Belgium | Distribution: Arch | Posts: 8,529
Quote:
Originally Posted by crypted
any suggestions for the rsync thing?
i'd like to do a backup every 24 hrs on my /backup/ harddrive.
1) /var/www/
2) /var/vmail/
3) /var/lib/mysql
not sure how it'd work out...
Start by reading the documentation for rsync.
Then create a script to back up your directories.
Make sure the script works.
Create a cronjob for the script.
Some info:
http://www.thegeekstuff.com/2010/09/...mand-examples/
http://www.cyberciti.biz/tips/tag/rsync-examples
Kind regards
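As a rough illustration, a daily script along those lines might look like the sketch below (assuming the secondary drive is mounted at /backup and using the three directories from post #7; the script name and all paths are only examples). The mysqldump step is there because copying /var/lib/mysql while MySQL is running can leave you with an inconsistent copy.
Code:
#!/bin/sh
# Illustrative sketch only: daily rsync mirror to the /backup drive.
DEST=/backup/daily
mkdir -p "$DEST"

# Dump all databases first so the SQL data is consistent
# (assumes root can authenticate, e.g. via ~/.my.cnf).
mysqldump --all-databases > "$DEST/all-databases.sql"

# Mirror the directories; --delete keeps the copies in sync with the source.
rsync -a --delete /var/www/       "$DEST/www/"
rsync -a --delete /var/vmail/     "$DEST/vmail/"
rsync -a --delete /var/lib/mysql/ "$DEST/mysql/"
Saved as, say, /usr/local/bin/daily-backup.sh and made executable, it could then be run once a day from root's crontab:
Code:
# min hour dom mon dow  command
30 3 * * * /usr/local/bin/daily-backup.sh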
11-30-2010, 06:54 AM | #10
Senior Member | Registered: Aug 2007 | Location: Massachusetts, USA | Distribution: Solaris 9 & 10, Mac OS X, Ubuntu Server | Posts: 1,197
I like rsync too, and use it regularly. However, the original post did say incremental. That would be rdiff-backup, which uses librsync. See http://www.nongnu.org/rdiff-backup/, or, for the general concept and to roll your own with rsync, see http://www.mikerubel.org/computers/rsync_snapshots/.
I would also note that in the original configuration, rather than deleting all the old tarballs with the *, you could use `find /backup/data/ -name '*.tar.gz' -mtime +7 -exec rm {} \;`. That would remove any tarballs older than 7 days. Of course, make sure you have enough space for the extras, since they are all full backups. You can do incrementals with gnutar, but then you're getting more complicated (scripting which day to run fulls, incrementals on others, removing the older ones, etc.), and you might as well use something off the shelf.
As far as cat owners go, I like Amanda, and it's not too complicated to get running (there is a quick start with backup to disk on the wiki); but both Amanda and Bacula come into their own in networked environments with multiple machines being backed up. For a single computer, there are many simpler solutions.
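For the rdiff-backup route, basic usage on the directories discussed above might look roughly like this (paths are only examples); the target keeps a current mirror plus reverse increments, so older versions stay restorable:
Code:
#!/bin/sh
# Illustrative sketch only: incremental backups with rdiff-backup.
rdiff-backup /var/www   /backup/rdiff/www
rdiff-backup /var/vmail /backup/rdiff/vmail

# Prune increments older than 30 days to cap disk usage
# (add --force if more than one increment has to be removed at once).
rdiff-backup --remove-older-than 30D /backup/rdiff/www
rdiff-backup --remove-older-than 30D /backup/rdiff/vmail
A file as it looked a few days ago can then be pulled back with something like `rdiff-backup -r 3D /backup/rdiff/www/index.html /tmp/index.html`.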
12-11-2010, 04:46 PM | #11
Member | Registered: Jan 2004 | Posts: 59 | Original Poster
I've tried using the find command above from choogendyk.
It doesn't work.
Code:
my:/backup# find /backup/databases/ -name '*.sql' -mtime +30 -exec rm {} \
>
Just sits there at >
Code:
my:/backup/databases# find /backup/databases/ -name '*.sql' -mtime +30 -exec rm {}
find: missing argument to `-exec'
How could I get this to work?
It'd be very useful to remove backup files, whether tar.gz archives or SQL dumps (not incremental), after so many days...
12-11-2010, 09:02 PM | #12
Member | Registered: Nov 2010 | Distribution: Debian Lenny | Posts: 136
Try putting a ; on the end.
Code:
my:/backup# find /backup/databases/ -name '*.sql' -mtime +30 -exec rm {} \;
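For what it's worth, GNU find can also do the same cleanup without the escaped semicolon, for example:
Code:
find /backup/databases/ -name '*.sql' -mtime +30 -delete
find /backup/databases/ -name '*.sql' -mtime +30 -exec rm {} +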
12-11-2010, 10:24 PM | #13
Member | Registered: Jan 2004 | Posts: 59 | Original Poster
Ah yes, correct you are! Thanks for catching that...
12-12-2010, 02:41 AM | #14
Member | Registered: Nov 2010 | Distribution: Debian Lenny | Posts: 136
Quote:
Originally Posted by crypted
Ah yes, correct you are! Thanks for catching that...
Yep, I'll probably do that a few more times too!
|
|
|