Linux - Security
This forum is for all security related questions. Questions, tips, system compromises, firewalls, etc. are all included here.
03-15-2007, 09:56 AM | #1
Senior Member
Registered: Feb 2006
Location: Austria
Distribution: Ubuntu 12.10
Posts: 1,142
How do you people do your backups?
After some severe data loss I'm getting a little more concerned about the issue.
I do backups, but not regularly and without much of a system. I know several ways to do it, but I'm trying to get input from some of you people directly.
I'm not sure how to exclude certain directories from being, for example, packed into a tar file by a cron job (the first thing that comes to mind). Also, I don't want backups to be overwritten by the next run right away. I need a weekly backup of about 30 GB (obviously only changes should be included), and I'm willing to sacrifice a 250 GB drive which I currently use for my static backups; about 100 GB of it are free.
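For the "tar from a cron job" idea, tar's `--exclude` flag plus a date-stamped archive name covers both concerns: skipping directories, and never overwriting the previous backup. A minimal self-contained sketch (all paths here are throwaway examples, not real system paths):

```shell
#!/bin/sh
# Demonstrate tar's --exclude in a disposable sandbox.
work=$(mktemp -d)
mkdir -p "$work/src/keep" "$work/src/cache"
echo data > "$work/src/keep/file.txt"
echo junk > "$work/src/cache/tmp.bin"

# A date-stamped name means a later run never clobbers an earlier archive.
stamp=$(date +%Y-%m-%d)
tar -czf "$work/backup-$stamp.tar.gz" --exclude='cache' -C "$work" src

# List the archive: the cache directory should be absent.
tar -tzf "$work/backup-$stamp.tar.gz"
```

In a real crontab you would point the same command at your home directory and the backup drive, e.g. one line like `0 3 * * 0 tar -czf /mnt/backup/home-$(date +\%F).tar.gz --exclude='.cache' -C /home oskar` (note that `%` must be escaped in crontab entries).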
Last edited by oskar; 03-15-2007 at 09:59 AM.
03-15-2007, 11:26 AM | #2
Member
Registered: Dec 2005
Location: Ohio, USA
Distribution: OpenSuSE 10.2
Posts: 74
I use dar with a custom Bash script to do all my backups. Dar can do what I think you need: it does incrementals and differentials based on previous backups, it can slice the backup into smaller files, and you can exclude directories. Best of all, it also records file deletions in your differential backups. For instance, say you do a full backup and then delete certain files. When you do a differential backup based on that full backup, dar will record in the differential that those files were deleted. That way you can restore your system to its exact state by first restoring the full backup, then applying the differential. I've restored my system countless times using dar and, so far, never had a problem.

The Bash script runs from cron in the wee hours of the morning, when the system is up but quiet. I also have it change to runlevel 2 to reduce system chatter while the backup is running. Make sure you trap signals in your script in case the system wants to shut down in the middle of a backup; that way you can clean up before it goes down and not be left with a corrupted backup file.

I could send you my Bash script if you'd like to look at it. Hope this helps.
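The full/differential cycle described above looks roughly like this with dar's command-line flags (a sketch only; the paths and archive basenames are made up, and the flags are per the dar manual: `-c` create, `-R` root, `-z` compress, `-s` slice, `-P` prune, `-A` reference archive):

```shell
#!/bin/sh
# Sunday: full backup of /home, gzip-compressed, pruning a cache
# directory, sliced into 700 MB pieces.
dar -c /backup/full_$(date +%F) -R /home -z -s 700M -P 'oskar/.cache'

# Weekdays: differential against the most recent full archive
# (-A takes the slice basename, without the .1.dar extension).
dar -c /backup/diff_$(date +%F) -R /home -z -A /backup/full_2007-03-11

# Restore: extract the full archive first, then apply the differential,
# which also replays the recorded file deletions:
#   dar -x /backup/full_2007-03-11 -R /restore
#   dar -x /backup/diff_2007-03-15 -R /restore -w
```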
03-15-2007, 12:45 PM | #3
Member
Registered: Apr 2006
Location: Cape Town, South Africa
Distribution: Gentoo 2006.1 (2.6.17-gentoo-r7)
Posts: 222
Windows: 'ntbackup'
GNU/Linux: tar, then copy to CD-Rs or DVD-Rs
03-15-2007, 12:49 PM | #4
Member
Registered: Sep 2006
Location: Dayton, Ohio
Distribution: Slackware 12, Fedora Core, PCLinuxOS
Posts: 194
For Linux, a bash script that is run nightly by a cron job. The script tars the necessary directories, then copies them to a network share on another machine, which is in turn backed up to tape.
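A hypothetical version of that nightly job, demonstrated here against throwaway directories instead of a real mounted share (the function and all paths are made-up examples, not the poster's actual script):

```shell
#!/bin/sh
# Tar the given directory, then copy the archive to a second location
# (in production that would be a mounted network share).
nightly_backup() {
    src=$1; share=$2
    stamp=$(date +%Y%m%d)
    archive="$(dirname "$src")/nightly-$stamp.tar.gz"
    tar -czf "$archive" -C "$(dirname "$src")" "$(basename "$src")"
    cp "$archive" "$share/"
}

# Demo run in a sandbox.
demo=$(mktemp -d)
mkdir -p "$demo/etc-copy" "$demo/share"
echo 'config' > "$demo/etc-copy/app.conf"
nightly_backup "$demo/etc-copy" "$demo/share"
ls "$demo/share"
```

The real version would be invoked from a crontab entry such as `30 2 * * * /usr/local/bin/nightly_backup.sh`.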
03-15-2007, 03:07 PM | #5
Member
Registered: Nov 2004
Location: Horseheads, New York
Distribution: Mandriva 2010.1 / KDE 4.5.2, Slax, Knoppix, Backtrack & etc...
Posts: 198
I run an hourly cron job that calls a script. The script uses rsync over ssh to do an incremental backup to a partition (on a remote machine) dedicated to this purpose. The script can be written to rename the last backup and create a new one if you want, but that consumes more bandwidth. If you want "rolling" backups like that, you'd probably only run it once daily, at night when the bandwidth won't be missed.
I've been using rsync to keep three complete copies of all my data, and I have never lost anything. In fact, I haven't lost any data since I went 100% Linux some 8 years ago.
One of the copies (all automated, mind you) goes onto a removable USB hard drive, so if (God forbid) there's a fire or other emergency, I can grab the drive and run.
I enjoy serious peace of mind about data security, all thanks to rsync and ssh. It's elegantly simple to do.
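The usual trick for rolling rsync backups is `--link-dest`: each snapshot looks like a full copy, but unchanged files are hard-linked against the previous snapshot, so only changes cost disk and bandwidth. A local sketch (sandbox paths only; over ssh the destination would be `user@host:/path` instead):

```shell
#!/bin/sh
# Rolling snapshots with rsync --link-dest in a disposable sandbox.
root=$(mktemp -d)
mkdir -p "$root/data"
echo 'hello' > "$root/data/a.txt"

# First snapshot: a plain full copy.
rsync -a "$root/data/" "$root/snap.1/"

# Something changes, then the next snapshot hard-links unchanged
# files against snap.1 instead of copying them again.
echo 'world' > "$root/data/b.txt"
rsync -a --link-dest="$root/snap.1" "$root/data/" "$root/snap.2/"

ls "$root/snap.2"
```

Because `a.txt` did not change, `snap.1/a.txt` and `snap.2/a.txt` end up as two names for the same inode.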
03-15-2007, 03:36 PM | #6
Senior Member
Registered: Jul 2005
Distribution: Slackware
Posts: 1,274
I, too, use dar run from a Bash script. My script is set up to read a parameter that determines what I want to back up at the moment: either the whole system or just my home directory.
I had problems using dd and tar in the past because if they ran into a problem reading a disk sector they would stop. Dar will skip the file, print an error on the screen, and keep going.
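The parameter-driven dispatch described above can be sketched with a simple `case` statement. This is a made-up illustration, with tar standing in for dar so the sandbox demo is runnable anywhere; the poster's real script would call dar in each branch:

```shell
#!/bin/sh
# Sandbox tree to back up.
root=$(mktemp -d)
mkdir -p "$root/home/user" "$root/etc"
echo x > "$root/home/user/notes.txt"
echo y > "$root/etc/app.conf"

# One script, one argument choosing the backup scope.
backup() {
    case "$1" in
        home)   tar -czf "$root/home.tar.gz" -C "$root" home ;;
        system) tar -czf "$root/system.tar.gz" -C "$root" home etc ;;
        *)      echo "usage: backup {home|system}" >&2; return 1 ;;
    esac
}

backup home
backup system
ls "$root"/*.tar.gz
```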