Linux - Security: This forum is for all security-related questions.
Questions, tips, system compromises, firewalls, etc. are all included here.
After some severe data loss I'm getting a little more concerned about the issue.
I do backups, but not regularly and without much of a system. I know several ways to do it, but I'd like some first-hand input from some of you.
I'm not sure how to exclude certain directories from being, for example, packed into a tar file by a cron job (first thing that comes to mind).
Also, I don't want backups to be overwritten by the next run right away. I need a weekly backup of about 30 GB (obviously only changes should be included), and I'm willing to sacrifice a 250 GB drive which I currently use for my static backups; about 100 GB of it is free.
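For the tar-plus-cron idea mentioned above, directories can be excluded with tar's --exclude option. A minimal sketch, assuming GNU tar; the paths and the ".cache" exclusion are placeholders (the demo data is created inline so the sketch runs anywhere):

```shell
#!/bin/sh
# Sketch: exclude directories when packing a tar file from a cron job.
# Paths are placeholders; GNU tar's exclusion semantics are assumed.
SRC=/tmp/demo-home
DEST=/tmp/demo-home.tar.gz

# demo data so the sketch is self-contained
mkdir -p "$SRC/docs" "$SRC/.cache"
echo "keep" > "$SRC/docs/notes.txt"
echo "skip" > "$SRC/.cache/junk.txt"

# with GNU tar, any directory component named ".cache" (and its
# contents) is left out of the archive
tar -czf "$DEST" --exclude='.cache' -C /tmp demo-home
```

Listing the archive afterwards (tar -tzf) shows docs/notes.txt but nothing under .cache, which is the behaviour you'd want from a cron-driven tar backup.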
I use dar with a custom Bash script to do all my backups. Dar is capable of doing what I think you need: it will do incrementals and differentials based on previous backups, and it can slice the backup into smaller files. You can also exclude directories, and best of all, it records file deletions in your differential backups. For instance, say you do a full backup and then end up deleting certain files. When you do a differential backup based on that full backup, dar will record that you deleted those files. That way, you can restore your system to its exact state by first restoring your full backup, then applying the differential. I've restored my system countless times using dar and so far, never a problem.

The Bash script is set to run from cron in the wee hours of the morning, when the system is up. I also have it change to runlevel 2 to reduce any system chatter while the backup is running. Make sure you trap any system signals in your Bash script, just in case the system wants to shut down in the middle of a backup; that way you can clean up before it goes down and not be left with a corrupted backup file.

I could send you my Bash script if you would like to look at it. Hope this helps.
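The signal-trapping advice above can be sketched as follows. This is not the poster's actual script; it uses tar rather than dar so it runs anywhere, and all paths are placeholders — the trap logic is the same whichever archiver you use:

```shell
#!/bin/bash
# Sketch: trap shutdown signals during a backup and delete the
# half-written archive so it is never mistaken for a good one.
# SRC and ARCHIVE are placeholder paths for the example.
SRC=/tmp/demo-data
ARCHIVE=/tmp/demo-backup.tar.gz

# demo data so the sketch is self-contained
mkdir -p "$SRC" && echo "some data" > "$SRC/file.txt"

cleanup() {
    rm -f "$ARCHIVE"   # don't leave a truncated archive behind
    exit 1
}
trap cleanup INT TERM HUP

tar -czf "$ARCHIVE" -C "$(dirname "$SRC")" "$(basename "$SRC")"

trap - INT TERM HUP    # backup finished cleanly; disarm the trap
echo "wrote $ARCHIVE"
```

If the system sends SIGTERM mid-archive, cleanup() runs before the script dies, so the next restore can never pick up a truncated file.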
I run an hourly cron job that calls a script. The script uses rsync over ssh to do an incremental backup to a partition (on a remote machine) dedicated to this purpose. The script can be written to rename the last backup and create a new one, if you want, but this will consume more bandwidth. If you wanted to do it with "rolling" backups, you'd only want to run it perhaps once daily, at night, when the bandwidth won't be missed.
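The rename-and-rotate idea described above might look like this. The destination directory and the three-generation depth are hypothetical, and the actual rsync-over-ssh copy is shown only as a comment since the remote host is site-specific:

```shell
#!/bin/sh
# Sketch of "rolling" backups: keep the last three runs as
# backup.0 (newest) through backup.2 (oldest). DEST is a
# placeholder; the post syncs to a remote partition instead.
DEST=/tmp/rolling
mkdir -p "$DEST"

rm -rf "$DEST/backup.2"                                  # oldest falls off
[ -d "$DEST/backup.1" ] && mv "$DEST/backup.1" "$DEST/backup.2"
[ -d "$DEST/backup.0" ] && mv "$DEST/backup.0" "$DEST/backup.1"
mkdir "$DEST/backup.0"                                   # fresh snapshot dir
echo "run at $(date)" > "$DEST/backup.0/stamp"           # demo marker

# the actual copy step would be something along the lines of:
# rsync -a --delete -e ssh /home/ backuphost:/srv/backups/backup.0/
```

Each run pushes the previous snapshot one slot down, so yesterday's backup is never overwritten by today's — which addresses the original poster's concern about backups being clobbered right away.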
I've been using rsync to keep three complete copies of all my data, and I have never lost anything. In fact, I haven't lost any data since I went 100% Linux some 8 years ago.
One of the copies (all automated, mind you) goes onto a USB removable hard drive, so if (God forbid) there's ever a fire or other emergency, I can grab the removable and run.
I enjoy serious data-security peace of mind, all thanks to rsync and ssh. It's elegantly simple to do.
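For reference, the kind of automated rsync copy described above can be as simple as a crontab entry. The paths, schedule, and mount point here are hypothetical, not the poster's setup:

```shell
# m  h  dom mon dow   command
# Nightly at 02:30: mirror /home onto a mounted USB drive.
# --delete keeps the copy exact; drop it if you want files
# deleted from /home to survive on the backup.
30 2 * * *   rsync -a --delete /home/ /mnt/usb-backup/home/
```

A second entry pointing at a remote host over ssh (rsync -a -e ssh /home/ host:/backups/home/) gives the off-machine copy; the USB copy is what you grab in an emergency.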