LinuxQuestions.org
Linux - Security: This forum is for all security-related questions. Questions, tips, system compromises, firewalls, etc. are all included here.

Old 03-13-2010, 11:54 PM   #1
LinuxLearn
LQ Newbie
 
Registered: Mar 2010
Posts: 5

Rep: Reputation: Disabled
Need Advice For: Secure and Automated Backups


Greetings Everyone!

I'm trying to find a secure way to back up files from my Prod server to my Backup server. It must be automated, so I will need a cron job on the Backup server that logs in to the Prod server and backs up the data.

1. Do you think it would be secure enough to create a passwordless RSA private key on the Backup server and add its public key to the authorized_keys file on the Prod server? I can't think of a way to automate this without entering a password unless I use a passwordless key. Is there another, more secure way?
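For context, here is roughly what I mean (the key filename, username, and hostname are just placeholders):

```shell
# On the Backup server: generate an RSA key pair with an empty passphrase.
ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/backup_key

# Install the public key on the Prod server
# (the file there is ~/.ssh/authorized_keys):
ssh-copy-id -i ~/.ssh/backup_key.pub backupuser@prod.example.com
```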

2. Should I create a dedicated backup user that has only read access to the files in the directory I'm backing up? If so, how can I verify that this backup user really does have read access to ALL files in the folder I intend to back up? How can I ensure the backup process won't skip files because of a permission problem?
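Something like this is what I'd want to be able to run as a check (assumes GNU find; "backupuser" and /srv/data are placeholders). Empty output would mean nothing is unreadable:

```shell
# List anything under the backup root that the backup user cannot read.
# Directories also need execute (search) permission to be descended into,
# so check for that case as well.
sudo -u backupuser find /srv/data \( ! -readable \) -o \( -type d ! -executable \)
```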

3. I'm thinking of using the rsnapshot tool, which uses rsync. Is there another, recommended way?

I'm a newbie, so please forgive me if some questions seem stupid.

Thank you.
 
Old 03-14-2010, 12:11 AM   #2
troop
Member
 
Registered: Feb 2010
Distribution: gentoo, arch, fedora, freebsd
Posts: 379

Rep: Reputation: 96
1. It's secure enough.
2. Only back up directories that the backup user owns.
3. I would suggest http://www.amanda.org/
 
Old 03-14-2010, 12:30 AM   #3
catkin
LQ 5k Club
 
Registered: Dec 2008
Location: Tamil Nadu, India
Distribution: Servers: Debian Squeeze and Wheezy. Desktop: Slackware64 14.0. Netbook: Slackware 13.37
Posts: 8,531
Blog Entries: 27

Rep: Reputation: 1176
Bacula can also meet your requirements.

There is much discussion on the net about the relative merits of Bacula and Amanda. After studying the discussions I chose Bacula because its Windows client and GUI are important for the users' situation, more important in this case than Amanda's relative ease of configuration and despite Bacula's co-requisites (an SQL database and graphical packages for the GUI).

If you are looking into Amanda, it helps to know that Zmanda is the commercial offering and Amanda the less fully featured free version.
 
Old 03-14-2010, 09:17 PM   #4
choogendyk
Senior Member
 
Registered: Aug 2007
Location: Massachusetts, USA
Distribution: Solaris 9 & 10, Mac OS X, Ubuntu Server
Posts: 1,189

Rep: Reputation: 105
Ah. And I studied the discussions and chose Amanda.

See http://blogs.umass.edu/choogend/2007...-about-amanda/ for an explanation of why I made that choice and how I felt about it afterwards.

Zmanda is not the commercial offering. Zmanda is the company that provides commercial support and development contributions to Amanda; Amanda Enterprise is the commercial offering. From my perspective, all the important developments are fed straight into the Community Edition of Amanda. I personally don't care about GUIs, and I care very much about simplicity, directness, and minimizing extra packages and overhead. Of course, anyone making this decision really needs to look at all sides for themselves and decide according to their own needs.

Interestingly, in this particular case, what the OP requests would probably be ill served by either Amanda or Bacula.

I think the choices laid out in the original question are appropriate. By the way, you can find a very good howto on public key authentication at http://sial.org/howto/openssh/publickey-auth/.
 
Old 03-15-2010, 02:53 PM   #5
LinuxLearn
LQ Newbie
 
Registered: Mar 2010
Posts: 5

Original Poster
Rep: Reputation: Disabled
The best way to do backups

I'm writing this for anyone who may have a similar goal in the future. Here's what I decided to do after a lot of research:

1. The initiator of the backup process will be the production machine, which hosts all the files that need to be backed up. Let's call it "Prod".

2. Prod will connect to the backup machine (let's call it "Backup") and upload the backup data to it. The connection will use a passphrase-protected private RSA key on Prod and the corresponding public key on Backup.

3. To automate this, I will install keychain on Prod to manage my RSA keys, so that cron can run the tasks and log in to Backup with the RSA key, using the passphrase stored securely by keychain. Here's an article on how to accomplish passwordless login with securely passphrased RSA keys: http://www.ibm.com/developerworks/library/l-keyc2/
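As a sketch of the cron side (paths are keychain's defaults; "rsnapshot daily" is just an example invocation):

```shell
#!/bin/bash
# Cron wrapper script on Prod.
# keychain writes the ssh-agent environment variables to
# ~/.keychain/<hostname>-sh; sourcing that file lets this
# non-interactive shell reach the agent that already holds
# the decrypted key.
source "$HOME/.keychain/$(hostname)-sh"
rsnapshot daily
```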

4. I will use the rsnapshot tool, which uses rsync to do the backups. It's solid and has all the features I require, so I have no need to look at alternatives (including Bacula, Amanda, etc.).

5. The only thing I need to keep an eye on: if the server is restarted, I'd have to log in once to re-enter the keychain passphrase so that the automated scripts can keep working. For this I'll set up an email to be sent to me as soon as the machine reboots, prompting me to log in once and enter the passphrase.

6. This will back up the filesystem, but prior to running the backup process on this machine, I'll run mysqldump to back up my databases as tar.gz files. So, for example: MySQL backed up to tar.gz at 9:45 PM and the filesystem backed up at 10:00 PM.
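For reference, the dump step would look something like this (username, password variable, and paths are all hypothetical; here the dump is gzipped directly rather than tarred):

```shell
#!/bin/bash
# Dump all databases to a dated, compressed file before the
# filesystem backup runs. --single-transaction gives a consistent
# snapshot for InnoDB tables without locking.
STAMP=$(date +%F)
mysqldump --all-databases --single-transaction -u backup -p"$MYSQL_PW" \
    | gzip > "/var/backups/mysql/all-databases-$STAMP.sql.gz"
```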


This takes care of all my backup needs. rsnapshot will keep 30 daily snapshots plus 4 weekly, a few monthly, and a few yearly. I will back up the data to 2 separate offsite servers and never worry about losing data again.

Hope this helps someone in the future.
 
2 members found this post helpful.
Old 03-15-2010, 04:52 PM   #6
beadyallen
Member
 
Registered: Mar 2008
Location: UK
Distribution: Fedora, Gentoo
Posts: 209

Rep: Reputation: 36
Glad you've got a system worked out. However, I would caution against having your 'Prod' server performing the backup.

As you've pointed out, you need to keep the backup server's keys on the Prod server. I'm assuming the Prod server is more exposed than the backup servers (the backup servers should certainly be very secure, with strict access controls), so if the Prod server were ever hacked or compromised, you'd potentially have given the attacker access to the backups as well. IMHO it would be much better to have the backup servers connect to the Prod server to run the backup. Your Prod server shouldn't have any rights to access the backup server directly.

From a quick look at the rsnapshot web page it seems the above would be supported.

Also, if you use ssh's authorized_keys file, you can specify a command to run when the backup user logs in. For instance, you could dump a database before the backup begins, then let rsnapshot do the backup.
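As a sketch, an entry in ~/.ssh/authorized_keys on the Prod side might look like this (run-backup.sh is a hypothetical wrapper script, and the key material is elided):

```shell
# One line per key in ~/.ssh/authorized_keys on Prod. Whatever command
# the client asks for, sshd runs only run-backup.sh, and all forwarding
# and tty allocation are refused for this key.
command="/usr/local/bin/run-backup.sh",no-pty,no-port-forwarding,no-agent-forwarding,no-X11-forwarding ssh-rsa AAAA... backup@backupserver
```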

Good luck.
 
1 members found this post helpful.
Old 03-15-2010, 06:36 PM   #7
choogendyk
Senior Member
 
Registered: Aug 2007
Location: Massachusetts, USA
Distribution: Solaris 9 & 10, Mac OS X, Ubuntu Server
Posts: 1,189

Rep: Reputation: 105
I totally agree with beadyallen's comments.

My backup server is locked down so that it can only be accessed from the local private net from a handful of IP addresses. It has no services running on it aside from sshd. It initiates backups and accesses services on other servers as needed.

The authorized_keys entries are really locked down as well. This is described in detail in `man sshd`, and the http://sial.org/howto/openssh/publickey-auth/ howto I gave above also has examples. The authorized_keys entry on each server is restricted so that only the backup server can use it, no kind of forwarding is allowed, and only one command (the backup command) is permitted.
 
1 members found this post helpful.
Old 03-15-2010, 10:36 PM   #8
catkin
LQ 5k Club
 
Registered: Dec 2008
Location: Tamil Nadu, India
Distribution: Servers: Debian Squeeze and Wheezy. Desktop: Slackware64 14.0. Netbook: Slackware 13.37
Posts: 8,531
Blog Entries: 27

Rep: Reputation: 1176
Quote:
Originally Posted by LinuxLearn View Post
6. This will back up the filesystem, but prior to running the backup process on this machine, I'll run mysqldump to back up my databases as tar.gz files. So, for example: MySQL backed up to tar.gz at 9:45 PM and the filesystem backed up at 10:00 PM.
How about doing both tasks in a single script? That way there is no danger of backing up the files before the database dump has finished.
 
Old 03-16-2010, 12:46 AM   #9
LinuxLearn
LQ Newbie
 
Registered: Mar 2010
Posts: 5

Original Poster
Rep: Reputation: Disabled
Quote:
How about doing both tasks in a single script? That way there is no danger of backing up the files before the database dump has finished.
catkin,

I need to run the mysqldump command before the backup, separately from it, because I have to back up the files to a second backup server too and don't want to dump MySQL twice.

beadyallen,

You made a very good point; I will probably do it as you suggested and have the Backup servers initiate the backup. Specifying the allowed commands definitely helps a lot with security; thanks for mentioning that.

choogendyk,

Thank you for a very good link; there's very useful information there. I also came across this, which should be considered too: http://www.linux.com/archive/feed/61061

Thanks a lot, guys.
 
Old 03-16-2010, 07:43 PM   #10
chrism01
Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Centos 6.5, Centos 5.10
Posts: 16,246

Rep: Reputation: 2025
Catkin's point re mysqldump remains valid; I've seen system backups fail (usually the restore failed, which is worse because it's too late to fix) because they ASSUMED the mysqldump (or equivalent) had completed. You need to know, so find some way to incorporate a check into your backup script.
 
Old 03-17-2010, 05:44 AM   #11
LinuxLearn
LQ Newbie
 
Registered: Mar 2010
Posts: 5

Original Poster
Rep: Reputation: Disabled
Quote:
Catkin's point re mysqldump remains valid; I've seen system backups fail (usually the restore failed, which is worse because it's too late to fix) because they ASSUMED the mysqldump (or equivalent) had completed. You need to know, so find some way to incorporate a check into your backup script.
chrism01, will do. Thanks.
 
  



Tags
backup, rsa, rsnapshot, security


