LinuxQuestions.org


TBKDan 10-27-2009 03:13 PM

Encrypt backups with GPG to multiple tapes
 
Currently, I use tar to write my backups (ntbackup files) to a tape drive fed by an autoloader.

Ex:
Code:

tar -F /root/advancetape -cvf /dev/st0 *.bkf
(/root/advancetape just has the logic to advance to the next tape if one is available, or to notify me to swap the tapes out)
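
The script itself isn't posted anywhere in this thread, but a minimal sketch of what such a tar info-script might look like follows (hypothetical: the changer device /dev/sg1 and the "admin" mail alias are assumptions). With GNU tar, -F implies -M (multi-volume); the script named by -F runs at the end of each tape, and a non-zero exit aborts the archive.

Code:

#!/bin/sh
# Hypothetical sketch of /root/advancetape (the real script was not posted).
# tar runs this at end of tape; exit 0 = continue on a fresh tape.
CHANGER=/dev/sg1        # assumed autoloader control device
# mtx 'next' unloads the drive and loads the next tape in sequence.
if mtx -f "$CHANGER" next >/dev/null 2>&1; then
    exit 0
fi
# Magazine exhausted: ask a human to swap tapes, then abort this run.
echo "Autoloader empty - swap tapes" | mail -s "tape swap needed" admin
exit 1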

I was recently handed the requirement to encrypt our tape backups. I can easily encrypt the data with GPG; the problem is how to write it to multiple tapes with the same logic tar uses to advance to the next tape once the current one is filled. I cannot write the encrypted file to disk first (2+ TB). As far as I can tell, tar will not accept binary input from stdin (it's looking for file names). Any ideas? :(

SethsdadtheLinuxer 10-27-2009 03:17 PM

Just tar the gpg file. Since it can't be opened without the key, it's still secure. Actually, tar does accept stdin input (tar cv -). If this is wrong, do a Google on "pipe tar".

TBKDan 10-27-2009 03:38 PM

Quote:

Originally Posted by SethsdadtheLinuxer (Post 3734457)
Just tar the gpg file. Since it can't be opened without the key, it's still secure. Actually, tar does accept stdin input (tar cv -). If this is wrong, do a Google on "pipe tar".

You can pipe the binary output of tar (the tarball) to another application, and you can pipe a list of files to be tarred to tar, but you cannot pipe the data to be tarred to tar (unless there's a magical option that's not in the manpages).

Code:

~# tar -cvf - testdisk-6.11.3/ | gpg --batch --disable-mdc --symmetric --cipher-algo AES256 --passphrase-fd 3 3<$GPG_KEY  | tar -cvf test.tar -
gpg: WARNING: unsafe ownership on configuration file `/home/dan/.gnupg/gpg.conf'
testdisk-6.11.3/
testdisk-6.11.3/README
testdisk-6.11.3/COPYING
testdisk-6.11.3/AUTHORS
testdisk-6.11.3/ChangeLog
testdisk-6.11.3/linux/
testdisk-6.11.3/linux/photorec_static
tar: -: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors
gpg: [stdout]: write error: Broken pipe
gpg: DBG: deflate: iobuf_write failed
gpg: build_packet failed: file write error
gpg: [stdout]: write error: Broken pipe
gpg: DBG: deflate: iobuf_write failed
gpg: [stdout]: write error: Broken pipe
gpg: iobuf_flush failed on close: file write error
gpg: symmetric encryption of `[stdin]' failed: file write error
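
To be explicit about which directions do work: you can pipe the archive out of tar, and you can pipe file names into tar with -T -, but create mode has no flag that reads archive *data* from stdin; the trailing "tar -cvf test.tar -" above just treats "-" as a file name, hence the "Cannot stat" error.

Code:

# Piping the archive out of tar works:
tar -cf - testdisk-6.11.3/ | gpg --batch --symmetric --cipher-algo AES256 \
    --passphrase-fd 3 3<$GPG_KEY > backup.tar.gpg

# Piping a list of file *names* into tar works (-T - reads names from stdin):
find . -name '*.bkf' | tar -cvf /dev/st0 -T -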


chrism01 10-27-2009 07:54 PM

How about a divide-and-conquer strategy, i.e. tar/encrypt in batches rather than all files in one go?
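
Something along these lines, perhaps (a rough sketch: /backups and /scratch are made-up paths, the $GPG_KEY passphrase file is reused from the earlier post, and the non-rewinding device /dev/nst0 is used so each batch is appended to the tape instead of overwriting it):

Code:

# Hypothetical batch loop: needs scratch space >= the largest single file.
for f in /backups/*.bkf; do
    out=/scratch/$(basename "$f").gpg
    gpg --batch --symmetric --cipher-algo AES256 \
        --passphrase-fd 3 3<$GPG_KEY -o "$out" "$f"
    # /dev/nst0 doesn't rewind on close, so each tar appends a new archive
    tar -F /root/advancetape -cvf /dev/nst0 "$out"
    rm "$out"
done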

TBKDan 10-27-2009 08:00 PM

Quote:

Originally Posted by chrism01 (Post 3734707)
How about a divide-and-conquer strategy, i.e. tar/encrypt in batches rather than all files in one go?

The backup files vary in size, and some of them are larger than the amount of free space I have on the volume (>0.5 TB). Unfortunately, it's just not feasible :(

chrism01 10-27-2009 08:15 PM

Sounds like this system is right on the edge anyway; you should always have enough space to move a few files around.
Explain to your manager how tricky a restore could be if you've got insufficient disk space.

Lateral thinking options:
Time to hit up the manager for another disk. If you've got multiple servers, get a dedicated backup server; something basic, but with loads of disk space.

cpio:
Not so commonly used these days, but it can handle binary streams, tar archives, etc.: http://www.gnu.org/software/cpio/manual/cpio.html
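
E.g., GNU cpio in copy-out mode takes the file names on stdin, and when writing straight to a tape device it prompts for the next tape at end of media (untested sketch; /backups is a made-up path):

Code:

# File names come in on stdin; the archive goes to -O.
# At end of media, GNU cpio prompts for the next device/file name.
find /backups -name '*.bkf' | cpio -o -H crc -O /dev/st0

Note that, like tar, cpio archives the files named on stdin rather than raw data, so the encrypted stream would still need to exist as a file somewhere.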

TBKDan 10-27-2009 08:41 PM

Quote:

Originally Posted by chrism01 (Post 3734727)
Sounds like this system is right on the edge anyway; you should always have enough space to move a few files around.
Explain to your manager how tricky a restore could be if you've got insufficient disk space.

Lateral thinking options:
Time to hit up the manager for another disk. If you've got multiple servers, get a dedicated backup server; something basic, but with loads of disk space.

cpio:
Not so commonly used these days, but it can handle binary streams, tar archives, etc.: http://www.gnu.org/software/cpio/manual/cpio.html

This is the dedicated backup system, and we do have space to move files around (~300GB), just not enough space given the situation (if we needed to restore from tape, we have other places we could put it in a pinch).

We have a SAN in the works, but I have to make do with what I have available at the moment.

I'll check out cpio tomorrow - hopefully it has support for multiple tapes :) Thanks for the input.

choogendyk 10-28-2009 07:21 AM

Quote:

Originally Posted by TBKDan (Post 3734756)
This is the dedicated backup system

But you're still doing things by hand (or with scripts?) with tar?

If you are dealing with multiple systems, a dedicated backup system, and terabytes of space, I would think it is way past time to adopt backup software that supports that kind of thing and automates the process for you. If I were doing it, I would implement Amanda. But it sounds like you may have some other issues to resolve first, such as space and resources on your backup server compared to your backup needs. Amanda can handle the encryption as well as the other things you are dealing with; but, like anything, it needs adequate disk space to work with.

TBKDan 10-28-2009 08:19 AM

Quote:

Originally Posted by choogendyk (Post 3735233)
But you're still doing things by hand (or with scripts?) with tar?

If you are dealing with multiple systems, a dedicated backup system, and terabytes of space, I would think it is way past time to adopt backup software that supports that kind of thing and automates the process for you. If I were doing it, I would implement Amanda. But it sounds like you may have some other issues to resolve first, such as space and resources on your backup server compared to your backup needs. Amanda can handle the encryption as well as the other things you are dealing with; but, like anything, it needs adequate disk space to work with.

We have adequate disk space for the task at hand - I still have free space on the backup volume. It just happens that two of the backups are larger than the free space available, making encrypting to disk first infeasible. Right now, all of our Windows servers use ntbackup to back themselves up to a Samba share on this Ubuntu backup server. Once the backups are complete, I have a tar script that writes the files to the tape drive, advancing the tapes as required. This server *used* to be Windows with BackupExec, but it performed like crap and BackupExec was buggy, so I wrote an in-house application to control when the backups run and scripted this tar command to write the files to tape. Normally, the only human intervention is to change the tapes when it emails me to.

I had glanced at Amanda, but it seemed a bit of overkill for what I was looking for - I was hoping there would be a relatively simple modification to what I already have working.

chrism01 10-29-2009 12:05 AM

Well, the three computing limits are CPU, memory, and disk. In your case disk is the limiting factor, and you can't trade off against the other two. You must always have enough disk space to do a recovery (plus temp working space), but it sounds like you don't for at least two files. In my experience, this will only get worse; systems always get bigger, never smaller.

From your 1st post, this is a long-term requirement, not a quick one-off, so, IMHO, you're going to have to explain to your manager that you need more disk if he wants to go ahead with gpg. It's the price of admission.

I agree with choogendyk that it's worth seriously looking at one integrated product, e.g. Amanda, which can do all of that for you, including Linux, MS Windows & Mac.
There's a HOWTO on setting up a Linux+MS backup in 15 minutes using Amanda here: http://www.zmanda.com/quick-backup-setup.html

TBKDan 10-29-2009 09:10 AM

Quote:

Originally Posted by chrism01 (Post 3736137)
Well, the three computing limits are CPU, memory, and disk. In your case disk is the limiting factor, and you can't trade off against the other two. You must always have enough disk space to do a recovery (plus temp working space), but it sounds like you don't for at least two files. In my experience, this will only get worse; systems always get bigger, never smaller.

From your 1st post, this is a long-term requirement, not a quick one-off, so, IMHO, you're going to have to explain to your manager that you need more disk if he wants to go ahead with gpg. It's the price of admission.

I agree with choogendyk that it's worth seriously looking at one integrated product, e.g. Amanda, which can do all of that for you, including Linux, MS Windows & Mac.
There's a HOWTO on setting up a Linux+MS backup in 15 minutes using Amanda here: http://www.zmanda.com/quick-backup-setup.html

The system is already maxed out on drives, and we're not going to invest in a new system when we have a SAN on the way (see above). It works fine, and I would be amazed if we had >300GB of additional data in the span of 6 months. If I need to restore, I have an additional ~3TB array on USB disks available. I don't use it for backups because it is too slow for four concurrent gigabit backups. Like I said, I have ways of restoring, and that is not my concern.

I have to use ntbackup because Amanda does not back up system state like ntbackup does (unless there has been a new development in the past few months). From what I saw before, it looks like Amanda can back up to tape with encryption, but I was hoping I could keep it simple with something like tar. No need to over-complicate something if I don't have to.

chrism01 10-30-2009 01:09 AM

I guess until the SAN arrives, try cpio.
Do keep us updated.
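
If cpio doesn't pan out, another possibility (untested sketch; mbuffer isn't otherwise discussed in this thread) is mbuffer, which can write a stream straight to tape and run an autoloader command at end of media via its -A option, so the gpg output never touches disk:

Code:

# Sketch only: stream tar -> gpg -> mbuffer straight to tape.
# -A runs the given command whenever the current volume fills.
tar -cf - *.bkf \
  | gpg --batch --symmetric --cipher-algo AES256 --passphrase-fd 3 3<$GPG_KEY \
  | mbuffer -s 256k -A "/root/advancetape" -o /dev/nst0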

