LinuxQuestions.org > Forums > Linux Forums > Linux - Software
Old 01-18-2005, 07:57 AM   #1
jlinkels
Senior Member
 
Registered: Oct 2003
Location: Bonaire
Distribution: Debian Wheezy/Jessie/Sid, Linux Mint DE
Posts: 4,236

Rep: Reputation: 545
Using (s)tar for back-up to hard disk creating very large files


Hello all,

This week for the umpteenth time my tape streamer broke down. This time it was a Quantum 40 GB DLT. The tape was ruined, and I cannot make backups anymore on our company network.

In the past 10 years, I think I have gone through five (professional) tape streamers. Not only does it take a lot of time and hassle to get another tape streamer running; more importantly, all data recovery depends on something as flimsy and unreliable as a tape streamer.

I might consider buying another streamer, but as a second resort (and a temporary solution) I want to set up a Linux machine with some large hard disks and perform backups of my primary file server over the network.

What I would do is implement a multi-level dump strategy, with one regular full backup, and several incremental backups after that, until the next full backup.
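For illustration, that cycle could be sketched with GNU tar's --listed-incremental snapshots (star offers equivalent dump levels); all paths and file names below are invented for the example:

```shell
#!/bin/bash
# Sketch of one full + incremental dump cycle using GNU tar's
# --listed-incremental mode. Paths are throwaway examples.
set -eu
SRC=$(mktemp -d)
BK=$(mktemp -d)
echo "payroll" > "$SRC/a.txt"

# Level 0: full dump; the snapshot file records what was saved
tar --listed-incremental="$BK/snap0" -cf "$BK/full.tar" -C "$SRC" .

# A day later a new file appears; level 1 saves only the changes
echo "invoices" > "$SRC/b.txt"
cp "$BK/snap0" "$BK/snap1"
tar --listed-incremental="$BK/snap1" -cf "$BK/incr1.tar" -C "$SRC" .

tar -tf "$BK/incr1.tar"   # the unchanged a.txt is not re-archived
```

Each incremental works from a copy of the previous snapshot, so the chain can be replayed in order at restore time.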

I think that way I have covered off-line storage, with a strategy that lets me recover damaged data from a week or two back.

I plan to use the 'star' program for that purpose. 'Star' is a 'tar'-like program with improved functionality and performance. I know how to use it.

The question is: when I create the backup the first time, I would have to create and write a 40 GB tar file over the network to the backup box. Will that be possible, or does 'tar' choke on such sizes? I already do this at home with 10 GB disks, but to tape. Is there any fundamental difference between writing to a hard disk and writing to tape?

BTW, the primary file server is a Windoze box (I know, shame on me), and I would have to set up the Linux box as a Samba client. Backups will be initiated from the Linux box.
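A hypothetical sketch of that setup (server name, share, user, and paths are all invented, not a tested configuration; a 2005-era box would use smbmount, current kernels use mount -t cifs):

```shell
# Mount the Windows server's share read-only on the Linux backup box,
# archive it locally, then unmount. Everything below (//winsrv, the
# share name, the backup user, /backup) is a made-up example.
mkdir -p /mnt/winsrv
mount -t cifs //winsrv/company-data /mnt/winsrv -o ro,username=backup
tar -cf /backup/full-$(date +%Y%m%d).tar -C /mnt/winsrv .
umount /mnt/winsrv
```

Mounting read-only keeps the backup job from ever writing to the file server by accident.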

Any comments?

jlinkels
 
Old 01-18-2005, 08:15 AM   #2
whansard
Senior Member
 
Registered: Dec 2002
Location: Mosquitoville
Distribution: RH 6.2, Gen2, Knoppix, 98,2000 + various
Posts: 3,171

Rep: Reputation: 52
if you run into trouble with the file sizes, use split.
tar czf - /your/data | split --bytes=650m - /1/hda.tgz-

this splits the output into 650 MB chunks.
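A quick round trip showing that the chunks reassemble losslessly with cat (temporary files, sizes shrunk for the example):

```shell
#!/bin/bash
# Demonstrate the split + cat round trip on a throwaway file.
set -eu
D=$(mktemp -d)
head -c 100000 /dev/urandom > "$D/archive.tgz"          # stand-in for the tar stream
split --bytes=40000 "$D/archive.tgz" "$D/archive.tgz-"  # -> -aa -ab -ac
cat "$D"/archive.tgz-* > "$D/rejoined.tgz"
cmp "$D/archive.tgz" "$D/rejoined.tgz" && echo "identical"
```

At restore time the same pipeline feeds tar directly: cat /1/hda.tgz-* | tar xzf -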
 
Old 01-18-2005, 09:37 AM   #3
Sepero
Member
 
Registered: Jul 2004
Location: Tampa, Florida, USA
Distribution: Ubuntu
Posts: 734
Blog Entries: 1

Rep: Reputation: 31
Re: Using (s)tar for back-up to hard disk creating very large files

Quote:
Originally posted by jlinkels
The question is: when I create the backup the first time, I would have to create and write a 40 GB tar file over the network to the backup box. Will that be possible, or does 'tar' choke in such sizes?
tar was originally created for Tape ARchiving. I seriously doubt you'll have any problems.
 
Old 10-25-2005, 09:55 PM   #4
jlinkels
Senior Member
 
Registered: Oct 2003
Location: Bonaire
Distribution: Debian Wheezy/Jessie/Sid, Linux Mint DE
Posts: 4,236

Original Poster
Rep: Reputation: 545
Just to finish this matter.

This system has now been running since January 2005. I make a full backup every two weeks, alternating between two hard disks. Additionally, I make an incremental backup every day and write it to two alternating files on a third disk.
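The alternation can be driven by simple date arithmetic in the backup script; a bash sketch (the mount points are invented for illustration):

```shell
#!/bin/bash
# Pick backup targets by date parity: the full-backup disk flips every
# two ISO weeks, the incremental slot flips daily. Paths are examples.
week=$(date +%V)   # ISO week number, e.g. 07
day=$(date +%j)    # day of year, e.g. 032
full_disk=$(( (10#$week / 2) % 2 + 1 ))   # 1 or 2
incr_slot=$(( 10#$day % 2 ))              # 0 or 1
echo "full backup  -> /backup/disk$full_disk/full.tar"
echo "incremental  -> /backup/disk3/incr$incr_slot.tar"
```

The 10# prefix forces base-10 so that zero-padded dates like 08 are not read as invalid octal.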

It is not a perfect back-up system: I can only restore from 1 or 2 days before, and then from 2 or 4 weeks before.

In the meantime I have gotten a new tape streamer and use it for weekly backups.

The system with the disks works very reliably and really is install-and-forget. Quite a number of times I have saved the day for one of our users after he had accidentally overwritten his file, or MS decided to write an empty file to disk instead of the full one.

Restore is easy: just enter the command to restore the file (a complicated command, but it still fits on one line), forget about it, and some time later the file is back. The only drawback is that it can take up to 2 hours for star to find the file in a 40 GB backup on a 1 GHz machine. The operation is very reliable, however. My script writes a log file with all file names to disk, so it is easy to pick the correct file name. I use this more often than restore from tape.
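A minimal sketch of that flow with GNU tar (star's options differ; every path here is invented): log each archived name with -v at backup time, grep the log for the lost file's exact path, then extract only that member.

```shell
#!/bin/bash
# Back up with a name log, then restore a single member found via the log.
set -eu
D=$(mktemp -d)
mkdir -p "$D/src/docs"
echo "quarterly report" > "$D/src/docs/report.txt"

# -v prints every archived name to stdout; capture it as the log
tar -cvf "$D/full.tar" -C "$D/src" . > "$D/backup.log"

# User lost report.txt: find its exact archived path in the log...
member=$(grep 'report\.txt' "$D/backup.log")
# ...and extract only that member into a recovery directory
mkdir "$D/restore"
tar -xf "$D/full.tar" -C "$D/restore" "$member"
```

Extracting a named member still scans the archive sequentially, which is why a single-file restore from a 40 GB archive takes so long.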

Just wanted to let you know this, although it is trivial.

jlinkels
 
  

