LinuxQuestions.org
Linux - Software This forum is for Software issues.
Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.

Old 09-05-2003, 08:20 PM   #1
Electrode
Member
 
Registered: Oct 2002
Location: Michigan
Distribution: Slackware, LFS, Gentoo
Posts: 158

Rep: Reputation: 16
Best way to back up about 200 GB of files to DVDs?


Recently, I decided to buy a DVD-RW drive to take the place of the CD-RW in my NAS box. It appears to be working as it should. I found out that cdrecord doesn't work with DVDs, so I set up dvdrtools, a fork of cdrtools with DVD writing support.

Anyway, what I would like to know is: what would be the best way to back up the contents of the NAS box, about 200 GB? It is a nearly impossible task to manually go through thousands of directories, adding up 4.7 GB chunks and running them through mkisofs. I tried a cdrtools frontend, "CD Bake Oven", but its size display becomes inaccurate around the 4 GB mark, making it useless. I would be open to suggestions for other such frontends.

The easiest way I can think of is just making a spanned tar, but that would require me to spend a few hours swapping disks just to get to a single file. Not something I want to do.

Suggest away.
 
Old 09-05-2003, 09:27 PM   #2
RolledOat
Member
 
Registered: Feb 2003
Location: San Antonio
Distribution: Suse 9.0 Professional
Posts: 843

Rep: Reputation: 30
Hmmm, interesting problem. Here's a thought. Navigate to the top of the directory tree to be backed up and run:

du -k > verylargelist

The output is in the format Size<spaces>dirname. OpenOffice can import it into a spreadsheet using fixed-width columns to get the size and dirname into two columns.

I would then insert a column before the first one, with a running-total formula copied into every row, in the format:

    A         B      C
1   =0+B1     size   /dirname
2   =A1+B2    size   /dirname
3   =A2+B3    size   /dirname

You could then, and this is going to be one mother of a spreadsheet, go down the first column until the running total hits about 4.5 GB (just under a DVD's capacity). Export all the rows up to that point to a flat file and use the dirnames as input to tar.

It may take a while, but you end up with an index of all the files that go on each DVD, in the form of the tar create script. I can't think of a better way to back it all up and still be able to extract the files as you need them.
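The same running total can be done straight from the shell, skipping the spreadsheet. Here's a minimal sketch; the split_discs name is my own invention, and du's -S flag keeps each directory's size from double-counting its subdirectories in the per-disc totals:

```shell
# Group directories into DVD-sized lists, one discN.list per disc.
# split_discs is a hypothetical helper name; limit is in KB.
split_discs() {
  top=$1; limit=$2
  du -Sk "$top" | awk -v limit="$limit" '
    BEGIN { n = 0 }
    { if (sum + $1 > limit) { n++; sum = 0 }   # current disc is full, start the next
      sum += $1
      print $0 > ("disc" n ".list") }'
}
# e.g.: split_discs /mnt/nas 4500000
# then archive each list:  cut -f2- disc0.list | tar -cvf disc0.tar -T -
```

Each discN.list holds "size<TAB>dirname" lines totalling at most the limit (a single directory larger than the limit still lands alone on its own list, so those would need splitting by hand).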

Well, if nothing else, maybe I gave you something else to think about.

RO
 
Old 09-06-2003, 04:42 PM   #3
Electrode
Member
 
Registered: Oct 2002
Location: Michigan
Distribution: Slackware, LFS, Gentoo
Posts: 158

Original Poster
Rep: Reputation: 16
Thanks for the suggestion, RolledOat.

However, I have come upon another solution that seems to be working quite well.

First, make a directory on a filesystem whose free space is at least the total size of the data to be backed up plus the size of the media.

Second, switch to the top directory of the tree to be backed up. Run:

tar -c -L size -vMf /path/to/workingdir/file.tar *

size is the size of the media in KB, minus a few megs.

When it prompts you to change the media, rename file.tar to something meaningful and then touch file.tar. Continue until done.

Now, you can burn these tars to optical media as-is, or you can do as I am doing and make ISO-9660 filesystems which you can readily access.

First, begin extracting the archive. This is done with tar xvMf file.tar, where file.tar is the name of the first part of the archive. When it prompts for the next part, kill it and note the file or directory it was working on. Delete that file or directory.

Now run mkisofs against what you have so far, then burn it. Delete the extracted files and the iso file you just created.

Next, run tar -x --starting-file file -vMf file.tar, where file is the file or directory that was being extracted when you killed tar the last time, and file.tar is, again, the first part of the archive. When it prompts to switch volumes, rename file.tar to something else and rename the second part of the archive to file.tar. When you get to the end of that archive, repeat the above process.

There you have it.
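The manual rename-on-prompt step above can be automated with GNU tar's -F (--info-script) option, which runs a script at every volume change instead of prompting. A small sketch, using demo-sized data; the newvol.sh helper and the vol-N.tar naming are my own, and the 100 KB volume size (-L is in KB) stands in for a real DVD-sized value like 4500000:

```shell
#!/bin/sh
set -e
work=$(mktemp -d)
mkdir -p "$work/src"
# demo data: two 60 KB files, so a 100 KB volume limit forces a volume change
dd if=/dev/zero of="$work/src/a.bin" bs=1024 count=60 2>/dev/null
dd if=/dev/zero of="$work/src/b.bin" bs=1024 count=60 2>/dev/null
# run at each volume change: rename the just-filled file.tar so tar can
# reopen file.tar for the next part
cat > "$work/newvol.sh" <<'EOF'
#!/bin/sh
n=$(ls vol-*.tar 2>/dev/null | wc -l)
mv file.tar "vol-$((n + 1)).tar"
EOF
chmod +x "$work/newvol.sh"
cd "$work"
# -M multi-volume, -L volume size in KB, -F volume-change script
tar -c -M -L 100 -F ./newvol.sh -f file.tar -C src .
# the final part never triggers the script, so rename it by hand
mv file.tar "vol-$(( $(ls vol-*.tar | wc -l) + 1 ))".tar
```

To restore everything in one go, GNU tar accepts one -f per volume: tar -x -M -f vol-1.tar -f vol-2.tar, and each vol-N.tar can also be fed to mkisofs individually as described above.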

Last edited by Electrode; 09-07-2003 at 01:19 PM.
 
Old 09-06-2003, 05:58 PM   #4
Electro
LQ Guru
 
Registered: Jan 2002
Posts: 6,042

Rep: Reputation: Disabled
200 gigabytes is a lot even for tape backup. DVDs can only store about 4 gigabytes safely, and 200 divided by 4 is about 50 DVDs. You could use online data storage; I use streamload.com for some of my data, although online storage only works well if you have a broadband connection. It's useless over a 56 kbit modem. Using DVDs for backup, you also run into problems reading them: not many DVD readers can handle DVD-R, DVD+R, DVD-RW, and DVD+RW. DVD drives aren't very fast either; you will be there for a few days backing up and restoring.

You can also use a hard drive, since prices are coming down. A Maxtor 300 gigabyte 5400 RPM drive should be good enough. Buy a removable drive bay to make the backup process easier.

Tape drives or hard drives are still better for your application.
 
Old 09-06-2003, 06:12 PM   #5
RolledOat
Member
 
Registered: Feb 2003
Location: San Antonio
Distribution: Suse 9.0 Professional
Posts: 843

Rep: Reputation: 30
What you stated seems like a good solution. About the only drawback is that you don't get any compression. If the end result is a .tgz, any Linux/Unix file manager can navigate through it. Unless the 200 gigs is all binaries, a fully compressed 200 gig tar would probably come out to ~120 gigs or so. Will your process still work if you add the z flag? That way it compresses as it goes: you stop, move, extract, then re-tar with a new name with the z option, and just pop the .tgz onto the disc as a simple data DVD.
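One wrinkle with that: GNU tar refuses to combine -z with -M, so a multi-volume archive can't be compressed on the fly. Compression would have to be applied per volume after each part is closed out. A sketch, where compress_vols and the vol-N.tar naming are hypothetical:

```shell
# Compress each finished volume individually, since tar itself
# rejects -z together with -M (multi-volume).
compress_vols() {
  for v in vol-*.tar; do
    [ -e "$v" ] && gzip "$v"    # vol-1.tar -> vol-1.tar.gz, etc.
  done
}
# to restore, gunzip a part back to vol-N.tar before handing it to tar -x -M
```

Note that tar -x -M needs the raw, uncompressed volumes, so each part has to be gunzipped before extraction.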

RO
 
Old 09-06-2003, 06:20 PM   #6
Electrode
Member
 
Registered: Oct 2002
Location: Michigan
Distribution: Slackware, LFS, Gentoo
Posts: 158

Original Poster
Rep: Reputation: 16
Electro: I was using a 250 GB Maxtor HD in a USB 2.0 external housing for backups, but I decided that it would be more useful as part of my storage array due to my growing need for permanent storage. (I'm using almost 500 megs a day now.)

Using DVDs will free up that drive, and all the other hard drives I was using for backups, for "active" storage, and let me add another 250 gigs of backup and archive space for around $50 whenever it's needed.

As for the speed issue, I used to use a USB 1.1 hard drive housing.
 
Old 09-06-2003, 06:24 PM   #7
Electrode
Member
 
Registered: Oct 2002
Location: Michigan
Distribution: Slackware, LFS, Gentoo
Posts: 158

Original Poster
Rep: Reputation: 16
RolledOat: The data I am working with is almost 100% binary, much of it compressed. Therefore using the z or j flag in tar would do nothing but slow it down.
 
Old 09-07-2003, 03:28 AM   #8
Electro
LQ Guru
 
Registered: Jan 2002
Posts: 6,042

Rep: Reputation: Disabled
DVD-R discs don't last long. Before you buy, read the label on the DVDs to find out how long you should wait before replacing a DVD-R disc. I think the gold DVD-Rs last the longest, but they're very costly.

I guess you can't wait for holographic storage to come to the public. It can store about a terabyte of data in something about the size of a sugar cube. I heard and read in the mid-1990s that they had problems rewriting to holographic storage. It may come out soon.
 
Old 09-07-2003, 03:54 AM   #9
Mega Man X
LQ Guru
 
Registered: Apr 2003
Location: ~
Distribution: Ubuntu, FreeBSD, Solaris, DSL
Posts: 5,339

Rep: Reputation: 64
Unconnected, but... does this also apply to CD-Rs? I've read that a CD-R can last as long as an ordinary CD, depending on how they were stored and protected. But it's still very controversial.

About your problem, the only thing I can think of is using another hard drive, internal or external, for this task. Burning 50 DVDs is painful and some brands are quite expensive. Hard drives are cheaper now and always quite handy.
 
Old 09-09-2003, 05:08 PM   #10
Electro
LQ Guru
 
Registered: Jan 2002
Posts: 6,042

Rep: Reputation: Disabled
CD-Rs and CD-RWs don't last as long as ordinary CDs. Read the label; each manufacturer is different.

In the old days, when we didn't have CD-R, ZIP, or JAZ, we had to use floppies or expensive tape. Many people had stacks (100+) of floppies and spent a whole day swapping and verifying disks. DVDs will take even longer.

A bunch of hard drives will be better than DVDs. You just have to park the heads before shutting down or removing them. Magnetic material can last much longer than CDs.

There are other mediums that are better than DVDs.
 
  

