LinuxQuestions.org
Linux - Software This forum is for Software issues.
Old 03-01-2003, 11:57 PM   #1
doublefailure
Member
 
Registered: Mar 2002
Location: ma
Distribution: slackware
Posts: 747

Rep: Reputation: 30
recommended backup solution?


hello..
My hard disk got corrupted and I ended up buying a new computer.
It was a very painful lesson. Back up!

I'm thinking of adding a second hard drive and finding some backup software to regularly back up from hd1 to hd2.

Any suggestions on anything?

thank you..
 
Old 03-02-2003, 02:04 AM   #2
bulliver
Senior Member
 
Registered: Nov 2002
Location: Edmonton AB, Canada
Distribution: Gentoo x86_64; Gentoo PPC; FreeBSD; OS X 10.9.4
Posts: 3,760
Blog Entries: 4

Rep: Reputation: 78
Use cpio and cron
man cpio
man cron

I usually just use CD-Rs as they are so cheap, and I don't usually need to back up that much.

If you want automatic backup software check this link:
http://www.linuxarchives.com/backup.html

Last edited by bulliver; 03-02-2003 at 02:07 AM.
 
Old 03-02-2003, 02:08 AM   #3
rnturn
Senior Member
 
Registered: Jan 2003
Location: Illinois (Chicago area)
Distribution: Red Hat (8.0, RHEL5,6), CentOS, SuSE (10.x, 11.x, 12.2, 13.2), Solaris (8-10), Tru64, MacOS, Raspian
Posts: 1,108

Rep: Reputation: 64
Re: recommended backup solution?

Quote:
Originally posted by doublefailure
my harddisk got currupted and i ended up buying new computer
it is very painful lesson.
Ouch! What happened to the old computer?
Quote:
i'm thinking of having additional hd and find some backup software to regularly backup from hd1 to hd2..
Well if you're just going to install a second disk, you could partition it just as you did your primary drive and duplicate the filesystems on a nightly basis using something as simple (and free!) as tar or cpio. Check out the info page for cpio, especially the section on copying directory structures. Essentially you'd do:

cd fs_mnt_pt ; find . -depth -xdev -print0 | cpio --null -pvd /mnt/dup_fs_mnt_pt

Using tar, it'd be a command like:

cd fs_mnt_pt ; tar clf - . | ( cd /mnt/dup_fs_mnt_pt ; tar xf - )

to copy a directory structure as well. (I'm sure someone will spot a switch I forgot... :-) )

Backup strategies need to take into account the risk you're trying to reduce: user error, hardware failure, etc. For example, having your backup physically installed in the system you're trying to protect won't be much protection if a power surge damages the whole system. Plus, using an internal drive as your backup medium won't help much if you realize that you screwed up a file yesterday and last night's backup just made a perfect copy of the corrupted file onto your secondary drive. Unless, that is, you want to shut down on a daily basis to swap in a new hard disk so you have a rotation of disks used for backups.

Is tape out of the question? It's not as fast as a disk-to-disk backup but if you can fit everything you need to backup onto a single tape, it's something you can kick off at the end of the day and remove the following morning. It can cost a bit up front (for hardware and a bit of media) to start backing up to tape. But so can lost data.

Good luck...
 
Old 03-02-2003, 04:08 AM   #4
whansard
Senior Member
 
Registered: Dec 2002
Location: Mosquitoville
Distribution: RH 6.2, Gen2, Knoppix,arch, bodhi, studio, suse, mint
Posts: 3,185

Rep: Reputation: 52
If the second hard drive is at least as big as the first, you can

dd if=/dev/hda of=/dev/hdb

and that will copy the drive. It will go much faster if you add bs=1M to that line.

You can back up individual partitions with

cat /dev/hda1 | gzip > /mnt/wherever/hda1.gz

then restore with

gzip -cd /mnt/wherever/hda1.gz > /dev/hda1

You can make the free space on a partition more compressible by running dd if=/dev/zero of=full, then deleting the file once it has filled up the free space on the partition, then dd'ing. You'll have to jump through another hoop if you want files bigger than 2 gigs.

Copying a 40 gig drive on my machine with dd takes about 35 minutes. You can run

dd if=/dev/hda of=/dev/hdb bs=1M; poweroff

to have your machine shut down when it's done copying.

Copying all the files with tar or cp takes much longer unless the drive is mostly empty.
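The partition-image trick boils down to two one-liners; sketched here as functions run against an ordinary file so they are safe to try. On a real system the first argument would be a device such as /dev/hda1, and the partition should be unmounted first:

```shell
#!/bin/sh
# image_partition DEV IMG -- save a gzip-compressed raw image of DEV.
image_partition() {
    gzip -c < "$1" > "$2"
}

# restore_partition IMG DEV -- write the image back over DEV.
restore_partition() {
    gzip -cd < "$1" > "$2"
}
```

Doing the /dev/zero fill-and-delete trick first, as described above, makes the compressed image much smaller.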
 
Old 03-02-2003, 10:50 AM   #5
doublefailure
Member
 
Registered: Mar 2002
Location: ma
Distribution: slackware
Posts: 747

Original Poster
Rep: Reputation: 30
My old comp was a laptop (800).. so I decided to buy a new one (desktop).

Thanks for the replies.
How would I back up regularly, say once a week? I'm just gonna back up /home and /usr/local/apache/, maybe some more directories.

/home will be in a separate partition, but not apache and the others..

I'd like to figure it out on my own, but I don't have a computer to search with =( .. this is my friend's comp.

thanks..
 
Old 03-02-2003, 11:21 AM   #6
Dave Skywatcher
Member
 
Registered: Feb 2003
Distribution: Debian
Posts: 127

Rep: Reputation: 16
If you're going to back up frequently, the most cost-effective solution would be CD-RWs. The media are dirt cheap compared with other solutions, and there's no risk of accidentally corrupting your data as there would be with a second hard drive (as long as the data is good when you put it on there, of course). Just set cron to back up the appropriate directories once a week, with a prompt before it starts so you can make sure the correct disc is mounted in the drive.

You can reuse a quality CD-RW more than 10,000 times, so even one will last you for about 200 years of weekly backups. And, of course, you can use the drive for reading CDs and for CD-R burning as well.
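One possible shape for that cron job, assuming mkisofs and cdrecord are installed; the device address 0,0,0 is a placeholder (check `cdrecord -scanbus` for yours), and the paths are examples:

```shell
# Illustrative crontab entry (edit with `crontab -e`):
# every Sunday at 03:00, build an ISO of /home and burn it to the
# CD-RW in the drive, blanking the disc first.
0 3 * * 0  mkisofs -R -quiet -o /tmp/weekly.iso /home && cdrecord dev=0,0,0 blank=fast -data /tmp/weekly.iso
```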
 
Old 03-02-2003, 01:22 PM   #7
arnold
Member
 
Registered: Dec 2002
Posts: 226

Rep: Reputation: 30
With cron, twice a week, I save important changed files to a partition on a second hard drive, using something similar to the following:
/bin/tar --use-compress-program bzip2 -cvf $FILE $(SHOW)
where the "SHOW" script is similar to the following:
find "HOMEDIRECTORY" -type f -mtime -30 | egrep -v '\.gz|\.tgz|\~$|\.swp|core$|core\.|\.gif$|\.jpg$|\.zip$|\.ZIP'

The egrep excludes "unworthy" larger files.

Every month or six weeks, I save to CD-R.
I'd also suggest running rpm -qa > rpmfile and saving rpmfile, so you have a record of your installed packages.

A bit hairy, but it reduces manual steps and has saved me.
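Put together, that scheme might look like the following sketch; `backup_changed` and the paths are invented for illustration (assumes GNU tar, find, and bzip2):

```shell
#!/bin/sh
# backup_changed DIR ARCHIVE -- tar up files under DIR modified in the
# last 30 days, skipping "unworthy" extensions, bzip2-compressed.
backup_changed() {
    dir=$1; archive=$2
    list=/tmp/showlist.$$
    # List recently changed files, minus the excluded extensions.
    find "$dir" -type f -mtime -30 \
        | egrep -v '\.gz$|\.tgz$|~$|\.swp$|core$|\.gif$|\.jpg$|\.zip$|\.ZIP$' \
        > "$list"
    # -T reads the list of files to archive from a file.
    tar --use-compress-program bzip2 -cf "$archive" -T "$list"
    rm -f "$list"
}
```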
 
  

