Old 08-25-2004, 02:20 PM   #1
imagirlgeek
LQ Newbie
 
Registered: Aug 2004
Posts: 8

Rep: Reputation: 0
Are incremental backups possible/safe?


Hello -

I need to conserve space on my backup drive ... are incremental backups safe, or are full backups my best bet?

If I'm better off using incremental backups, what syntax should I be using?
For full, I'm currently using the command:

tar -czvf /directory/to/backup/to/backup.tgz /directory/to/backup
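
If incremental turns out to be the way to go, my guess is GNU tar's snapshot option is what I'd need. Here's a rough sketch of what I have in mind (the paths and filenames are just placeholders, not my real ones):

# Weekly full backup: remove the snapshot file first so tar archives everything
rm -f /root/backup.snar
tar -czvf /backup/full.tgz --listed-incremental=/root/backup.snar /directory/to/backup

# Nightly incremental: reuse the same snapshot file, so only files
# changed since the previous run get archived
tar -czvf /backup/incr-$(date +%a).tgz --listed-incremental=/root/backup.snar /directory/to/backup

Is that the right idea?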

Any advice would be greatly appreciated! I am sort of under the gun. Thanks!!
 
Old 08-25-2004, 03:17 PM   #2
Tinkster
Moderator
 
Registered: Apr 2002
Location: earth
Distribution: slackware by choice, others too :} ... android.
Posts: 23,067
Blog Entries: 11

Rep: Reputation: 928
Hi, and welcome to LQ!

The question is not which is better, the question is: what are you trying to achieve? :)


Cheers,
Tink
 
Old 08-25-2004, 03:27 PM   #3
imagirlgeek
LQ Newbie
 
Registered: Aug 2004
Posts: 8

Original Poster
Rep: Reputation: 0
Hi Tink,

Thanks ... I'll give some background.

I am at a small company that hosts maybe 6 or 8 client websites and their log files ... the log files get very large over time. I have been instructed to do nightly backups of our server, which I place on a secondary backup drive - but there's only room for about 10 .gz files before the disk space is completely used up.

We have a machine here in the office on which I can store archived backups, but I learned pretty quickly that ftp-ing a 2gig file takes several hours! I thought that if I could do a weekly full backup, and then incremental backups for the four other days, that might be more efficient. I'd be able to download the backups more quickly to the "archive" machine and I'd be able to store more on the secondary server.

I hope that makes sense. Thanks!
 
Old 08-25-2004, 04:09 PM   #4
Tinkster
Moderator
 
Registered: Apr 2002
Location: earth
Distribution: slackware by choice, others too :} ... android.
Posts: 23,067
Blog Entries: 11

Rep: Reputation: 928
Quote:
Originally posted by imagirlgeek
Hi Tink,

Thanks ... I'll give some background.

I am at a small company that hosts maybe 6 or 8 client websites and their log files ... the log files get very large over time. I have been instructed to do nightly backups of our server, which I place on a secondary backup drive - but there's only room for about 10 .gz files before the disk space is completely used up.
Are you using logrotate to break the logs down into chunks of a more manageable size?
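
If not, here's a minimal sketch of what I mean (assuming Apache logs under /var/log/httpd; adjust the path and counts to your setup):

# /etc/logrotate.d/clientsites: rotate weekly, keep 8 compressed generations
/var/log/httpd/*log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
}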

Quote:
We have a machine here in the office on which I can store archived backups, but I learned pretty quickly that ftp-ing a 2gig file takes several hours!
What type of server hardware are you writing to, and what kind of connection are we talking about here? On 100 Mbit full duplex I'm getting a throughput of around 9.7 MB/s ... that should make around 4 minutes for 2 gigs (2048 MB / 9.7 MB/s ≈ 211 seconds, plus protocol overhead).


Quote:
I thought that if I could do a weekly full backup, and then incremental backups for the four other days, that might be more efficient. I'd be able to download the backups more quickly to the "archive" machine and I'd be able to store more on the secondary server.

I hope that makes sense. Thanks!
Does make sense; the question is how fast a disaster recovery would be possible. Have you got a clone of the base installation somewhere on a spare HDD or something?
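
For what it's worth: if you go the tar snapshot route from your sketch above, a restore means replaying the weekly full first and then each nightly incremental in order, along these lines (filenames assumed from your sketch):

# Restore the full backup, then each incremental in date order.
# /dev/null as the snapshot file tells GNU tar to apply the
# incremental metadata without recording a new snapshot.
tar -xzvf /backup/full.tgz --listed-incremental=/dev/null -C /restore/target
tar -xzvf /backup/incr-Mon.tgz --listed-incremental=/dev/null -C /restore/target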


Cheers,
Tink
 
Old 08-25-2004, 04:11 PM   #5
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Rocky 9.2
Posts: 18,397

Rep: Reputation: 2777
If you use rsync, it will automatically create or update (i.e. transfer differences only) the target files as appropriate. It also does compression, which will speed things up across the LAN, and you can also ask it to tunnel over ssh if you eventually decide to send backups out of the office ... or just want them inaccessible to unauthorised people.
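
A minimal sketch (user, host and paths are placeholders, not your real ones):

# -a preserves permissions/ownership/times, -v is verbose,
# -z compresses data in transit, -e ssh runs the copy over ssh
rsync -avz -e ssh /directory/to/backup/ backupuser@archivebox:/backups/server/

Run that nightly and only the data that changed since the last run actually crosses the wire.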
 
  

