Quote:
Originally posted by imagirlgeek
Hi Tink,
Thanks ... I'll give some background.
I am at a small company that hosts maybe 6 or 8 client websites ... the log files get very large over time. I have been instructed to do nightly backups of our server, which I place on a secondary backup drive - but there's only room for about 10 .gz files before the disk space is completely used up.
|
Are you using logrotate to break the logs down
into chunks of a more manageable size?
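On most distros logrotate is already installed and a
few lines in /etc/logrotate.d/ will do this for you.
Purely to illustrate the idea (the log path and the
100 MB threshold below are made-up examples), size-based
rotation boils down to something like this:
Code:
#!/usr/bin/env python3
"""Illustration only: the kind of size-based rotation logrotate automates."""
import gzip
import shutil
from datetime import date
from pathlib import Path

LOG = Path("/var/log/httpd/access_log")   # example path
LIMIT = 100 * 1024 * 1024                 # rotate once the log passes ~100 MB

def rotate() -> None:
    if not LOG.exists() or LOG.stat().st_size < LIMIT:
        return
    rotated = LOG.with_name(f"{LOG.name}.{date.today():%Y%m%d}.gz")
    # Compress the current log into a dated .gz, then truncate the
    # original so the web server keeps writing to the same open file
    # (the same trick as logrotate's copytruncate option).
    with LOG.open("rb") as src, gzip.open(rotated, "wb") as dst:
        shutil.copyfileobj(src, dst)
    LOG.write_bytes(b"")

if __name__ == "__main__":
    rotate()
|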
Quote:
We have a machine here in the office on which I can store archived backups, but I learned pretty quickly that FTPing a 2 GB file takes several hours!
|
What type of server hardware are you writing to,
and what kind of connection are we talking about? On a
100 Mbit full-duplex link I'm getting a throughput of
around 9.7 MB/s ... that works out to about three and
a half minutes for 2 GB.
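For reference, the arithmetic behind that estimate
(the size and throughput figures are just the ones
quoted above):
Code:
# Rough transfer-time estimate: 2 GB at ~9.7 MB/s sustained.
size_mb = 2 * 1024          # 2 GB expressed in MB
throughput = 9.7            # MB/s over 100 Mbit full duplex
seconds = size_mb / throughput
print(f"{seconds:.0f} s, about {seconds / 60:.1f} minutes")  # ~211 s, ~3.5 min
|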
Quote:
I thought that if I could do a weekly full backup, and then incremental backups for the four other days, that might be more efficient. I'd be able to download the backups more quickly to the "archive" machine and I'd be able to store more on the secondary server.
I hope that makes sense. Thanks!
|
Does make sense. The question is how quickly you
could recover from a disaster with that scheme.
Have you got a clone of the base installation
somewhere on a spare HDD or something?
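If you do go with a weekly full plus incrementals,
GNU tar's --listed-incremental option does the
bookkeeping: when the snapshot file is missing it
writes a full (level-0) archive, otherwise only what
changed since the last run. A rough sketch -- the
paths and the "full dump on Sunday" policy are just
placeholders for whatever fits your rotation:
Code:
#!/usr/bin/env python3
"""Sketch only: weekly full backup plus daily incrementals via GNU tar."""
import datetime
import subprocess
from pathlib import Path

SOURCE = "/var/www"              # what to back up (example path)
DEST = Path("/backup")           # the secondary backup drive (example path)
SNAPSHOT = DEST / "backup.snar"  # GNU tar's incremental snapshot file

def run_backup() -> None:
    today = datetime.date.today()
    if today.weekday() == 6:             # Sunday: start a fresh full dump
        SNAPSHOT.unlink(missing_ok=True) # no snapshot -> tar writes level 0
        label = "full"
    else:                                # other days: incremental vs. snapshot
        label = "incr"
    archive = DEST / f"{today:%Y%m%d}-{label}.tar.gz"
    subprocess.run(
        ["tar", "--create", "--gzip",
         f"--listed-incremental={SNAPSHOT}",
         f"--file={archive}", SOURCE],
        check=True,
    )

if __name__ == "__main__":
    run_backup()
|
Just bear in mind that a restore means unpacking the
last full backup and then every incremental after it,
in order, so recovery takes longest at the end of the
week.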
Cheers,
Tink