LinuxQuestions.org
Old 01-05-2008, 11:21 AM   #1
legacyprog
Member
 
Registered: Feb 2004
Location: New Jersey, USA
Distribution: Ubuntu 11.10
Posts: 52

Rep: Reputation: 15
Ubuntu backup question, best practice?


I recently switched from SUSE 10.1 to Kubuntu 7.10. The move went pretty well, but I had used a scheduled YaST backup under SUSE and I don't see an equivalent in Ubuntu. I researched here on LQ and found this command, which I assume I could schedule with cron: "tar -cvlf slash.tar.gz /". I also see there are a number of high- and low-level backup apps.

So, on something as important as backup I'd like to know what you feel the "best practice" is. I don't want to find out I chose the wrong tool for the job months from now when it's too late.

To be specific, I'm talking about providing the ability to recover the *Linux system* from a hard disk crash or other software problem using my last scheduled backup image. I have multiple computers but no spare hard drives to take a direct disk-to-disk copy every night. What I was doing under SUSE was copying its backup over to another PC's drive (it was about 400 MB), which meant I had enough space to keep a rolling set of four nights' worth of backups.

My non-system personal data is a separate issue; I take care of all that across all my PCs using other schemes. For system backups, however, I know it's important to capture the "state" properly.

Of course I don't really know if YaST backup was the right choice, but at least this time I chose to ask LQ folks first.

My skill level is high newb or low intermediate I guess. Thanks in advance for any advice or links to advice I may have missed in my research!
 
Old 01-05-2008, 02:38 PM   #2
jailbait
Guru
 
Registered: Feb 2003
Location: Blue Ridge Mountain
Distribution: Debian Wheezy, Debian Jessie
Posts: 7,560

Rep: Reputation: 182
Packing your entire system into a tarball requires a lot of processing. You can greatly reduce the amount of time you spend if you do incremental backups, i.e. only back up the things that have changed since the last backup.

"What I was doing under Suse was copying its backup over to another PCs drive (it was about 400 meg) which meant I had enough space to even keep a rolling list of 4 night's worth of backups."

If you have plenty of backup space available, don't bother compressing into a tarball; compression takes more time than the copying itself. If you use cp instead of tar, the backup will run much faster.

Most restores are only for a few files. Restoring a few files from a normal file tree backup is much easier than restoring part of a tarball.

So I recommend that you back up using cp with incremental (update-only) copies. Something along the lines of:

cp -purl / /backup/tree

See:

man cp
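A minimal sketch of that incremental cp approach, wrapped in a script you could run from cron (all paths here are examples, not prescriptions):

```shell
#!/bin/sh
# Sketch of an incremental backup using cp, per the advice above.
#   -p  preserve mode, ownership, and timestamps
#   -u  copy only files newer than the existing backup copy
#   -r  recurse into directories
# SRC and DEST are example paths; DEST should sit on a drive with
# enough free space for the full tree.
SRC=/etc
DEST=/backup/tree
mkdir -p "$DEST"
cp -pur "$SRC" "$DEST"
```

On the second and later runs, -u makes cp skip files that haven't changed, which is what makes the nightly run fast.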

-----------------------
Steve Stites

Last edited by jailbait; 01-05-2008 at 03:03 PM.
 
Old 01-05-2008, 02:42 PM   #3
hob
Senior Member
 
Registered: Mar 2004
Location: Wales, UK
Distribution: Debian, Ubuntu
Posts: 1,075

Rep: Reputation: 45
The short answer is that Linux system backup is so easy (just copy the files!) that many, many products exist, aimed at different needs, and there isn't a single best practice required.

This has ended up as a long post, so key points:

* Your Linux system isn't a single black box that has to be treated as a monolithic unit, so you can do whatever works best for your situation.

* Most of the files on your system are actually disposable.

The vast majority of the files on the system are created from packages and never modified by the user. Package files are specialized Zip-like archives - you can open them up and copy individual files out if you want, as well as reinstalling with the package management tools. Ubuntu caches downloaded packages in /var/cache/apt/archives/ for convenience.

The equivalent of what Windows backup tools call "system state" essentially consists of the files in /etc plus a list of installed packages. My laptop has no hand-modified files in /etc, and any packages that aren't installed by default are listed in a customization script, so on that system I now only back up my home directory.

A server does need backups of shared files and databases, using whatever scheme fits your disaster recovery model (which may be very simple!). You should also back up /var/log/, unless your servers log to a central host. On a server, /etc/ may also be controlled by an automated network management tool, in which case the files in that directory become disposable too.
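A sketch of capturing that "system state" (the /etc tree plus a package list). The backup path is an example, and the dpkg commands are Debian/Ubuntu-specific:

```shell
#!/bin/sh
# Capture "system state": a copy of /etc plus the installed-package
# list. BACKUP is an example path; dpkg exists on Debian/Ubuntu only.
BACKUP=/backup/state
mkdir -p "$BACKUP"
cp -pur /etc "$BACKUP"
dpkg --get-selections > "$BACKUP/package-list.txt"
# To rebuild later, reinstall from the saved list, e.g.:
#   dpkg --set-selections < package-list.txt
#   apt-get dselect-upgrade
```

With that list saved, everything the packages installed really is disposable, since the package manager can put it all back.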
 
Old 01-05-2008, 03:43 PM   #4
legacyprog
Member
 
Registered: Feb 2004
Location: New Jersey, USA
Distribution: Ubuntu 11.10
Posts: 52

Original Poster
Rep: Reputation: 15
I really appreciate the education, hob and jailbait. The two main things I'm taking away from your comments are as follows (please correct me if I went off track):
#1 - In a restore situation I would recover my system first, as if from an empty hard drive, and then restore the *specific* portions of the Linux filesystem that I backed up. I did *not* realize that most of the files in Linux were essentially disposable; that's a big help to know. And yes, I have plenty of disk space, and based on the few times I had to dig into SUSE YaST tar.gz files, that was a real pain, so I will now switch to cp. As suggested I reviewed man cp: -p preserves ownership and timestamps, -u makes the copy incremental (only newer files), and -r recurses through the tree.
#2 - If I understood what I read, and if it is OK to do a little more than the minimum required since I have the space, then it sounds like I could copy the following trees with individual cp commands: /var, /etc, /home. In addition to whatever system files you had in mind, I know that I particularly care about /etc/samba and /var/www, plus /home, which (of course) your advice includes.
Thanks again for the time it took to reply, I hope you both have a great weekend!
 
Old 01-05-2008, 04:13 PM   #5
legacyprog
Member
 
Registered: Feb 2004
Location: New Jersey, USA
Distribution: Ubuntu 11.10
Posts: 52

Original Poster
Rep: Reputation: 15
Follow-up for the benefit of anyone who stumbles on this thread later: it seemed I had to use cp -purL (note the upper-case L, which dereferences symbolic links), but once I did that it went great. I copied my /etc, /var and /home trees with three cp commands in a bash script I set up in my root crontab to run early in the morning while I'm asleep (I tested it now, though). It copied 1.5 GB (about 15 thousand files) in about a minute, and since they're not tarred I can easily browse the entire backup structure.
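For anyone who wants the same setup, roughly the script I mean (the destination path and script name are just my choices):

```shell
#!/bin/sh
# Nightly backup of the three trees via cp, as described above.
#   -p preserve ownership/timestamps, -u incremental, -r recursive,
#   -L dereference symlinks so the backup holds real files.
# DEST is an example path on a drive with enough space.
DEST=/backup
mkdir -p "$DEST"
for tree in /etc /var /home; do
    cp -purL "$tree" "$DEST"
done
```

and a root crontab entry along the lines of `30 4 * * * /root/nightly-backup.sh` to run it at 4:30 AM.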

Thanks again for the help! LQ rules.
 
Old 01-05-2008, 04:39 PM   #6
hob
Senior Member
 
Registered: Mar 2004
Location: Wales, UK
Distribution: Debian, Ubuntu
Posts: 1,075

Rep: Reputation: 45
Quote:
Originally Posted by legacyprog View Post
Follow-up for the benefit of anyone who stumbles on this thread later: it seemed I had to use cp -purL (notice the upper case L) but once I did that it went great...I copied my /etc, /var and /home trees with three cp commands in a bash script I setup in my root crontab for early in the AM when I'm asleep (tested it now though). It copied 1.5 gig (about 15 thousand files) in about 1 minute and since they're not tar'ed I can easily browse the entire backup structure.
It doesn't matter too much here, but it's often best to select directories from within /var rather than taking the whole thing. If you swap /var for /var/log and the Debian/Ubuntu-specific /var/backups, you will probably reduce the space used.

The /var directory contains files generated or modified by the normal working of the system, so as well as logs (/var/log), it contains cached copies of downloaded packages (/var/cache/apt/archives), and the files for any database services. The last two can be pretty large, and databases are the one type of thing that you can't recover by just putting the files back, so grabbing those in a file backup can use up a comparatively large amount of space for little gain.
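A pruned sketch along those lines, copying only the /var subdirectories worth keeping and skipping the package cache and database files (paths are examples; /var/backups is Debian/Ubuntu-specific):

```shell
#!/bin/sh
# Back up selected /var subdirectories instead of all of /var,
# per the advice above. DEST is an example path.
DEST=/backup
mkdir -p "$DEST/var"
for d in /var/log /var/backups; do
    cp -purL "$d" "$DEST/var"
done
```

For databases themselves, the usual approach is to dump them with the database's own tool (e.g. a mysqldump output file) and back up the dump instead of the live files.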

Last edited by hob; 01-05-2008 at 04:55 PM.
 
Old 01-05-2008, 07:09 PM   #7
legacyprog
Member
 
Registered: Feb 2004
Location: New Jersey, USA
Distribution: Ubuntu 11.10
Posts: 52

Original Poster
Rep: Reputation: 15
Thanks hob, I will work on pruning my cp of the /var tree down a bit. Also, using Konqueror's File Size view, I just discovered that the reason my backup of /home was almost a gig (!) was a couple of hidden dot-folders in there, each about 400 MB, that were old November 2007 backups created by YaST; my old /home must have come over from SUSE. I deleted those two old backups, which will of course drastically reduce the size of tonight's backup.

Once again, thanks!

Last edited by legacyprog; 01-05-2008 at 07:10 PM.
 
  


Tags
advice, backup, ubuntu710


