Old 01-22-2010, 12:30 PM   #1
uncle-c
Member
 
Registered: Oct 2006
Location: The Ether
Distribution: Ubuntu 16.04.7 LTS, Kali, MX Linux with i3WM
Posts: 299

Rep: Reputation: 30
Backing up a "live system" using tar (followed by regular incremental backups)?


Hello all.
I was just wondering if this was possible? Obviously one would have to exclude certain folders/directories, but would the backup be possible while the system is up and running in its native "live" state? Which directories could be excluded? Does swap need to be turned off?
Ideally I would like to make incremental backups on a separate partition of the same hard drive. I will endeavour to back up the MBR/partition table using dd.
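For reference, this is the sort of dd pair I have in mind (the device name is illustrative):

Code:
# Save the first 512 bytes of the disk: 446 bytes of boot code,
# the 64-byte partition table, and the 2-byte boot signature.
dd if=/dev/sda of=/backup/mbr.bin bs=512 count=1

# To restore only the boot code later, without touching a
# possibly changed partition table:
dd if=/backup/mbr.bin of=/dev/sda bs=446 count=1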

Regards
C
 
Old 01-22-2010, 12:37 PM   #2
GrapefruiTgirl
LQ Guru
 
Registered: Dec 2006
Location: underground
Distribution: Slackware64
Posts: 7,594

Rep: Reputation: 556
It is certainly possible, yes; this is roughly what I do:

--backup from a live, running state, using a script started by cron
--ignore /proc and /sys and /dev as these can cause problems.
--I don't recurse into any media that might be mounted (DVD, CD, or other partitions/drives besides / and /home)
--swap can stay enabled; it makes no difference that I'm aware of, and I don't see why it should.

I don't back up incrementally, and presently I don't bother tarring either; instead, I do one backup per day, keeping 7 days of backups which get rotated on each run (oldest gets wiped, newest gets created; see the sketch below).

Also, I don't do anything as far as the MBR or partition table goes, so if there's anything to be done there, someone else will have to comment.
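Roughly, the nightly job looks like this (paths and the 7-day depth are an illustrative sketch, not my exact script; the copy uses rsync, which I get to below):

Code:
#!/bin/sh
# Daily rotating backup: keep 7 copies, wipe the oldest each run.
# Destination is assumed to be a separate partition mounted at /backup.
BACKUP=/backup

# Rotate: day.7 (oldest) is removed, day.1..day.6 shift up by one.
rm -rf "$BACKUP/day.7"
i=6
while [ $i -ge 1 ]; do
    [ -d "$BACKUP/day.$i" ] && mv "$BACKUP/day.$i" "$BACKUP/day.$((i+1))"
    i=$((i-1))
done

# Copy / and /home; -x (--one-file-system) keeps rsync from crossing
# into /proc, /sys, other partitions, or mounted media, and the
# excludes just make the intent explicit.
mkdir -p "$BACKUP/day.1"
rsync -aHx --exclude=/proc --exclude=/sys --exclude=/dev \
      / "$BACKUP/day.1/root/"
rsync -aHx /home/ "$BACKUP/day.1/home/"

A crontab line like "0 3 * * * /usr/local/sbin/rotate-backup.sh" would fire it nightly (the script path is made up).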

Sasha
 
Old 01-22-2010, 02:22 PM   #3
uncle-c
Member
 
Registered: Oct 2006
Location: The Ether
Distribution: Ubuntu 16.04.7 LTS, Kali, MX Linux with i3WM
Posts: 299

Original Poster
Rep: Reputation: 30
Thanks GGirl,
A quick question: if you do not tar, do you rsync or just cp using the -p argument? Or perhaps neither?!

C

Last edited by uncle-c; 01-22-2010 at 02:23 PM.
 
Old 01-22-2010, 02:46 PM   #4
GrapefruiTgirl
LQ Guru
 
Registered: Dec 2006
Location: underground
Distribution: Slackware64
Posts: 7,594

Rep: Reputation: 556
I used to use `cp` until I took the relatively small amount of time to learn how to use `rsync`, and now I use rsync. It's perfect for the job IMHO and requires a little less command-line-argument-hackery than `cp` did.
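To illustrate the difference (the destination path is made up):

Code:
# cp needs -a (archive mode) to preserve permissions, ownership,
# symlinks, and timestamps, and it recopies everything every run:
cp -a /home /backup/

# rsync preserves the same metadata with -a, but only transfers
# what changed since the last run; --delete mirrors removals too:
rsync -a --delete /home/ /backup/home/

The trailing slash on /home/ tells rsync to copy the directory's contents rather than creating a nested /backup/home/home.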
 
Old 01-22-2010, 04:35 PM   #5
syg00
LQ Veteran
 
Registered: Aug 2003
Location: Australia
Distribution: Lots ...
Posts: 21,125

Rep: Reputation: 4120
A backup of a running system can never be considered valid, unless you can quiesce (and flush) all sub-systems that issue (write) I/O - for the entire duration of the backup.
Incrementals require valid timestamps on the disk copy. No good if the updated data still resides (only) in memory.

Sometimes you mightn't care - most of the time you won't even know. Bad, very bad.
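Even a minimal attempt at quiescing means stopping the writers and flushing before tar starts; roughly (the service script is hypothetical, and this still proves nothing about whatever you didn't stop):

Code:
# Stop anything that writes, flush dirty pages, back up, restart.
/etc/init.d/mysql stop       # hypothetical: halt your write-heavy services
sync                         # flush filesystem buffers to disk
tar -czf /backup/full.tar.gz --one-file-system /
/etc/init.d/mysql start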
 
Old 01-22-2010, 06:02 PM   #6
choogendyk
Senior Member
 
Registered: Aug 2007
Location: Massachusetts, USA
Distribution: Solaris 9 & 10, Mac OS X, Ubuntu Server
Posts: 1,197

Rep: Reputation: 105
Technically, and correctly, you should take a system down to single user to get a reliable backup. In reality, however, many systems have to be up and running 24/7, and backups simply take too long to keep a system down. So sysadmins have lots of practical ways of improving their odds.

Snapshots vary from OS to OS. I use fssnap in Solaris 9 and ZFS snapshots in Solaris 10. There are options for Linux as well. A snapshot reduces the window of time in which you are vulnerable to a file system changing underneath you while you are running backups. However, it is still vulnerable to things being in memory and not flushed, to databases being in flux, etc.

So, identify whatever systems you have that might be an issue -- say, for example, MySQL. Typically MySQL will have its own tools for backup. However, you can get away with having the MyISAM structure backed up by another tool, like tar or ufsdump, if you flush the database, lock the tables, pull your snapshot, and then unlock the tables. Then you can back up the snapshot, and the database was only locked for maybe a few seconds. Of course, that's only one example. You have to examine your system and see what the issues are.
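On Linux, that flush/lock/snapshot/unlock dance might look something like this with LVM (volume and path names are made up; note that FLUSH TABLES WITH READ LOCK only holds while the client session stays open, hence the shell escape from inside the session):

Code:
# Hold the lock and take the snapshot within one mysql session,
# because the read lock is released when the client disconnects.
mysql -u root -p <<'EOF'
FLUSH TABLES WITH READ LOCK;
\! lvcreate --snapshot --size 1G --name dbsnap /dev/vg0/data
UNLOCK TABLES;
EOF

# The tables were locked only for the second or two lvcreate took.
# Back up the frozen snapshot at leisure, then drop it.
mount -o ro /dev/vg0/dbsnap /mnt/snap
tar -czf /backup/mysql.tar.gz -C /mnt/snap .
umount /mnt/snap
lvremove -f /dev/vg0/dbsnap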

In more than 10 years, I've only been bitten once or twice by not having a viable backup in a disaster situation. I had a boot drive die, and the recovered system was not bootable (something had come out inconsistent because of backing up a live system, I presume). I booted off cdrom, did an upgrade install over the recovered system, and that patched things up while still keeping my preferences and configurations. I might also have tried a slightly older full backup plus incrementals to get up to date, but I decided on the spur of the moment that the upgrade would get me running more quickly. Knowing a variety of options helps in a crisis.
 
Old 01-22-2010, 07:13 PM   #7
jschiwal
LQ Guru
 
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Rep: Reputation: 682
Look in Section 5.2 of the tar info manual, which explains the -g option used to produce either incremental or differential backups. The difference is whether you keep reusing the .snar file from the full backup or work on a copy of the full backup's .snar file.

Files created or modified after tar was started will not be backed up.
One strategy could be to create a full backup (level 0 backup) manually in single user mode, and incremental or differential backups in a cron job.
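For example (paths illustrative):

Code:
# Level 0 (full) backup; tar records each file's state in the .snar.
tar -czf /backup/full.tar.gz -g /backup/full.snar --one-file-system /

# Incremental: reuse the same .snar, so each run saves only what
# changed since the previous run (tar updates the .snar in place).
tar -czf /backup/inc.tar.gz -g /backup/full.snar --one-file-system /

# Differential: run against a scratch copy of the level 0 .snar,
# so each run saves everything changed since the full backup.
cp /backup/full.snar /tmp/work.snar
tar -czf /backup/diff.tar.gz -g /tmp/work.snar --one-file-system /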

I would also run "fdisk -l" and "fdisk -lu" and print out the results. The latter reports in 512-byte sectors, which eliminates rounding errors. This would allow you to use losetup with an offset to attach a loop device at the starting point of a partition. In the event that the MBR was damaged and not repairable, but you could still access other parts of the disk, you could mount the partition and recover the files.
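For example, if "fdisk -lu" showed a partition starting at sector 208845 (all numbers and paths here are illustrative):

Code:
# Byte offset = start sector x 512-byte sector size.
losetup -o $((208845 * 512)) /dev/loop0 /dev/sda
mount /dev/loop0 /mnt/rescue   # the partition's filesystem, MBR bypassed

# When finished:
umount /mnt/rescue
losetup -d /dev/loop0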
 
  

