LinuxQuestions.org
Old 10-15-2013, 08:49 PM   #1
OtagoHarbour
Member
 
Registered: Oct 2011
Posts: 332

Rep: Reputation: 3
Best way to save an image of a Linux server system


I have two servers (one running Ubuntu 11.04 and the other running Debian 7.2) that I have set up as a DMZ network to host a web site. I have put a considerable amount of time into tweaking the servers so that they talk to each other properly and have the right security settings. I am hoping to save images of the servers in case (or, more likely, for when) the computer (hard drive or motherboard) goes bad. I would like to use freeware if possible.

One way I saw is to use SystemImager, which seems good except that it appears I need a separate "image server" to hold the image of each server. Is there a way to do this without having to invest in another server just to back up the images?

Thanks,
OH.
 
Old 10-15-2013, 09:29 PM   #2
jefro
Moderator
 
Registered: Mar 2008
Posts: 21,980

Rep: Reputation: 3624
There are 20 or so good ways.

Some ideas include the package-management clone apps in Ubuntu and maybe Debian.

Any of the clone tools: GParted, partimage, Clonezilla, Redo Backup, Mondo Rescue, or even g4u.

Some folks live with tar, cpio, or dd piped to gzip.

You'd need some storage someplace. You might be able to set up an FTP or NFS share on each server that lets you back up one to the other. Just be sure to keep the backups on a separate partition, so that each backup doesn't end up inside the next clone.
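For illustration, a minimal sketch of the tar and dd approaches; the mount point /mnt/backup and the disk /dev/sda are hypothetical, and a raw dd image of a mounted disk is best taken from a live CD so the filesystem isn't changing underneath it:

Code:
# Tarball of the root filesystem, staying on one filesystem,
# compressed with gzip (run as root):
tar --one-file-system -czpf /mnt/backup/rootfs.tar.gz /

# Or a raw image of the whole disk, dd piped to gzip:
dd if=/dev/sda bs=1M | gzip > /mnt/backup/sda.img.gz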
 
1 member found this post helpful.
Old 10-16-2013, 08:48 AM   #3
TB0ne
LQ Guru
 
Registered: Jul 2003
Location: Birmingham, Alabama
Distribution: SuSE, RedHat, Slack, CentOS
Posts: 26,634

Rep: Reputation: 7965
Quote:
Originally Posted by OtagoHarbour View Post
I have two servers (one running Ubuntu 11.04 and the other running Debian 7.2) that I have set up as a DMZ network to host a web site. I have put a considerable amount of time into tweaking the servers so that they talk to each other properly and have the right security settings. I am hoping to save images of the servers in case (or, more likely, for when) the computer (hard drive or motherboard) goes bad. I would like to use freeware if possible.

One way I saw is to use SystemImager, which seems good except that it appears I need a separate "image server" to hold the image of each server. Is there a way to do this without having to invest in another server just to back up the images?
Well, the point of doing backups is to store your data in another location, so that if the server dies, you can still access the data. This is no different, is it? It's kind of pointless to keep the restore image on a server that you can't access because it died. You don't NEED a separate image server for SystemImager, any more than you do for mondoarchive...unless you want to do network booting. If your recovery plan is to burn the ISO onto DVD(s), then you're all set.

Since the images aren't going to be huge, and disk space is so cheap, consider putting the Ubuntu image onto the Debian server, and vice versa. NFS is a great way to do this. If you take regular image snapshots, just overwrite the image each time.
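A hedged sketch of that cross-mount, with hypothetical paths and addresses; it assumes the NFS server side (nfs-kernel-server on Debian/Ubuntu) is installed:

Code:
# On the Debian box, /etc/exports -- export a directory to the Ubuntu box:
/srv/backups  192.168.1.10(rw,sync,no_subtree_check)
# Reload the export table:
exportfs -ra

# On the Ubuntu box, mount the export and write the image there:
mount -t nfs 192.168.1.20:/srv/backups /mnt/backups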
 
1 member found this post helpful.
Old 10-27-2013, 02:25 PM   #4
OtagoHarbour
Member
 
Registered: Oct 2011
Posts: 332

Original Poster
Rep: Reputation: 3
Quote:
Originally Posted by jefro View Post
There are 20 or so good ways.

Some ideas include the package-management clone apps in Ubuntu and maybe Debian.

Any of the clone tools: GParted, partimage, Clonezilla, Redo Backup, Mondo Rescue, or even g4u.

Some folks live with tar, cpio, or dd piped to gzip.

You'd need some storage someplace. You might be able to set up an FTP or NFS share on each server that lets you back up one to the other. Just be sure to keep the backups on a separate partition, so that each backup doesn't end up inside the next clone.
Thank you for your reply. I'm leaning towards just using tar and gzip, since I already have experience with them from unpacking downloaded programs. It seems like the simplest approach, and therefore the one least likely to have me cause problems by doing something wrong. My / partition is only 6% full, so I could maybe just back everything up into a tarball. One thing I did read about tar is that it doesn't back up special files. Would cpio, dd or GParted get around this problem?
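For what it's worth, a minimal sketch of such a tarball, with a hypothetical target path. GNU tar run as root does record device nodes and FIFOs (Unix sockets are skipped, but daemons recreate those on startup), and the kernel repopulates /proc, /sys and /dev at boot, so those can simply be excluded:

Code:
# Full-system tarball, excluding pseudo-filesystems and the backup
# destination itself (run as root so ownership is preserved):
tar -cvpjf /mnt/backup/fullsystem.tar.bz2 \
    --exclude=/proc --exclude=/sys --exclude=/dev \
    --exclude=/tmp --exclude=/mnt --exclude=/lost+found /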

For the most part I am mainly interested in saving my firewall, SSH and Apache settings. I also have the servers talking to each other through a DMZ.
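If it's mainly the firewall, SSH and Apache settings, a much smaller sketch covers those; the paths are the Debian/Ubuntu defaults, and iptables-save is the stock tool for dumping the live firewall rules:

Code:
# Dump the live firewall rules in a form iptables-restore understands:
iptables-save > /root/iptables.rules

# Archive just the relevant configuration:
tar -czf /mnt/backup/configs.tar.gz \
    /etc/ssh /etc/apache2 /root/iptables.rules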

Thanks,
OH.

Last edited by OtagoHarbour; 10-27-2013 at 02:26 PM.
 
Old 10-27-2013, 02:39 PM   #5
OtagoHarbour
Member
 
Registered: Oct 2011
Posts: 332

Original Poster
Rep: Reputation: 3
Quote:
Originally Posted by TB0ne View Post
Well, the point of doing backups is to store your data in another location, so that if the server dies, you can still access the data. This is no different, is it? It's kind of pointless to keep the restore image on a server that you can't access because it died. You don't NEED a separate image server for SystemImager, any more than you do for mondoarchive...unless you want to do network booting. If your recovery plan is to burn the ISO onto DVD(s), then you're all set.

Since the images aren't going to be huge, and disk space is so cheap, consider putting the Ubuntu image onto the Debian server, and vice versa. NFS is a great way to do this. If you take regular image snapshots, just overwrite the image each time.
I guess I didn't explain myself very well. I was thinking more in terms of whether I needed to back the image up on a server as opposed to an external HD. Backing the servers up onto each other seems like a great idea, but would an external HD not work as well?

Thanks,
OH.
 
Old 10-27-2013, 02:56 PM   #6
Robhogg
Member
 
Registered: Sep 2004
Location: Old York, North Yorks.
Distribution: Debian 7 (mainly)
Posts: 653

Rep: Reputation: 97
Clonezilla is worth looking at. There is a live CD version, and it can create images of partitions or whole disks on a removable drive.
 
1 member found this post helpful.
Old 10-28-2013, 09:33 AM   #7
TB0ne
LQ Guru
 
Registered: Jul 2003
Location: Birmingham, Alabama
Distribution: SuSE, RedHat, Slack, CentOS
Posts: 26,634

Rep: Reputation: 7965
Quote:
Originally Posted by OtagoHarbour View Post
I guess I didn't explain myself very well. I was thinking more in terms of whether I needed to back the image up on a server as opposed to an external HD. Backing the servers up onto each other seems like a great idea, but would an external HD not work as well?
Why would it matter? Your server sees a storage device as just that: a storage device. It doesn't matter if you have a thumb drive, an external hard drive, a hard-wired RAID cabinet, or a SAN. If it's mounted with a valid file system (in RW mode), you can write to it. Eject it as is suitable for whatever type of medium it is, and take it away.

I will say that it is far faster to create the image on a local disk. Read/write times on a SATA 6 Gb/s drive will be far faster than on a USB external hard drive, no matter WHAT is writing to it. You can always cron the job too, and have the image created in the wee hours of the morning, copied to external media (and VERIFIED/checksummed), then remove the large image from the local disk.
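A sketch of that nightly flow; every path and filename here is hypothetical, and the checksum verification after the copy is what catches a bad transfer:

Code:
# /etc/cron.d entry: run the backup script at 02:30 every night
30 2 * * * root /usr/local/sbin/nightly-image.sh

# nightly-image.sh:
#!/bin/sh
set -e
cd /var/backups
# Build the image on the fast local disk first, excluding itself:
tar -czpf system.tar.gz --exclude=/proc --exclude=/sys \
    --exclude=/dev --exclude=/var/backups /
# Checksum, copy to the external drive, and verify the copy:
sha256sum system.tar.gz > system.tar.gz.sha256
cp system.tar.gz system.tar.gz.sha256 /mnt/external/
( cd /mnt/external && sha256sum -c system.tar.gz.sha256 )
# Remove the large local copy once the external one checks out:
rm system.tar.gz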
 
1 member found this post helpful.
Old 10-28-2013, 08:36 PM   #8
OtagoHarbour
Member
 
Registered: Oct 2011
Posts: 332

Original Poster
Rep: Reputation: 3
Quote:
Originally Posted by Robhogg View Post
Clonezilla is worth looking at. There is a live CD version, and it can create images of partitions or whole disks on a removable drive.
Thank you for your reply. I have backed up the system using the method outlined here. Do you think Clonezilla has advantages over that?

Thanks,
OH.
 
Old 10-28-2013, 08:46 PM   #9
OtagoHarbour
Member
 
Registered: Oct 2011
Posts: 332

Original Poster
Rep: Reputation: 3
Quote:
Originally Posted by TB0ne View Post
Why would it matter? Your server sees a storage device as just that: a storage device. It doesn't matter if you have a thumb drive, an external hard drive, a hard-wired RAID cabinet, or a SAN. If it's mounted with a valid file system (in RW mode), you can write to it. Eject it as is suitable for whatever type of medium it is, and take it away.

I will say that it is far faster to create the image on a local disk. Read/write times on a SATA 6 Gb/s drive will be far faster than on a USB external hard drive, no matter WHAT is writing to it. You can always cron the job too, and have the image created in the wee hours of the morning, copied to external media (and VERIFIED/checksummed), then remove the large image from the local disk.
Thank you for your reply. I went ahead and saved an image of the system using the method outlined here. In order to copy the file onto my external HD, I needed to split the file using

Code:
# Split into 2 GB pieces (default output names xaa, xab, ...);
# for binary data, -b 2G would be the more usual flag than -C 2G.
split -C 2G whatever.tar.bz2
I could use a cron job (or an inotify trigger) to back the files up nightly, but I was not thinking of backing the whole system up on a regular basis. It's more to save the firewall, SSH, and /etc configurations in case the disk goes bad. I back up new code as I write it.

Thanks,
OH.
 
Old 10-29-2013, 02:37 AM   #10
ericson007
Member
 
Registered: Sep 2004
Location: Japan
Distribution: CentOS 7.1
Posts: 735

Rep: Reputation: 154
Well, depending on how critical the system is, your space constraints, etc., I usually opt for saving the web root folder, the config scripts, and the database files from /var/lib.

Saves time, space and bandwidth.

If crap hits the fan, it's just a quick minimal install with select packages from a Kickstart file, copying the files back, and an SELinux relabel.
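A hedged sketch of that selective backup, with typical Debian/Ubuntu paths; stop or lock the database first so the raw files under /var/lib are consistent:

Code:
# Web root, config trees and MySQL data files in one dated archive:
tar -czf /mnt/backup/site-$(date +%F).tar.gz \
    /var/www /etc/apache2 /etc/ssh /var/lib/mysql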
 
1 member found this post helpful.
Old 10-29-2013, 04:23 PM   #11
Robhogg
Member
 
Registered: Sep 2004
Location: Old York, North Yorks.
Distribution: Debian 7 (mainly)
Posts: 653

Rep: Reputation: 97
Quote:
Originally Posted by OtagoHarbour View Post
Do you think Clonezilla has advantages over that?
That depends on exactly what you want. It's designed to create disk (or partition) images, which means it's easy to carry out a "bare metal" restore. You can boot the server from the Clonezilla live CD, and restore to a binary-identical state in a single step (including boot sector, partition table and filesystems). On the other hand, if all you need is to restore a single file, that won't be so easy.
 
1 member found this post helpful.
Old 10-31-2013, 01:35 PM   #12
OtagoHarbour
Member
 
Registered: Oct 2011
Posts: 332

Original Poster
Rep: Reputation: 3
Quote:
Originally Posted by Robhogg View Post
That depends on exactly what you want. It's designed to create disk (or partition) images, which means it's easy to carry out a "bare metal" restore. You can boot the server from the Clonezilla live CD, and restore to a binary-identical state in a single step (including boot sector, partition table and filesystems). On the other hand, if all you need is to restore a single file, that won't be so easy.
I burned a Clonezilla Live ISO to CD. However, it looks like I need to reboot my system from that disc. A tarball seems less risky, since every time I boot my server up it is quite a production to get the network connectivity set up again. (I use the server to host a web site.) Maybe I will try it the next time the power goes down and I need to boot the system anyway.

Thanks,
OH.
 
Old 10-31-2013, 01:39 PM   #13
OtagoHarbour
Member
 
Registered: Oct 2011
Posts: 332

Original Poster
Rep: Reputation: 3
Quote:
Originally Posted by ericson007 View Post
Well, depending on how critical the system is, your space constraints, etc., I usually opt for saving the web root folder, the config scripts, and the database files from /var/lib.

Saves time, space and bandwidth.

If crap hits the fan, it's just a quick minimal install with select packages from a Kickstart file, copying the files back, and an SELinux relabel.
Thanks. I've made a tarball of everything except a few directories with files I can re-download (like OpenCV, Eclipse, etc.). I did have to split the resulting file to fit it on the disk, but I understand that one can reassemble the .tar.bz2 file with cat.
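A quick sketch of that reassembly, assuming split's default output names; listing the archive afterwards is a cheap integrity check:

Code:
# Glob order (xaa, xab, ...) matches split order, so cat rebuilds it:
cat x?? > whatever.tar.bz2
# Verify the archive is readable end to end:
tar -tjf whatever.tar.bz2 > /dev/null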

Thanks,
OH.
 
Old 10-31-2013, 01:48 PM   #14
szboardstretcher
Senior Member
 
Registered: Aug 2006
Location: Detroit, MI
Distribution: GNU/Linux systemd
Posts: 4,278

Rep: Reputation: 1694
Many good suggestions in this thread.

When building web servers, I generally create a small cluster (web1, web2, web3) that just has Apache and the vhost configuration on it. So there isn't a reason to back them up, because restoring from a backup would take longer than provisioning a new server. In addition, I create a replicated NFS group (storage1, storage2) that all of the web servers connect to. With the model set up this way, you have failover storage space, and it is easy to set up an rsync backup of it.
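A minimal sketch of that rsync backup, with hypothetical hostnames and paths, pulled from the storage pair onto whatever box keeps the copies:

Code:
# Mirror the shared storage; -a preserves permissions and timestamps,
# --delete keeps the destination an exact mirror of the source:
rsync -a --delete storage1:/export/www/ /backups/www/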

Also, in front of the web nodes I have a load balancer that round-robins the requests.

Using a combination of Rundeck, Puppet, Chef or whatever you like, you can automate the setup and deployment of additional web and storage nodes to add to the pool.
 
1 member found this post helpful.
  

