Best way to save an image of a Linux server system
I have two servers (one running Ubuntu 11.04 and the other running Debian 7.2) that I have set up as a DMZ network to host a web site. I have put a considerable amount of time into tweaking the servers so that they talk to each other properly and have the right security settings. I am hoping to save images of the servers in case (or more likely for when) the computer (HD or MB) goes bad. I would like to use freeware if possible.
One way I saw is to use SystemImager which seems good except it appears I need a separate "image server" to save the image of each server. Is there a way to do this without having to invest in another server to back up the image of each server?
Some ideas include the package-management clone apps in Ubuntu and maybe Debian.
Any of the clone tools: GParted, partimage, Clonezilla, Redo Backup, Mondo Rescue (or whatever that one is called), or even G4U.
Some folks live with tar, cpio, or dd piped to gzip (a sketch follows below).
You'd need some storage someplace. You might be able to set up an FTP or NFS share on each server so you can back up one to the other. Consider using a separate partition for the backups so you don't end up including each backup inside the clone.
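A minimal sketch of the tar route, assuming an external disk (or the other server's share) is mounted at /mnt/backup; the path and exclude list are just examples:
Code:
# archive the root filesystem, skipping pseudo-filesystems and the backup mount itself
tar --numeric-owner --one-file-system -cpzf /mnt/backup/rootfs-$(date +%F).tar.gz \
    --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/mnt/backup /
The dd route (e.g. dd if=/dev/sda bs=4M | gzip > /mnt/backup/sda.img.gz) images the raw disk rather than files, so it is best run from a live CD with the disk unmounted.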
Well, the point of doing backups is to store your data in another location, so that if the server dies, you can still access the data. This is no different, is it? It's kind of pointless to keep the restore image on a server that you can't access because it died. You don't NEED a separate image server for SystemImager, any more than you do for mondoarchive... unless you want to do network booting. If your recovery plan is to burn the ISO onto DVD(s), then you're all set.
Since the images aren't going to be huge, and disk space is so cheap, consider putting the Ubuntu image onto the Debian server, and vice versa. NFS is a great way to do this. If you take regular image snapshots, just overwrite the image each time.
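A rough sketch of that NFS cross-mount, assuming nfs-kernel-server is installed on the Debian box; the hostnames and paths are placeholders:
Code:
# on the Debian server: export a backup directory via /etc/exports, then reload exports
/srv/backups    ubuntu-host(rw,sync,no_subtree_check)
# run: exportfs -ra

# on the Ubuntu server: mount it and write the image there
mount -t nfs debian-host:/srv/backups /mnt/backups
The same thing in the other direction covers the Debian image.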
Thank you for your reply. I'm leaning towards just using tar and gzip, since I already have experience with them from downloading programs. It seems like the simplest approach, and therefore the one least likely to have me causing problems by doing something wrong. My / partition is only 6% full, so I could maybe just back everything up into a tar ball. One thing I did read about tar is that it doesn't back up special files; would cpio, dd or GParted get around that problem?
I am mainly interested in saving my firewall, ssh and apache settings. I also have the servers talking to each other through a DMZ.
Thanks,
OH.
I guess I didn't explain myself very well. I was thinking more in terms of whether I needed to back the image up on a server as opposed to an external HD. Backing the servers up onto each other seems like a great idea, but would an external HD not work just as well?
Why would it matter? Your server sees a storage device as just that: a storage device. It doesn't matter if you have a thumb drive, an external hard drive, or a hard-wired RAID cabinet or SAN. If it's mounted with a valid file system (in RW mode), you can write to it. Eject it as is suitable for whatever type of medium it is, and take it away.
I will say that it is far faster to create the image on a local disk. Read/write times on a SATA 6 Gb/s drive will be far faster than on a USB external hard drive, no matter what is writing to it. You can always cron the job too, and have the image created in the wee hours of the morning, copied to external media (and verified with a checksum), then have the large image removed from the local disk.
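A rough sketch of that schedule, assuming the image is built under /var/backups and the external drive is mounted at /mnt/external (all names here are placeholders):
Code:
#!/bin/sh
# /usr/local/sbin/nightly-image.sh -- run nightly from cron, e.g. in /etc/cron.d:
#   30 2 * * * root /usr/local/sbin/nightly-image.sh
IMG=root-$(date +%F).tar.gz
cd /var/backups || exit 1
# build the image on the fast local disk, then checksum it
tar --one-file-system --exclude=/var/backups -czf "$IMG" /
sha256sum "$IMG" > "$IMG.sha256"
# copy to the external drive, verify the copy, and only then delete the local image
cp "$IMG" "$IMG.sha256" /mnt/external/
( cd /mnt/external && sha256sum -c "$IMG.sha256" ) && rm -f "$IMG" "$IMG.sha256"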
Thank you for your reply. I went ahead and saved an image of the system using the method outlined here. In order to copy the file onto my external HD, I needed to split the file using
Code:
# -b splits on bytes, which suits a binary archive; the pieces come out as xaa, xab, ...
split -b 2G whatever.tar.bz2
I could use cron (or inotify, to trigger on changes) to back the files up nightly, but I was not thinking of backing the whole system up on a regular basis. It's more to save the firewall, ssh, and /etc configurations in case the disk goes bad. I back up new code as I write it.
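For that narrower goal, a small config-only backup could be as simple as this; the file list is only an example and assumes Debian/Ubuntu-style paths:
Code:
# dump the current firewall rules, then archive them together with the main config trees
iptables-save > /root/iptables.rules
tar -czf /root/configs-$(date +%F).tar.gz \
    /etc/ssh /etc/apache2 /etc/network /root/iptables.rules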
Well, depending on how critical the system is, your space constraints, etc., I usually opt for saving the web root folder, the config scripts and the database files from /var/lib.
Saves time, space and bandwidth.
If crap hits the fan, it's just a quick minimal install with select packages from a kickstart file, copying the files back, and an SELinux relabel.
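A rough sketch of that restore path (paths are illustrative; the kickstart and SELinux steps apply to Red Hat-style systems, so on Debian/Ubuntu the relabel is simply skipped):
Code:
# after the minimal install, put the saved files back and fix SELinux labels
tar -xzpf /mnt/backup/site-data.tar.gz -C /
restorecon -R /var/www /var/lib /etc    # or: touch /.autorelabel and reboot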
That depends on exactly what you want. Clonezilla is designed to create disk (or partition) images, which means it's easy to carry out a "bare metal" restore. You can boot the server from the Clonezilla live CD and restore to a binary-identical state in a single step (including boot sector, partition table and filesystems). On the other hand, if all you need is to restore a single file, that won't be so easy.
I burned an ISO CD for Clonezilla Live. However, it looks like I need to reboot my system from the ISO disc. It seems a tar ball is less risky, since every time I boot my server it is quite a production getting the network connectivity set up again. (I use the server to host a web site.) Maybe I will try it the next time the power goes down and I need to boot the system up anyway.
Thanks. I've made a tar ball of everything except a few directories with files I can download again (like OpenCV, Eclipse, etc.). I did have to split the resulting file to put it on the disk, but I understand that one can recover the .tar.bz2 file with cat.
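For reference, reassembly plus a quick integrity check might look like this, assuming split's default output names (xaa, xab, ...); adjust the glob if you gave split an output prefix:
Code:
# glue the pieces back together in order, then test the archive before trusting it
cat x?? > whatever.tar.bz2
bzip2 -t whatever.tar.bz2 && tar -tjf whatever.tar.bz2 > /dev/null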
When making web servers, I generally create a small cluster (web1, web2, web3) which just has apache and the vhost configuration on it. So there isn't a reason to back them up, because restoring from a backup will take longer than provisioning a new server. In addition, I create a replicated NFS group (storage1, storage2) that all of the web servers connect to. When the model is set up this way, you have failover storage space, and it is easy to set up an rsync backup of it.
Also, in front of the web nodes I have a load balancer that round-robins the requests.
Using a combination of Rundeck, Puppet, Chef or whatever you like, you can automate the setup and deployment of additional web and storage nodes to add to the pool.
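An illustrative rsync backup of one of those storage nodes (hostnames and paths are invented for the example):
Code:
# pull the shared web data off storage1, preserving permissions, ACLs and hard links
rsync -aHAX --delete storage1:/srv/www/ /backups/www/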