Linux - Server. This forum is for the discussion of Linux software used in a server-related context.
OK, I actually started to create a ghost image using dd until I realized this isn't going to work. My webserver is in a data center, a two-hour drive from where I live. The disk I want to ghost is 160 GB, so the image will be 160 GB as well. I realize I can compress it with gzip/bzip2, but that will still leave me with a file at least 70 GB in size (that's the total size of the files I know can't be compressed any further: zip files, JPEGs, and so on).
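For what it's worth, the dd-plus-gzip approach can at least be piped so the compressed image is never stored on the server first. A minimal sketch, where /dev/sda and the ssh target are placeholders I'm assuming, not details from the post:

```shell
# In real use you would stream straight off the disk to your home machine:
#   dd if=/dev/sda bs=64K | gzip -c | ssh me@home 'cat > webserver.img.gz'
# The demo below uses a small throwaway file instead of a real disk device,
# so it can run anywhere without root access.
dd if=/dev/zero of=disk.img bs=1M count=4 2>/dev/null   # fake 4 MB "disk"
dd if=disk.img bs=64K 2>/dev/null | gzip -c > disk.img.gz
gunzip -t disk.img.gz && echo "compressed image verified"
```

Note this still transfers the compressed size over the wire, so it helps with storage on the server but not with the traffic limit.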
I have a very fast internet connection at home, so I should be able to download that in about two hours. The only problem is that it will eat up precious data traffic from my server (I have a 500 GB monthly limit).
There's a lot on that drive I'm simply not interested in: entire folders containing huge zip files or thousands of images. If I can somehow exclude those, I'm sure that what's left won't be larger than a few GB.
I have a similar setup: I installed Apache, PHP, MySQL, etc. on my "play" server and got it all configured.
Now I have a nightly rsync job that syncs /var/www/html from my production server to my "play" server. With rsync you can easily exclude portions of the website (videos, documents, etc.) so that you're not moving too much data.
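The nightly job described above might look roughly like the sketch below. In real use the source would be something like user@production:/var/www/html/ run from cron; the local directories here are placeholders so the example is self-contained:

```shell
# Fake source tree standing in for the production docroot (placeholder paths).
mkdir -p src/videos dst
echo "page" > src/page.html
echo "clip" > src/videos/movie.mp4

# Sync everything except the excluded subdirectory; add more --exclude
# flags (or --exclude-from=FILE) for each folder you want to skip.
rsync -a --exclude='videos/' src/ dst/

ls dst
```

A nice property of rsync for this use case is that after the first run it only transfers changed files, so the nightly job barely touches the monthly traffic limit.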