Difference between cloud and cluster
I have read several articles online trying to get a grasp and understand on a simple question. What is main difference and functions of cloud server infrastructure and cluster server infrastructure (If Iam using wrong terminology please correct me).
To my understanding, the basic setup is more or less identical. Both have a master server (3 GHz, 2 GB RAM, 50 GB HDD) that connects to a switch, and a number of nodes (each 2 GHz, 4 GB RAM, 100 GB HDD) that also connect to the switch.
The cluster combines node resources so it looks like one computer (5 nodes = 5 x 2 GHz, 5 x 4 GB, 5 x 100 GB = 10 GHz, 20 GB RAM, 500 GB HDD). If this is correct, it sounds good for a database or file server, since the cluster's resources would be constant. Example: a database server starts to slow down because of all the data it is trying to store and retrieve; if you need more resources, you just add another node.
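The aggregation arithmetic above can be sketched in a few lines of Python. This is only an illustration of the idea (summing each node's resources into one pooled total); the node specs are the example numbers from the post, not real hardware.

```python
# One node's specs, using the example numbers from the post.
NODE = {"cpu_ghz": 2, "ram_gb": 4, "hdd_gb": 100}

def cluster_capacity(num_nodes: int) -> dict:
    """A cluster's apparent capacity: each resource summed across all nodes."""
    return {resource: amount * num_nodes for resource, amount in NODE.items()}

print(cluster_capacity(5))
# -> {'cpu_ghz': 10, 'ram_gb': 20, 'hdd_gb': 500}
```

Adding another node is then just `cluster_capacity(6)`, which matches the "if you need more resources, add another node" idea.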
The cloud uses only the resources it needs at any given moment (same node numbers as the cluster). This is good for a database or web server. Example: a web server hosts 10 domains, each with 20 pages. One hour it uses 3 GHz and 8 GB RAM; the next hour it gets heavy traffic, so it uses 9 GHz and 15 GB RAM.
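The elastic behaviour described above can be sketched the same way: the workload draws only what it currently demands, capped by the pool's total capacity. The pool totals (10 GHz, 20 GB RAM) and the hourly demands are the example figures from the post; the function name is just for illustration.

```python
# Total pooled capacity from the 5-node example (10 GHz CPU, 20 GB RAM).
POOL = {"cpu_ghz": 10, "ram_gb": 20}

def allocate(demand: dict) -> dict:
    """Grant what the workload asks for, but never more than the pool holds."""
    return {resource: min(demand.get(resource, 0), POOL[resource])
            for resource in POOL}

quiet_hour = allocate({"cpu_ghz": 3, "ram_gb": 8})   # light traffic
busy_hour = allocate({"cpu_ghz": 9, "ram_gb": 15})   # heavy traffic
print(quiet_hour, busy_hour)
# -> {'cpu_ghz': 3, 'ram_gb': 8} {'cpu_ghz': 9, 'ram_gb': 15}
```

The key contrast with the cluster case is that the unused capacity (e.g. 7 GHz in the quiet hour) stays free for other workloads rather than being dedicated to one server.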
Please correct me if my way of explaining or thinking is wrong, but from what I have gathered and read, this is how I am interpreting and understanding the information. Plain and simple.
Thanks.
PS: Oh, and the numbers are just examples. I know they are not realistic minimums, but for the sake of this explanation I used simple numbers to help explain and understand.
Last edited by karnac01; 10-05-2010 at 04:06 AM.