Hi All,
I have a home machine that runs several underutilized services: FTP, SSH, a firewall, about six VNC sessions (via SSH), and occasionally a guest browsing session with GNOME or Xfce. Yay, big deal, right?
[edit: Fedora Core 1]
Now that's all well and good, but about four months ago 256 of my 512 MB of RAM died, and I have been limping along ever since. The output from free shows:
total:                    255512 KB
used:                     231924 KB
used minus buffers/cache: 123184 KB
I interpret that as meaning only about 123 MB of RAM is actually being used by applications (the rest of "used" is buffers and page cache). Correct me if I'm wrong. If that is the case, then why did my machine start to chug after I lost the 256?
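For anyone else puzzling over free's output, here is a small sketch of how the "-/+ buffers/cache" line is derived. The numbers are taken from the output above; the buffers-plus-cache figure is an assumption back-calculated for illustration, since the post doesn't show that column.

```python
# How free derives its "-/+ buffers/cache" line (sketch).
total = 255512               # KB, from the output above
used = 231924                # KB, includes buffers and page cache
buffers_plus_cache = 108740  # KB, hypothetical split consistent with the post

real_used = used - buffers_plus_cache  # memory applications actually hold
real_free = total - real_used          # free plus reclaimable cache

print(real_used)  # 123184, matching the "buffers/cache used" figure
print(real_free)  # 132328
```

The point being that the kernel will happily fill spare RAM with cache, so the raw "used" number always looks alarming; the buffers/cache-adjusted line is the one that matters.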
Could it be the processor? I use xload and/or top and passively watch all day long while I'm doing other work. I had originally thought that the load average could not exceed 1.00, until two weeks ago when I was hitting 1.61... Clearly this number does not represent CPU utilization if it can exceed 100%. So my question is, what does this number represent? Here is the long answer...
http://www.teamquest.com/html/gunther/ldavg1.shtml
Instead, would someone clarify what the value means in real life? What can I infer from the load average about my processor utilization? Is there a maximum value for the load average? What do you consider "too high"? That article seems to indicate not only that the load average is a poor tool for determining server sizing, but that the information it returns is retarded in the literal sense, i.e. time-delayed...
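My understanding, for what it's worth: the load average is not a percentage at all but an exponentially smoothed count of tasks wanting the CPU (on Linux, tasks in uninterruptible disk wait count too), which is why it can exceed 1.00. A toy simulation of the 1-minute average shows this; the real kernel uses fixed-point arithmetic, so this is only a sketch of the math.

```python
import math

# Toy model of the 1-minute load average: an exponentially smoothed
# count of runnable tasks, sampled every 5 seconds.
SAMPLE = 5.0   # seconds between kernel samples
WINDOW = 60.0  # averaging window for the 1-minute figure
decay = math.exp(-SAMPLE / WINDOW)

load = 0.0
for _ in range(200):      # roughly 16 simulated minutes
    runnable = 2          # two tasks constantly contending for one CPU
    load = load * decay + runnable * (1.0 - decay)

print(round(load, 2))  # 2.0 -- load counts waiting tasks, not CPU %
```

So a sustained load of 2.00 on a one-CPU box means that on average one task is running and one is queued behind it; 1.61 means work is backing up, even though the CPU itself is pegged at 100% either way.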
I don't think my machine is a slouch (an AMD at 1.4 GHz with 256 MB), but qualitatively I can tell it's not working optimally. What quantitative tools can I use to tune this box and get the most bang? Do I add more RAM? How do I tell when the processor is no longer powerful enough?
Thoughts?