Old 01-02-2006, 06:34 AM   #1
marsm
Member
 
Registered: Aug 2005
Distribution: Ubuntu
Posts: 62

Rep: Reputation: 15
Maintenance, longevity, saving energy


I'm looking for some general advice on how I should treat my system.

I think it can be said that the internal components of any machine are at their most vulnerable when they are switched on, because of the surge of current that runs through them at power-up. This is why, for example, neon tubes usually give out the moment you try to turn them on. I suppose the same is (still) true for hard drives. These are my main concern right now, because thanks to Linux my computer has now been running constantly for almost two months. My previous Windows installation was giving me BSODs galore while I watched in horror as it consumed my first hard drive.

Still, I am beginning to wonder when I should actually shut it down for maintenance, since the CPU, GPU and PSU fans get clogged up with dust and so on. I'd also like to know what typical failure rates (e.g. MTBF) are for the parts mentioned above, since a five-year-old PSU of mine caused some nasty collateral damage (another hard drive) when it recently failed completely.

I'm also curious about my TFT monitor: I want to save energy (and prevent ghost images) by switching it off every time I leave my desk. But doesn't that wear out the TFT, the same way too much power cycling kills hard drives?

I know this isn't a typical hardware 'problem', but the people here are technically more adept and clearly care about this sort of thing (dual booting, long uptimes, ...), so I thought I'd ask here anyway.
 
Old 01-02-2006, 08:37 AM   #2
amosf
Senior Member
 
Registered: Jun 2004
Location: Australia
Distribution: Mandriva/Slack - KDE
Posts: 1,672

Rep: Reputation: 46
Well, I have 8 PCs running at this moment. They run 24/7, and most run at 100% CPU doing F@H in their spare time... so they use a fair bit of power. I have the CRTs and LCDs set to shut down after 10 minutes... I worry about the LCDs especially, and have to have them sleep when not in use because of the power draw... Of course, 2 of the boxes don't have monitors at all, like the print server and the router box...
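
A minimal sketch of that kind of 10-minute monitor timeout, assuming an X session with the DPMS extension and the xset utility available (the 600-second value is just an example, not anyone's actual setup):

Code:
#!/usr/bin/env python3
# Rough sketch: set X11 DPMS timeouts so an idle monitor drops into
# standby/suspend/off instead of staying lit. Assumes X with DPMS and xset.
import subprocess

IDLE = 600  # 10 minutes, matching the timeout mentioned above

subprocess.run(["xset", "+dpms"], check=True)  # make sure DPMS is enabled
subprocess.run(["xset", "dpms", str(IDLE), str(IDLE), str(IDLE)], check=True)
subprocess.run(["xset", "q"], check=True)      # print the settings to confirm

Most desktop environments expose the same timeouts in their power-saving settings; this is just the manual route.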

I shut down to blow them out when I think about it, which isn't often enough in this very dusty rural area... I think the print server went 4 years at one stage; you couldn't see the motherboard components (really old XT flip-top desktop case)... Poor ol' P200... This is a machine that has been running 24/7 for about 10 years now, though it did start out with a P120 CPU (which was upgraded, not a failure). It still has an old 5.25" 2 GB Quantum Bigfoot HDD... I blow it out every couple of years if I remember.

I have not had too much trouble with failures, but it happens. I lose a PSU now and then, but then some have been going for years... The print server still has an old XT-style PSU... wow, that thing must be old. I've lost a couple of AT PSUs and one ATX PSU this year on a Sempron machine, with no damage from any of them. So the PSUs seem to last okay, but some go less than 5 years and some go longer...

Most of the failures here are fans: PSU fans and video card fans, which sometimes take out video chips. So fans are something I have to check now and then... which I always forget to do.
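
A quick sketch of one way to do that check without opening the case, assuming a kernel hwmon sensor driver is loaded (e.g. set up via lm-sensors) so fan readings show up under /sys/class/hwmon; the RPM threshold is arbitrary:

Code:
#!/usr/bin/env python3
# Rough sketch: list fan speeds reported by the kernel and flag any that
# look stalled. Assumes an hwmon driver exposes fanN_input files in sysfs.
import glob
import os

MIN_RPM = 500  # arbitrary threshold; tune it for your fans

# Some older drivers put the files one level deeper, under .../device/.
paths = (glob.glob("/sys/class/hwmon/hwmon*/fan*_input")
         + glob.glob("/sys/class/hwmon/hwmon*/device/fan*_input"))

for path in sorted(paths):
    rpm = int(open(path).read().strip())
    label = os.path.basename(path).replace("_input", "")
    status = "OK" if rpm >= MIN_RPM else "<-- check this fan"
    print(f"{path} ({label}): {rpm} rpm {status}")

It only sees fans wired to monitored headers, of course; a PSU fan usually isn't, so that one still needs an eyeball (or an ear) now and then.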

The last disk damage I had was when the primary IDE channel went on my P120 in 1999. That board is still going, on the secondary IDE, in the print server box. I had an HDD die in one machine recently, but I don't lose a lot of drives...

No real numbers. Some things last forever, others die a lot quicker than you want... monitors especially. My 19" CRT lasted a little past the 1-year warranty period.

Now you've got me thinking. I don't think I have even looked at the latest router box, which I set up about 3 years ago... so it might need a blow-out...
 
Old 01-02-2006, 10:28 AM   #3
Crito
Senior Member
 
Registered: Nov 2003
Location: Knoxville, TN
Distribution: Kubuntu 9.04
Posts: 1,168

Rep: Reputation: 53
The only moving parts in a PC are the fans and disks, and not surprisingly those are the components I have problems with most frequently. Some of the cr@ppy fans I've been getting lately should have MTBFs measured in seconds, not hours. Fortunately, hard disks have improved greatly since the days of having to manually park the heads on every shutdown; they don't cause me nearly as much grief nowadays.
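
If you're worried about a particular drive, here is a minimal sketch of a SMART health check, assuming smartmontools is installed and the script runs as root; /dev/sda is just an example device:

Code:
#!/usr/bin/env python3
# Rough sketch: ask a drive for its overall SMART health verdict.
# Assumes smartmontools (smartctl) is installed and we're running as root.
import subprocess
import sys

DEVICE = sys.argv[1] if len(sys.argv) > 1 else "/dev/sda"  # example device

# "smartctl -H" prints the drive's overall self-assessment (PASSED/FAILED).
result = subprocess.run(["smartctl", "-H", DEVICE],
                        capture_output=True, text=True)
print(result.stdout)

if "PASSED" not in result.stdout:
    print(f"WARNING: {DEVICE} did not report a PASSED health status")

A failing verdict (or fast-growing reallocated sector counts in "smartctl -a") is usually the hint to get the data off the drive sooner rather than later.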

In any case, I guess your "blowing out the dust bunnies" maintenance plan depends on many factors: environmental conditions, the availability of a failover or hot-spare server to minimize downtime (a high-availability cluster would be even better), management's attitude (the "if it ain't broke, don't touch it until it catches fire" kind), etc.

Last edited by Crito; 01-02-2006 at 11:16 AM.
 
Old 01-02-2006, 10:56 AM   #4
onebuck
Moderator
 
Registered: Jan 2005
Location: Central Florida 20 minutes from Disney World
Distribution: Slackware®
Posts: 13,925
Blog Entries: 44

Rep: Reputation: 3159
Hi,

It's your electric bill! Thermal shock is the worst enemy.

Before I retired, we had lab machines that ran 24/7. There were occasional HD failures and PSU failures, but few motherboard failures over a 14-year period. Most of our failed single-unit systems were down to individuals who power cycled them: users who would power up the system, power it down, and then power it up again within just a few hours. I would tell the person(s) involved to just leave the machines powered on until the end of the day. I'd prefer they leave them on altogether, but if they must power down, to do it on a long cycle period; if they were going to be gone for an extended time, then power down if they must. Most would just power down the monitor.

Once others found that this worked well, they too would cycle their machines this way.

Most modern systems have standby or shutdown cycling available, so controlling it manually is not a problem. As for newer monitors or LCD displays, I would cycle these down either via the OS or just shut them off manually.

HTH!
 