LinuxQuestions.org
Old 02-23-2005, 07:48 AM   #1
ghight
Member
 
Registered: Jan 2003
Location: Indiana
Distribution: Centos, RedHat Enterprise, Slackware
Posts: 524

Rep: Reputation: 30
ReiserFS trouble or a rash of bad HDDs?


As the subject states, has anyone noticed an issue with ReiserFS destroying hard drives? In the past several months, I've had three hard drives in my main desktop computer go south within weeks of installing a ReiserFS root partition.

If I take these drives and run the vendor diagnostic utilities on them, they all check out fine, but when I reload the OS and put ANY filesystem on the drive afterward, it throws errors almost immediately.

This is a custom-built computer, relatively new (within a year), and all the OSes use a stock kernel. I ran ext3 on it without a problem for almost six months before I tried SuSE.

I've wondered whether the motherboard is bad, but another hard drive on the same IDE channel (running the "other" OS; hey, Call of Duty doesn't run on Linux!) hasn't had any problems.
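(An aside for anyone hitting the same symptom: before reloading the OS, the kernel log usually tells you whether the errors follow one drive or one IDE channel, which is exactly the drive-vs-motherboard question above. A minimal sketch below, using a made-up dmesg excerpt so it can be run anywhere; on the real machine you would pipe `dmesg` itself through the same `grep`.)

```shell
# Hypothetical dmesg excerpt showing the classic dying-IDE-drive signature;
# on a real system you'd run something like:  dmesg | grep -iE 'dma_intr|error'
cat <<'EOF' > /tmp/dmesg_sample.txt
hda: dma_intr: status=0x51 { DriveReady SeekComplete Error }
hda: dma_intr: error=0x40 { UncorrectableError }, LBAsect=1234567
hdb: max request size: 128KiB
EOF

# Count error lines per device: if they always follow the same drive (hda here),
# suspect that drive or its cable rather than the filesystem.
grep -c 'hda:.*Error' /tmp/dmesg_sample.txt
```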
 
Old 02-23-2005, 11:24 AM   #2
J.W.
LQ Veteran
 
Registered: Mar 2003
Location: Boise, ID
Distribution: Mint
Posts: 6,642

Rep: Reputation: 87
Please post the complete specs of your system, including the brands. I personally use ReiserFS on all my machines (note that it's the default FS for both Slack and SuSE), and so far I haven't run into any issues with it. Clearly, if you've experienced a series of drive errors then something in your PC isn't happy, but I don't think it would be your choice of filesystem.

This is speculation, but it could be that the hard drive is a second-tier brand, or perhaps you're overloading your PSU and it needs to be upgraded. Along those lines, it could be some sort of electrical problem: if the failures always (and only) affect the primary slave, for example, then it's probably the mobo, or maybe there's a defect in the ribbon cable. Perhaps you could move your Linux drive to another position to see if that makes any difference.

If none of those ideas check out, then it seems your fallback position would be to use ext3, given that it didn't misbehave. Again, this is a lot more guesswork than analysis. Good luck with it -- J.W.
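(One non-destructive check worth adding to the list above: `badblocks` from e2fsprogs does a read-only surface scan independent of any vendor utility, so it can retest a suspect drive from Linux itself. The sketch below runs it against a small file-backed image purely for safe demonstration; on real hardware you would point it at the device, e.g. /dev/hda, with its filesystems unmounted.)

```shell
# Create a 1 MB file-backed "disk" so the command can be demonstrated safely;
# substitute the real device (e.g. /dev/hda) when actually testing hardware.
dd if=/dev/zero of=/tmp/disk.img bs=1024 count=1024 2>/dev/null

# badblocks' default mode is read-only, so it will not destroy data.
# -s shows progress, -v prints a summary of bad blocks found (to stderr).
badblocks -sv /tmp/disk.img && echo "read-only scan passed"
```

If the same drive that passed the vendor diagnostic fails a read-only scan in the machine, that points back at the cable, controller, or power rather than the platters.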
 
Old 02-23-2005, 12:56 PM   #3
ghight
Member
 
Registered: Jan 2003
Location: Indiana
Distribution: Centos, RedHat Enterprise, Slackware
Posts: 524

Original Poster
Rep: Reputation: 30
It's a relatively newly built computer, but the motherboard has been available for quite a while. I bought parts that had been out for a while because they were dirt cheap, but still brand new in the box. I believe it's a Syntax 266 motherboard or something close to that, with a 2.0GHz AMD Athlon CPU. The drives are different brands: the first two were older drives, but they were IBM and Western Digital, and the last one was a brand-spanking-new Maxtor 80 GB. At this point it could be anything, but the power supply is a 350W Antec, so it shouldn't be overloaded.

It's hard to believe it's just a fluke, but I didn't expect anyone to chime in with "Hey, that's happened to me!" The easy solutions never seem to happen to me. I guess I'll stick with ext3 and forget about it.

For what it's worth, I recently had a FreeBSD machine lose a drive as well, but that was a completely different computer. I've never had this many drives go bad in my life, and I work with computers every day. I keep a whole stash of drives in a big plastic container. Maybe, by some freak chance, something happened around that box that caused my problem. Hey, I'm just guessing too, but it's all I can do.

Thanks.
 
Old 02-23-2005, 05:47 PM   #4
J.W.
LQ Veteran
 
Registered: Mar 2003
Location: Boise, ID
Distribution: Mint
Posts: 6,642

Rep: Reputation: 87
If you've lost this many drives, then there must be some kind of environmental factor involved. Have you considered installing a power conditioner? If you're getting random surges and dips, those could contribute to a shortened drive life. Either way, as you said, there's no valid reason for so many drives to fail in such a short period of time. It could just be bad luck, but with the count up to 4, the odds are against it. Sorry I can't be more helpful. Good luck with it -- J.W.
 
  

