Linux - Hardware: This forum is for Hardware issues. Having trouble installing a piece of hardware? Want to know if that peripheral is compatible with Linux?
As the Subject states, has anyone noticed an issue with ReiserFS destroying hard drives? In the past several months, I've had 3 hard drives in my main desktop computer go south within weeks of installing a ReiserFS root partition.
If I take these drives and run the vendor diagnostic utilities on them, they all check out fine; but when I reload the OS and put ANY filesystem on them after that, they throw errors almost immediately.
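On Linux itself, the drive's SMART counters give a second opinion alongside the vendor utilities. A minimal sketch using smartmontools; the `smartctl -A` output is mocked here so the filter can be demonstrated anywhere (the device name, attribute values, and column layout are illustrative assumptions):

```shell
# Hedged sketch: on a real system you would run
#   smartctl -A /dev/hda
# (smartctl ships with the smartmontools package; /dev/hda is assumed).
# The output below is mocked so the filtering step itself can run anywhere.

sample_output='  5 Reallocated_Sector_Ct   0x0033   100   100   005    Pre-fail  Always       -       0
197 Current_Pending_Sector  0x0012   100   100   000    Old_age   Always       -       3'

# A nonzero raw value (last column) on either attribute means the platters
# themselves are failing, regardless of which filesystem sits on top.
printf '%s\n' "$sample_output" |
  awk '$2 ~ /Reallocated_Sector_Ct|Current_Pending_Sector/ {print $2, $10}'
```

A read-only surface scan with `badblocks -sv /dev/hda` is another non-destructive cross-check, though it can take hours on an 80 GB drive.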
This is a custom-built computer, it's relatively new (within a year), and all the OSes use a stock kernel. I'd run ext3 on it without a problem for almost 6 months before I tried Suse.
I have thought about the motherboard being bad, but I have another hard drive on the same IDE channel (running the "other" OS; hey, Call of Duty doesn't run on Linux!) that hasn't had any problems.
Please post the complete specs of your system, including the brands. I personally use reiser on all my machines (note that it's the default FS for both Slack and Suse) and at least thus far I haven't run into any issues using it. Clearly if you've experienced a series of drive errors then something in your PC isn't happy, but I don't think it would be your choice of file system.
This is speculation, but it could be that the hard drive is a second-tier brand, or perhaps you may be overloading your PSU and it needs to be upgraded. Along those lines, it could be some sort of electrical problem - if the problem always (and only) affects the primary slave (for example) then it probably is the mobo, or maybe there's some sort of defect with the ribbon cable. Perhaps you could relocate your Linux drive to another position to see if that makes any difference.
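The cable and electrical guesses above can often be confirmed from the kernel log, since the IDE driver reports CRC errors when data gets mangled in transit. A sketch of the kind of search involved, with a mocked log line (the pattern and device names are assumptions for an IDE-era setup):

```shell
# Hedged sketch: on a real system you would run
#   dmesg | grep -Ei 'hd[a-d]:.*(error|badcrc|icrc|timeout)'
# or grep the same pattern in /var/log/messages.
# The log line below is mocked so the search can be demonstrated anywhere.

sample_log='hda: dma_intr: error=0x84 { DriveStatusError BadCRC }'

# BadCRC/icrc errors usually mean data is being corrupted on the ribbon
# cable (bad or overlong cable, electrical noise), not on the platters.
printf '%s\n' "$sample_log" | grep -Ei 'hd[a-d]:.*(error|badcrc|icrc|timeout)'
```

If the errors follow the cable position rather than the drive, that points at the cable or controller rather than the disk.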
If none of those thoughts check out, then it seems that your fall back position would be to use ext3, given that it didn't misbehave. Again this is a lot more guesswork than it is analysis. Good luck with it -- J.W.
It's a relatively newly built computer, but the motherboard has been available for quite a while. I bought stuff that had been out for a while because it was dirt cheap, but still brand new in the box. I believe it's a Syntax 266 motherboard or something close to that, with a 2.0 GHz AMD Athlon CPU. The drives are different brands. The first 2 were older drives, but they were IBM and Western Digital. The last one was a brand spanking new Maxtor 80 GB. At this point, it could be anything, but the power supply is a 350W Antec, so it's not overloaded for sure.
It's hard to believe it's just a fluke thing, but I didn't expect anyone to chime in with "Hey, that's happened to me!" The easy solutions never seem to happen to me. I guess I will stick with ext3 and forget about it.
For what it's worth, I had a FreeBSD machine recently lose a drive as well, but that was a completely different computer. I've never had this many drives go bad in my life, and I work with computers every day. I have a whole stash of drives that I keep in a big plastic container. Maybe by some freak thing, something happened around that box that caused my problem. Hey, I'm just guessing too, but it's all I can do.
If you've lost this many drives, then there must be some kind of environmental factor involved. Have you considered installing a power conditioner? If you're getting random surges and dips, those could contribute to a shortened life. Either way, just as you said there's no valid reason for so many drives to fail in such a short period of time. It could just be bad luck, but if the count is up to 4, the odds are against it. Sorry I can't be more helpful. Good luck with it -- J.W.