External USB 2.0 HD - fails after copying 10-20%
I've got an AcomData (Western Digital) 250GB External USB2.0/Firewire drive I just picked up for backups. I'm afraid that if I don't tread cautiously, I'll need a backup drive for my backup drive. My Nforce1 mobo only has USB 1.1 ports and I've tried plugging straight into the computer as opposed to any 3rd party cards or hubs. All the results are the same. I am not able to determine what USB controller chip the drive is using, nor am I very sophisticated with USB in general under Linux so any advice there would be appreciated.
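From what I've read, the bridge chip should show up in lsusb (from the usbutils package) or in dmesg right after plugging the drive in. A rough sketch of what to look for -- the sample line below is made up for illustration (the Oxford ID shown is not necessarily what's in the AcomData case):

```shell
#!/bin/sh
# Sketch: picking the bridge chip out of lsusb output.  On a live box
# you'd run `lsusb` (usbutils) or `dmesg | grep -i usb` with the drive
# plugged in; the sample line below is canned for illustration only.
sample='Bus 001 Device 003: ID 0928:0010 Oxford Semiconductor Ltd IDE bridge'
# The hex pair after "ID" is vendor:product; the text after it names
# the bridge vendor -- that's the chip to search the forums for.
id=$(printf '%s\n' "$sample" | sed 's/.*ID \([0-9a-f]*:[0-9a-f]*\) .*/\1/')
vendor=$(printf '%s\n' "$sample" | sed 's/.*ID [0-9a-f:]* //')
echo "bridge id: $id"
echo "vendor:    $vendor"
```

If anyone knows a better way to get at the chip inside these sealed retail enclosures, I'm all ears.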
Anyway, upon plugging the drive in, I QTParted the drive, reformatting it from FAT32 to reiserfs. So all seems well, I'm moving files over to it, bumping into the usual permission issues that I hadn't initially planned for (more on that in a sec) and WHAM! I get an error message saying that the drive is no longer available.
I try and re-access the drive, Konqueror freezes. I use Konsole to ls -lg the drive, and Konsole locks. I try and shut down the machine and it locks while unloading devices. I force the machine to turn off, start it up again, and look at the drive. Mounts fine. All that HAD actually copied is on the drive and executable. My A-HA MP3's still sound like they should. :P
I try copying again... same.
At first I thought it was choking on file permissions... I had done some file moving and creating from a Windows machine (yeah, I know). Certain files and folders were owned by UID '1001', which I assume is Linux's way of representing the Windows-created files, but chowning them to my primary user has not cleared all those obstacles. I'm still getting a few folders and files which refuse to be copied, even by root.
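For the record, the one-pass re-own I've been attempting looks roughly like this -- the mount point and UID in the comment are just my situation, and the runnable part below uses a scratch directory and the current UID so it works without root:

```shell
#!/bin/sh
# As root, the real fix would be a one-pass re-own of everything the
# stray UID left behind (path and UID here are examples):
#   find /mnt/backup -xdev -uid 1001 -exec chown youruser:yourgroup {} +
# Harmless demo of the same find+chown pattern on a scratch directory:
dir=$(mktemp -d)
touch "$dir/a.mp3" "$dir/b.mp3"
uid=$(id -u)
find "$dir" -type f -uid "$uid" -exec chown "$uid" {} +   # no-op re-own
echo "owned by $uid: $(find "$dir" -type f -uid "$uid" | wc -l | tr -d ' ')"
rm -rf "$dir"
```

It doesn't explain the files that even root can't copy, though, which is why I suspect the failure isn't really a permissions problem.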
As for the mounting end, I've tried just turning the drive off and back on, but then it comes up as sdd1 instead of sdc1, and I'm afraid something is going to screw up and mistakenly lay a new filesystem down on the wrong device because the other one was improperly unmounted. This happened to me with a previous drive.
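One thing I'm planning to try so the sdc1/sdd1 shuffle can't burn me: give the filesystem a label and mount by label instead of by device name. Something like this, as I understand it (the label and mount point are placeholders from my setup, not gospel):

```
# Label the reiserfs filesystem (drive unmounted, as root):
reiserfstune -l backup250 /dev/sdc1

# Then mount by label, so it doesn't matter whether the kernel
# calls the device sdc1 or sdd1 this time around:
mount -L backup250 /mnt/backup

# Or as an /etc/fstab line:
LABEL=backup250  /mnt/backup  reiserfs  noauto,user  0 0
```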
Does any one have any similar experiences or suggestions as to where I should look?
Thanks in advance,
I have the exact same problem, under windoze and Fedora Core 3 as well. The problem is that you are moving too many files at once. I don't know why that matters, but when I tried to back up 10GB worth of music at once, it died. I did it bit by bit, about 100MB at a time, and it worked fine.
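Roughly what my bit-by-bit copy looked like, as a shell sketch -- the real source/destination would be your music folder and the USB mount point; the runnable demo below just uses scratch directories:

```shell
#!/bin/sh
# Copy one file at a time and flush the write cache between files,
# instead of shoving 10GB at the bridge chip in one go.
# Real usage would be e.g.: SRC=/home/me/music DST=/mnt/usbdrive/music
SRC=$(mktemp -d); DST=$(mktemp -d)          # scratch dirs for the demo
printf 'aaa' > "$SRC/one.mp3"
printf 'bbb' > "$SRC/two.mp3"
for f in "$SRC"/*; do
    cp "$f" "$DST/" && sync                 # sync = flush before the next file
done
echo "copied: $(ls "$DST" | wc -l | tr -d ' ')"
rm -rf "$SRC" "$DST"
```

No idea why smaller batches keep the drive alive, but they did for me.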
What drive are you using, and do you know which controller it uses?
I originally thought that it was a bug in Linux's USB drivers, which I had read about elsewhere, but after buying a FireWire card and using the FireWire ports on my same drive I'm not getting any different result. The fact that some people are getting unhindered performance with their drives makes me think that it's the controller chip inside the drive carriage.
Under Windows, it mounts and runs fine, but any usage brings the whole system to its knees, which hasn't happened with other external drives. Funny that this ACOM drive was on sale. :(
It may be suffering the same problems under Linux, but the kernel isn't letting the drive take over the bus and so it just craps out. ...tis but a theory, though. Don't actually know what's going on in there.
Can anyone vouch for a good External Drive or External Carriage (preferably USB 2.0 and FireWire in combination) that has been working well under Linux, specifying your distribution and if possible the controller chip?
Thanks in advance,
i'm not using an enclosure at all.
try rmmod ehci_hcd (the USB 2.0 host driver) -- there's a page out there that says removing it fixes timeout problems.
it's a maxtor drive and a samsung drive. the maxtor doesn't work at all and the samsung is the one that times out.
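the rmmod trick spelled out a bit more, in case it helps -- needs root, and the exact module name can vary by kernel, so check lsmod first:

```
# See which EHCI (USB 2.0) module is loaded -- ehci_hcd on most 2.6 kernels:
lsmod | grep -i ehci

# Remove it; the drive then re-attaches through the USB 1.1 driver
# (ohci_hcd or uhci_hcd).  Slower, but it sidesteps the high-speed timeouts:
rmmod ehci_hcd

# Put it back when you're done:
modprobe ehci_hcd
```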
Well, if your drive is external and using USB, it's using an enclosure. It may not be an empty enclosure you bought separately (it's one provided by Maxtor or Samsung), but it's an enclosure, and it has a USB/FireWire-to-IDE bridge controller which regulates the flow of data between the various I/O specifications. With FireWire, the Oxford chip has been an old favorite in terms of throughput and stability, but I imagine there are lots of companies buying cheap controllers to fatten their margins a little.
Knowing what controller is being used will be super important. I'm sure (and perhaps I'll test this) that if I took the Western Digital HD out of the casing and hooked it straight to an IDE port, it'd work fine under Linux. In fact, I've got an 80 and a 120GB drive which have given me no problems.
Now it MAY just be crap USB and FireWire drivers in the kernel, but I'm willing to guess (since the Linux community isn't exactly in an uproar over this) that it's limited to a few controller chips which cause these headaches. They may even be newer chips which are designed to play nice with Windows but use features that don't adhere to the USB 2.0 or FireWire 400 standards. Or Linux's driver set isn't fully compliant. Either way, the bridge controller is the key.
Greetings. Dehuszar is right, the controller chip in the case of the external device is just as important as anything else. I have a 3.5" USB 2.0/FireWire enclosure and also a 5.25" USB 2.0 enclosure (both from CompUSA), in which I have installed an IDE hard drive and an IDE CD burner. Under Mandrake 10, everything works with both enclosures. Under Mandrake 10.1 (slightly newer kernel version), the CD burner in the 5.25" enclosure does not work: cdrecord does not recognize any burners. cdrecord -scanbus and lsusb give the name of the chip in the enclosure, and that's it. It's like the kernel cannot see past that funky chip.
You have probably happened upon an incompatibility between the chip in your enclosure and your kernel version; you might try upgrading your kernel. Also, my old CompUSA USB 2.0 enclosure with an ALi card would work great when reading from my external drives, but would crap out when copying to them. I solved that by buying a good Belkin USB 2.0 card with an NEC chip; it now works great. That reminds me, I need to check the ALi card again sometime soon...
Hope this helps. If you buy a USB card, make sure it has the NEC chip. It was highly recommended on several forums I checked before buying it...
somehow i don't like the idea of the ide to usb bridge controller being the culprit
i have a similar problem which causes my entire system to lock up when i copy large amounts of data over the usb (and network). i have been fighting the problem for quite a while and have not gotten any closer to the source. i am starting to believe it is a hardware problem, somewhere between network controller and usb controller.
if it really is the bridge controller, that would mean that any poorly designed usb device can lock up any computer. and that would mean, considering the number of usb devices on the market, that many computers should lock up very often. but afaik that doesn't happen very often.
I have a similar problem, although not USB related.
I have a Western Digital 160gig hdd (8 meg cache) which is the only drive in my Fedora Core 2 system.
The whole machine locks up completely when I copy large files over the network.
It doesn't matter if it's over SMB or NFS, the same thing happens at random intervals, but every single time.
I'm wondering if this is a problem with large cache drives.
Maybe take the drive out of the box and put it inside the machine to see if you still get lockups, my guess is that you will.
many/most usb2-ide chips broken

Search around for information about bad controller chips, including the Oxford chip! The problem does not show up when the external USB2-IDE chip/drive is not stressed with sustained file transfers. I just hit this with expensive SilverRiver (Thermaltake) USB2 enclosures while moving 12GB raw video files on a current Ubuntu box. Trouble. Most people don't hit this; most people aren't pushing 12GB files onto their USB2 drives.

What I don't know is whether a workaround has made its way into the current or -rc series 2.6 kernels. The bug is in the USB2-IDE chips that don't implement the spec, but realistically the kernel drivers will have to work around these broken chips because they're in most(?)/many(?) of the USB2 enclosures.

Anyone know the status of workarounds for these buggy chips?
Well, I picked up a Firewire card for my primary machine and the problems went away. So it could be a combination of the NForce1's USB1.1 port and the ACOM's bridge (USB2.0-side ...it's an Oxford incidentally). Don't know if it's the 2.0->1.1 part of the chain, or if the NForce's controller is real picky. Either way, avoiding USB solved the problems. Firewire seems to take way less of a CPU hit anyway, so I'm happy as could be.
And obviously, these problems do crop up with some frequency or we wouldn't be here talking about all having the same types of problems. :P
Anyway, hope that helps someone.