Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
03-14-2012, 02:09 PM | #1 | LQ Newbie | Registered: Mar 2007 | Posts: 7
Data push from box to removable USB drive dies, then dies, then dies again.
I'm an Oracle DBA playing SysAdmin, so not a total idiot, but creeping up on it.
I have a USB drive mounted, and when I use cp to try to get about 70G of data from a staged drive (no RAID) sitting on my Red Hat box onto the USB drive, it dies, every damn time.
Any of you brilliant boys have any genius for a girl geek? I'm dying here...
03-14-2012, 02:28 PM | #2 | Member | Registered: Mar 2011 | Location: Surrey B.C. Canada (Metro Vancouver) | Distribution: Slackware 2.6.33.4-smp | Posts: 183
Copying to USB Drive
Try cpio.
BUT read the man or info file first.
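For reference, a minimal sketch of the cpio pass-through copy being suggested here; the staged directory and the USB mount point are hypothetical placeholders:
# Walk the staged directory and copy it file-by-file onto the mounted
# USB drive using cpio in pass-through (-p) mode.
cd /staged/backups
find . -depth -print0 | cpio -0 -pdmv /mnt/usb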
03-14-2012, 02:42 PM | #3 | Moderator | Registered: Aug 2002 | Posts: 26,366
What type of data: a single file, an archive, etc.? Any idea how much data is being copied before it dies?
What is the make/model of the drive?
Do you know how it is formatted, i.e. the file system type (NTFS, ext3, etc.)?
Any error messages?
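A few commands that would answer most of these questions; the mount point and device name are hypothetical placeholders:
# File system type and free space of the mounted USB drive
df -hT /mnt/usb
# Make/model of the disk as the kernel sees it
cat /sys/block/sdb/device/model
# Recent kernel messages, including USB and I/O errors
dmesg | tail -n 50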
03-14-2012, 03:13 PM | #4 | LQ Newbie | Registered: Mar 2007 | Posts: 7 | Original Poster
Lisle,
Will do, but a cursory glance tells me there may be a problem, since I don't want to untar these files: untarred they become 300G, and my USB drive is 250G.
If nothing else, I'm now exposed to cpio. Thank you for that.
03-14-2012, 03:28 PM | #5 | LQ Newbie | Registered: Mar 2007 | Posts: 7 | Original Poster
What type of data: a single file, an archive, etc.?
Tar'd and gzipped.
Any idea how much data is being copied before it dies?
It varies. The last "death" was two files, one a couple hundred MB and one about 30G, which died after 5.7G.
What is the make/model of the drive?
Vendor: WD  Model: 2500BEV External  Rev: 1.75
Do you know how it is formatted, i.e. the file system type (NTFS, ext3, etc.)?
It was NTFS, but I ran mke2fs and mounted it with no problem; small stuff goes on and off it without a single hitch.
Any error messages?
Yep, all about running out of memory, after which the kernel kills the process. (This is a puzzle, because the data is actually a backup that goes from our 0+1 box to the non-RAID staged drive with no problems, night after night. I just can't get it from that staged drive onto the USB drive by tarring/gzipping it to that drive.)
Clearly cp isn't the ticket.
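One way to confirm that it really is the kernel's OOM killer ending the copy, which is what the "out of memory" messages suggest, is to check the logs right after a failed run (the log path assumes a stock Red Hat syslog setup):
# Look for OOM-killer activity; cp should show up as the killed process
dmesg | grep -i -A 5 'out of memory'
grep -i 'killed process' /var/log/messages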
Last edited by bodyofabanshee; 03-14-2012 at 03:30 PM.
03-14-2012, 11:56 PM | #6 | Member | Registered: Dec 2004 | Posts: 83
How you formatted the drive with mke2fs determines the maximum file size the partition will support.
For example:
Ext2
Block size----------: 1 KB 2 KB 4 KB 8 KB
max. file size------: 16 GB 256 GB 2 TB 2 TB
max. filesystem size: 4 TB 8 TB 16 TB 32 TB
Ext3 has similar limits; see these links for more info.
http://en.wikipedia.org/wiki/Ext2
http://en.wikipedia.org/wiki/Ext3
I'm going to guess you've formatted the disk with a 1 KB block size and that's why it's dying. NTFS would have worked fine, but you might not have been able to mount the NTFS partition read-write.
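You can check which block size mke2fs actually picked; the device name here is a hypothetical placeholder for the USB partition:
# Report the block size of the existing ext3 filesystem
tune2fs -l /dev/sdb1 | grep 'Block size'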
03-15-2012, 06:06 AM | #7 | LQ Newbie | Registered: Mar 2007 | Posts: 7 | Original Poster
elfenlied,
Since there's nothing on it, is it possible to rerun mke2fs? The first time I ran it as follows: mke2fs -j -m 1
Also, if I understand the layout of the following:
Block size----------: 1 KB 2 KB 4 KB 8 KB
max. file size------: 16 GB 256 GB 2 TB 2 TB
max. filesystem size: 4 TB 8 TB 16 TB 32 TB
and it is a matter of the block size being too small, why would it die at 1G sometimes and other times get all the way to 6G? And would that account for the syslog saying the process ran out of memory and was killed?
Thank you for responding to this question.
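Rerunning mke2fs on an empty drive is fine; it just recreates the filesystem from scratch. A sketch of a rerun that pins the block size explicitly, with the device name as a hypothetical placeholder (this destroys anything on the partition, so double-check the device first):
# Unmount, then recreate the ext3 filesystem with a 4 KB block size
# and 1% reserved blocks, mirroring the original mke2fs -j -m 1 run.
umount /mnt/usb
mke2fs -j -m 1 -b 4096 /dev/sdb1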
03-15-2012, 07:33 AM | #8 | Member | Registered: Jul 2006 | Distribution: Debian, Ubuntu, W7, openSUSE, Centos | Posts: 152
Plan "B" could be putting the USB drive on another workstation and copying the data from the server to that drive over the network using scp or something. Just thinking out loud.
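A minimal sketch of that kind of scp transfer; the host name and paths are hypothetical placeholders:
# Push the staged archives from the server to the USB drive mounted
# on another workstation, over SSH.
scp /staged/backups/*.tar.gz user@workstation:/mnt/usb/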
03-15-2012, 10:53 AM | #9 | LQ Newbie | Registered: Mar 2007 | Posts: 7 | Original Poster
Zetec,
It may come to that...
03-15-2012, 11:03 AM | #10 | Senior Member | Registered: Dec 2004 | Location: Marburg, Germany | Distribution: openSUSE 15.2 | Posts: 1,339
If it's a limit because of the file size, you can try the split command with the -b option and a suitable size to write pieces of the file to the USB disk. On the target side you can use cat to concatenate them again, and maybe check the result afterwards with md5sum.
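A minimal sketch of that split/cat/md5sum approach; the file names, piece size, and paths are placeholders:
# Record a checksum, then cut the archive into ~4 GB pieces on the USB drive
# (copy the .md5 file along with the pieces).
md5sum backup.tar.gz > backup.tar.gz.md5
split -b 4096m backup.tar.gz /mnt/usb/backup.tar.gz.part_
# Later, on the target machine, stitch the pieces back together and verify
cat backup.tar.gz.part_* > backup.tar.gz
md5sum -c backup.tar.gz.md5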
03-15-2012, 12:30 PM | #11 | LQ Newbie | Registered: Mar 2007 | Posts: 7 | Original Poster
Update, or more questions than answers depending on your point of view:
What is literally happening is that my system is running out of memory. We have plenty of memory for daily use, for moving the backup to the staged drive, etc., but it gets maxed out EVERY time I use cp to try to put these files on the USB drive.
Is there a way to keep cp from journaling or whatever it's doing... just pipe the dang thing?
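Not a suggestion from the thread so far, but one reading of "just pipe the thing" is copying with direct writes so the destination data never piles up in the page cache; the paths here are hypothetical, and oflag=direct is a GNU dd option:
# Copy one archive with O_DIRECT writes to the USB drive; bs=4M keeps
# the request size reasonable for a slow external disk.
dd if=/staged/backups/backup.tar.gz of=/mnt/usb/backup.tar.gz bs=4M oflag=direct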
03-15-2012, 12:34 PM | #12 | Moderator | Registered: Aug 2002 | Posts: 26,366
Post the output of the command
free -m