LinuxQuestions.org
Old 03-14-2012, 02:09 PM   #1
bodyofabanshee
LQ Newbie
 
Registered: Mar 2007
Posts: 7

Rep: Reputation: 0
Data push from box to removable USB drive dies, then dies, then dies again.


I'm an Oracle DBA playing SysAdmin, so not a total idiot, but creeping up on it.

I have a USB drive mounted, and as I use cp to try to get about 70G of data from a staging drive (no RAID) sitting on my Red Hat box onto the USB drive, it dies, every damn time.

Do any of you brilliant boys have any genius for a girl geek? I'm dying here...
 
Old 03-14-2012, 02:28 PM   #2
lisle2011
Member
 
Registered: Mar 2011
Location: Surrey B.C. Canada (Metro Vancouver)
Distribution: Slackware 2.6.33.4-smp
Posts: 183
Blog Entries: 1

Rep: Reputation: 25
Copying to USB Drive

Try cpio.
BUT read the man or info file first.
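For what it's worth, a minimal sketch of a cpio pass-through copy, assuming the files sit in a staging directory and the USB drive is mounted at /mnt/usb (both paths are assumptions, adjust to your setup):

cd /path/to/staging
find . -depth -print | cpio -pdmv /mnt/usb

Here -p is pass-through (copy) mode, -d creates directories as needed, -m preserves modification times, and -v prints each file as it goes.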
 
Old 03-14-2012, 02:42 PM   #3
michaelk
Moderator
 
Registered: Aug 2002
Posts: 26,366

Rep: Reputation: 6154
What type of data, single file archive, etc. Any idea how much data is being copied before it dies?
What is the make/model of drive?
Do you know how it is formatted? i.e. the file system type (NTFS, ext3, etc.)?
Any error messages?
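If it helps, a couple of commands run on the server would answer most of these (the mount point is an assumption):

df -hT /mnt/usb        # shows the filesystem type and free space on the USB drive
dmesg | tail -n 50     # shows recent kernel messages, including USB or I/O errors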
 
Old 03-14-2012, 03:13 PM   #4
bodyofabanshee
LQ Newbie
 
Registered: Mar 2007
Posts: 7

Original Poster
Rep: Reputation: 0
Lisle,
Will do, but a cursory glance tells me there may be a problem, since I don't want to untar these files; they then become 300G and my USB drive is 250G.
If nothing else, I'm now exposed to cpio. Thank you for that.
 
Old 03-14-2012, 03:28 PM   #5
bodyofabanshee
LQ Newbie
 
Registered: Mar 2007
Posts: 7

Original Poster
Rep: Reputation: 0
What type of data, single file archive, etc.

tar'd and gzipped

Any idea how much data is being copied before it dies?

It varies. The last "death" was two files, one a couple hundred M and one about 30G, which died after about 5.7G.

What is the make/model of drive?
Vendor: WD Model: 2500BEV External Rev: 1.75

Do you know how it is formatted? i.e. the file system type (NTFS, ext3, etc.)?
It was NTFS, but I ran mke2fs and mounted it, no problem; small stuff goes on and off it without a single hitch.

Any error messages?

Yep... all about out of memory, after which the kernel kills the process. (This is a puzzle, because the data is actually a backup that goes from our 0+1 box to the non-RAID staging drive with no problems, night after night. I just can't get it from that staging drive to the USB by tarring/gzipping them to this drive.)


Clearly cp isn't the ticket.

Last edited by bodyofabanshee; 03-14-2012 at 03:30 PM.
 
Old 03-14-2012, 11:56 PM   #6
elfenlied
Member
 
Registered: Dec 2004
Posts: 83

Rep: Reputation: 8
How you formatted the drive with mke2fs determines the maximum file size the partition will support.

For example:

Ext2
Block size----------: 1 KB 2 KB 4 KB 8 KB
max. file size------: 16 GB 256 GB 2 TB 2 TB
max. filesystem size: 4* TB 8 TB 16 TB 32 TB

Ext3 has similar limits; see these links for more info.
http://en.wikipedia.org/wiki/Ext2
http://en.wikipedia.org/wiki/Ext3

I'm going to guess you've formatted the disk with a 1 KB block size and that's why it's dying. NTFS would have worked fine, but you might not have been able to mount the NTFS partition read/write.
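If you want to confirm what block size the filesystem actually got, something like this would show it, assuming the USB drive is /dev/sdb1 (device name is an assumption):

tune2fs -l /dev/sdb1 | grep -i 'block size'

dumpe2fs -h /dev/sdb1 prints the same superblock information.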
 
Old 03-15-2012, 06:06 AM   #7
bodyofabanshee
LQ Newbie
 
Registered: Mar 2007
Posts: 7

Original Poster
Rep: Reputation: 0
elfenlied,
Since there's nothing on it, is it possible to rerun mke2fs? The first time, I ran it as follows: mke2fs -j -m 1
Also, if I understand the layout of the following:

Block size----------: 1 KB 2 KB 4 KB 8 KB
max. file size------: 16 GB 256 GB 2 TB 2 TB
max. filesystem size: 4* TB 8 TB 16 TB 32 TB


and it is a matter of the block size being too small, why would it die at 1G sometimes and other times get all the way to 6G, and would that account for syslog saying the process ran out of memory and was killed?

Thank you for responding to this question
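As a sketch, rerunning mke2fs on the empty, unmounted partition with an explicit 4 KB block size would look something like this, assuming the USB partition is /dev/sdb1 (device name is an assumption; mke2fs wipes whatever is on it):

umount /dev/sdb1
mke2fs -j -m 1 -b 4096 /dev/sdb1

The -b 4096 forces a 4 KB block size, which per the table above lifts the per-file limit to 2 TB.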
 
Old 03-15-2012, 07:33 AM   #8
Zetec
Member
 
Registered: Jul 2006
Distribution: Debian, Ubuntu, W7, openSUSE, Centos
Posts: 152

Rep: Reputation: 25
Plan "B" could include putting the USB drive onto a workstation and trying to copy the data from the server to a USB drive on another workstation using SCP or something? Just thinking out loud.
 
Old 03-15-2012, 10:53 AM   #9
bodyofabanshee
LQ Newbie
 
Registered: Mar 2007
Posts: 7

Original Poster
Rep: Reputation: 0
Zetec,
It may come to that...
 
Old 03-15-2012, 11:03 AM   #10
Reuti
Senior Member
 
Registered: Dec 2004
Location: Marburg, Germany
Distribution: openSUSE 15.2
Posts: 1,339

Rep: Reputation: 260Reputation: 260Reputation: 260
If it's a limit because of the file size, you can try the split command with the -b option and a suitable size to write pieces of the file to the USB disk. On the target side you can use cat to concatenate them again, and maybe check the result afterwards with md5sum.
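A sketch of that approach, assuming a single archive called backup.tar.gz and the USB drive mounted at /mnt/usb (names are assumptions):

md5sum backup.tar.gz
split -b 4G backup.tar.gz /mnt/usb/backup.tar.gz.part_
# later, on whatever machine reads the drive:
cat /mnt/usb/backup.tar.gz.part_* > backup.tar.gz
md5sum backup.tar.gz

split names the pieces part_aa, part_ab and so on, so cat with the wildcard reassembles them in the right order, and the two md5sum outputs should match.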
 
Old 03-15-2012, 12:30 PM   #11
bodyofabanshee
LQ Newbie
 
Registered: Mar 2007
Posts: 7

Original Poster
Rep: Reputation: 0
Update, or more questions than answers, depending on your point of view.

What is literally happening is that my system is running out of memory. We have plenty of memory for daily use, for moving the backup to the staging drive, etc., but it gets maxed out EVERY time I use cp to try to put these files on the USB drive.

Is there a way to keep cp from journaling or whatever it's doing... just pipe the dang thing?
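One way to sketch "just piping it" (not something suggested elsewhere in this thread, only an illustration of bypassing the cache) is dd with direct I/O, assuming the USB drive is mounted at /mnt/usb (path is an assumption):

dd if=backup.tar.gz of=/mnt/usb/backup.tar.gz bs=1M oflag=direct

oflag=direct writes straight through instead of buffering the whole file in the page cache; whether it actually avoids the out-of-memory kill here is untested.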
 
Old 03-15-2012, 12:34 PM   #12
michaelk
Moderator
 
Registered: Aug 2002
Posts: 26,366

Rep: Reputation: 6154
Post the output of the command
free -m
 
  

