Old 09-02-2003, 03:27 AM   #1
LQ Newbie
Registered: Sep 2003
Location: Serbia and Montenegro
Distribution: Mandrake 9.1
Posts: 2

Rep: Reputation: 0
"pipe: memory exhausted" problem

Hi all.

I'm running into this problem with one of my scripts. The script mangles a large amount of ASCII data (~400 MB). It concatenates N files containing an arbitrary number of records (each record has a fixed number of lines). This is piped to "awk", which inserts a separator line after a selected number of lines. That is finally piped to "csplit", which splits its standard input on those separators.

Just in case you haven't figured it out, this is a script to unify batch size for the input processing.
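For illustration, here is a hypothetical sketch of what such a pipeline might look like. The file names, the separator string, and the sizes are all made up: here each record is 2 lines and a batch is 3 records (6 lines), so awk prints a separator before every batch after the first, and csplit cuts on it.

```shell
# Made-up sample input: 6 records of 2 lines each (12 lines total),
# standing in for the concatenated data files.
printf 'r%d line1\nr%d line2\n' 1 1 2 2 3 3 4 4 5 5 6 6 > in1.txt

# Insert a separator line before every 6-line batch after the first,
# then let csplit cut the stream on that separator. '{*}' repeats the
# pattern for as many batches as the input contains (GNU csplit).
cat in1.txt \
  | awk 'NR > 1 && NR % 6 == 1 { print "--SPLIT--" } { print }' \
  | csplit -z -f batch - '/^--SPLIT--$/' '{*}'

ls batch*
```

Note that csplit keeps the separator line at the top of each file after the first; a real script would strip those afterwards (e.g. with grep -v).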

The script works flawlessly on Tru64 UNIX with zsh 2.x.

When run on Linux Mandrake 9.1 with zsh 4.1.0, it chokes after a certain number of output batches have been created. The error reported is:

csplit: memory exhausted

Bash 2.05b reports "broken pipe" on this script.

When I take out the last pipe and redirect the output of "cat ... | awk ..." into a file, it works OK. When I then feed that file to "csplit", it works again. The only thing I can conclude is that the last pipe is causing problems.

I don't think that increasing the pipe size would help. I've tried reducing the output batch size; it still fails, only after somewhat more data has been processed. The error is totally reproducible in my case.

Is there some issue with "csplit" that it doesn't handle pipes?

Old 09-02-2003, 02:56 PM   #2
Registered: Jul 2003
Distribution: Slackware
Posts: 392

Rep: Reputation: 55
It looks like csplit is running out of memory and "crashing" as a result, which is what breaks the pipe. I don't think the problem is the pipe itself; it's csplit exhausting memory. Feeding the intermediate file into csplit on its own may work, while the full chain of commands fails, because the other commands in the pipeline use memory too.

I'm not sure what to do; you could run "ulimit -a" and see if that's limiting the amount of memory you can use.
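To check whether a shell resource limit is the culprit, you could list the current per-process limits (output varies by system and shell):

```shell
# Show all resource limits the shell imposes on its child processes.
# The memory-related ones ("max memory size", "virtual memory") are
# what could make csplit die with "memory exhausted".
ulimit -a

# The virtual-memory limit alone, in KB, or "unlimited".
ulimit -v
```

If one of these shows a low value, raising it (e.g. "ulimit -v unlimited", where permitted) would be worth a try.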

If I understand what you're doing, I'd be tempted to replace the whole thing with plain split(1). It knows how to split a file or standard input into files of N lines each, so you could skip awk and csplit altogether.
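A self-contained illustration of that idea (the file name and batch size are made up): since each record has a fixed number of lines, "split -l" with lines_per_record times records_per_batch replaces the awk/csplit pair in one step.

```shell
# Stand-in for the concatenated record stream: 10 lines of input.
seq 1 10 > input.txt

# Cut into 4-line batches named chunk_aa, chunk_ab, chunk_ac, ...
# split never loads the whole input into memory, so it sidesteps
# the "memory exhausted" failure entirely.
split -l 4 input.txt chunk_

wc -l chunk_*
```

In the real pipeline this would be "cat file1 file2 ... | split -l N - prefix_", with split reading standard input.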


