sed has problems with RAM
hello!
I am really worried about this sed error: "Couldn't re-allocate memory". Its task is to replace something in a file about 250 MB in size. Could one of the two RAM modules inside the computer be the cause? I remember there were already bluescreens on this computer some years ago, back when it was running Windows... but since using Linux there had never been a problem with the RAM until today :-( Or is there another problem? Is there a parameter to limit sed's memory use? It would take longer, but perhaps there would be no error..? Or can I use an alternative to sed to replace something in a file of this size? Please help, thanks |
I believe sed sets an arbitrary memory limitation. Use perl instead.
Code:
cat myfile | perl -p -e 's/old/new/g;' > newfile
Code:
perl -i -p -e 's/old/new/g;' myfile |
hello!
now I tried your tip, unsuccessfully. The reason may be that there is no \n inside the 250 MB file! That was the job for sed... so what can I try now? thanks |
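If the newline-free file does contain some other repeated delimiter (a comma, a semicolon, a record separator byte), one classic trick is to turn that delimiter into newlines so sed gets short lines to chew on, then turn them back afterwards. A sketch, assuming a comma-delimited file; `myfile`, `newfile`, the delimiter, and the pattern are all placeholders:

```shell
# Convert commas to newlines so each record becomes a short line,
# run the substitution line by line, then restore the commas.
# Caveat: this assumes the delimiter never occurs inside the pattern
# being replaced.
tr ',' '\n' < myfile | sed 's/old/new/g' | tr '\n' ',' > newfile
```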
I can think of a few options now.
A) Provide more information so that a Perl-based program can work.
B) Change the sed program so that it uses less memory.
C) Break the file into manageable pieces.
You haven't supplied much specific information about either the file you are trying to change or the sed program you are using. |
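Option C can be sketched with standard tools: `split -b` cuts the file into fixed-size pieces, sed rewrites each piece (small enough to fit in memory), and `cat` reassembles them in order. The file names below are placeholders, and the demo uses a tiny stand-in for the real 250 MB file:

```shell
# Demo setup: a small stand-in for the 250 MB newline-free file.
workdir=$(mktemp -d)
printf 'aaaoldbbb' > "$workdir/input"

# Cut the file into fixed-size pieces (3 bytes here; use megabytes in
# practice), run the substitution on each piece, then reassemble in order.
# Caveat: an occurrence of the pattern that straddles a piece boundary
# is missed.
split -b 3 "$workdir/input" "$workdir/piece_"
for p in "$workdir"/piece_*; do
    sed 's/old/new/g' "$p" > "$p.done"
done
cat "$workdir"/piece_*.done > "$workdir/output"
```

Because `split` names the pieces in lexical order (`piece_aa`, `piece_ab`, ...), the shell glob reassembles them in the original byte order.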
hello world!
now I wrote my own script in PHP: Code:
<? |
I'm not sure whether sed preloads a file before working on it. Another possibility is that the file was compressed to eliminate whitespace and newlines. Sed matches the longest matching pattern on a line, so if there are no newlines, trying to find a match on a single 250 MB line would overtax sed.
Anyway, I'm glad that you found a solution. |