-   Linux - Newbie
-   -   Editing extremely large files, too large for memory?

SirTristan 12-22-2009 02:35 PM

Editing extremely large files, too large for memory?
I'd like to edit the beginning and end of some very large files. These files are .sql dump files, many gigs in size (some 5GB-10GB), larger than even the RAM of the machine.

Currently I'm trying to use pico (via SSH), but that doesn't really work: either it loads the file after many minutes and editing is painfully slow, or it never finishes loading at all. It also takes up all my machine's resources.

How would I go about editing just the start and end of these files, without trying to load the whole file? Is there an editing command that would allow me to begin editing at the end?

markush 12-22-2009 02:58 PM

Hello SirTristan,

what do you mean by "editing"? If you only want to look at the beginning or the end of a file, you can use the "head" and "tail" commands. Look at the manpages.

Otherwise, if you really want to make changes to the files, I'd suggest using sed or a Perl script.


Quakeboy02 12-22-2009 03:06 PM

Try using the split command to split your file into smaller chunks. Edit the chunk with your data, then use the cat command to put it all back together.
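Something along these lines, scaled down to a small example (the piece size and filenames are illustrative; for a 10GB dump you'd split by bytes or by a much larger line count, and you need enough free disk for a second copy):

```shell
# Stand-in for a large file: 1000 numbered lines
seq 1 1000 > dump.sql

# Split into 100-line pieces named part_aa, part_ab, ..., part_aj
split -l 100 dump.sql part_

# Edit only the first piece (which holds the start of the file),
# then reassemble; the part_* glob sorts back into the original order
sed -i 's/^1$/ONE/' part_aa
cat part_* > dump_edited.sql
```

The last piece (here part_aj) holds the end of the file, so editing the start and the end only ever touches two small chunks in your editor.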

All times are GMT -5.