splitting large files into smaller parts
I want to split large files (larger than 1 GB) into small parts.
I know I can use the split/cat commands to do this, but are there any applications with a GUI that can do the same? I have tried Karchiver, which can archive and then split files, but it is really unstable and slow... perhaps because of the large file sizes. Thanks |
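For anyone finding this thread later, the split/cat console approach mentioned above looks like this in practice (the file names and chunk size here are just examples, not from the original post):

```shell
# make a 1 MB sample file standing in for the large file
head -c 1048576 /dev/urandom > bigfile.bin

# cut it into chunks of at most 300 kB each;
# split names them bigfile.part_aa, bigfile.part_ab, ...
split -b 300K bigfile.bin bigfile.part_

# reassemble: the glob expands in sorted order,
# so the pieces are concatenated back in sequence
cat bigfile.part_* > bigfile.restored

# verify the round trip is byte-identical
cmp bigfile.bin bigfile.restored && echo "identical"
```

This works on binary files as well as text, since split -b cuts on byte boundaries.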
If it's an ASCII file, you should be able to use console tools. Bash arithmetic uses 64-bit integers on most modern systems, so line numbers in a multi-GB file are not a problem. Something like:
for ((a=1; a<=lastlinenum; a+=howmany)); do sed -n "${a},$((a+howmany-1))p" wholefile.txt > file_$a.txt; done The $(( )) brackets are what make $a+$howmany evaluate arithmetically, and sed -n 'M,Np' is a command that outputs lines M through N. You can later just put them back together with cat file_* > wholefile.txt (use > rather than >>, or a rerun will append duplicates). Hope this works for you. Samsara |
Thanks for the suggestion, but my point was not to use the console. If I were to use the console, I think it would be easier to simply use the split command and then cat to join the parts.
But I don't want to use the console... I want an application with a GUI. Thanks anyway :) |