Maximum file size
Does anyone know the maximum file size that can
be parsed by awk, or by Linux in general? |
Linux file sizes can get pretty large; the exact limit depends on the filesystem, but it is on the order of terabytes.
For awk it depends on memory and on the awk implementation itself. I don't know for certain, but with Perl I have parsed very large files. |
Unless you mean "line size" -- usually thought of as the number of bytes per record in a text-formatted file -- the general rule is: if the filesystem can store the file, then utilities like awk and sed can read it.
What error are you getting? |
I once ran into a 2 GiB limit with awk on an HP-UX machine. I think it depends on which C library the binary was compiled against. You can probably tell by looking for the 64-bit file I/O functions in the binary:
Code:
strings "$(which awk)" | grep fopen |
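Another way to test for that old 2 GiB barrier, without digging through the binary: create a sparse file just past 2^31 bytes (it costs almost no disk space) and see whether your tools can handle it. Just a sketch, assuming GNU coreutils' truncate and a writable /tmp:
Code:
```shell
# A sparse file exactly 2^31 bytes long -- opening and sizing it
# requires large-file (64-bit off_t) support in the tool.
truncate -s 2147483648 /tmp/lfs.test
wc -c < /tmp/lfs.test    # prints 2147483648 if the tool copes
rm -f /tmp/lfs.test
```
|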
sed also counts line numbers, so there could be an issue once more than 2^32 lines are streamed in or read from a file. I don't know whether the following works to print line number 2^32 -- anyone have a file that big lying around?
Code:
sed -n '4294967296p' |
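You can at least sanity-check the addressing syntax at a smaller scale, with seq standing in for a multi-billion-line file (just an illustration; nobody needs 2^32 real lines to verify that the line address is honored):
Code:
```shell
# Print exactly one addressed line out of a million-line stream.
seq 1000000 | sed -n '999999p'    # prints 999999
```
|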
The most general answer is as many bytes as you can index with an off_t.
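You can ask the filesystem directly how wide that type effectively is. This uses the POSIX getconf utility; the path / is just an example mount point:
Code:
```shell
# Bits used to represent file sizes on the filesystem holding "/".
# 64 means files up to what a 64-bit off_t can index are supported.
getconf FILESIZEBITS /
```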
|
And fopen64 is a non-standard function, provided to support 64-bit file offsets.
|