Merge lines from a file
Hello, everyone!
I would like to merge the lines of a huge file into a single formatted line. Consider the following 1.9 GB file, called `input.txt', which contains one number per line:
Code:
2
3
5
7
11
(...)
I want the output file to look like this:
Code:
2, 3, 5, 7, 11, 13, 17, 19, 23, 29, (...) 3999999797, 3999999799, 3999999869, 3999999881, 3999999901, 3999999911, 3999999919, 3999999937,
I have tried the following commands, without success:
Code:
$ sed -e :a -e N -e 's/\n/, /' -e ta input.txt > output.txt
Code:
$ paste -s -d "," input.txt > output.txt
Code:
$ paste -s -d "," input.txt | sed "s/,/, /g" > output.txt
Code:
$ tr -t "," ", " < input.txt > output.txt
Does anyone have one or more solutions for this problem? Thanks in advance. []s |
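For what it's worth, the tr attempt cannot work as intended: tr translates characters one-for-one, so it can turn each newline into a comma, but it cannot replace one character with a two-character string like ",<space>". A small demonstration (using printf to stand in for the real file):

```shell
# tr maps characters one-to-one: each newline becomes a comma,
# but no space can be inserted after it
printf '2\n3\n5\n' | tr '\n' ','
# -> 2,3,5,
```

Adding the space after each comma still requires a tool that can emit multi-character replacements, such as awk or sed.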
Hi,
Don't know if this will work with 4 billion numbers, but give it a try:
Code:
awk 'BEGIN { ORS=", " } { print }' infile > outfile
or
Code:
awk '{ printf("%d, ", $1) }' infile > outfile
There will be an extra ,<space> at the end of the line in the output file. Hope this helps. |
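Run on a small sample, the first awk one-liner above behaves like this (a quick sanity check, with printf standing in for the real file):

```shell
# ORS=", " makes awk terminate every printed record with ", "
# instead of a newline, so the whole input becomes one line
printf '2\n3\n5\n' | awk 'BEGIN { ORS=", " } { print }'
# -> 2, 3, 5,  (note the trailing ", ")
```

Because awk processes the input one record at a time, memory use stays flat regardless of file size.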
Code:
perl -00 -ple's/\n/, /g' file |
This sounds like homework, doesn't it? The file is explicitly large so you can't use editor macros; you have to do the job with sed or awk...
but druuna has already helped you :) vadkutya |
shell solution
I think this one will fail because of the memory issue too:
Code:
paste -s -d, < input.txt | sed 's/,/, /g' > output.txt |
Thanks to all, and sorry for the delay in replying!
I've used druuna's solution:
Code:
awk 'BEGIN { ORS=", " } { print }' infile > outfile
Well, now I have another problem to solve. My file has one line, and it ends with ", ", which I need to replace with `.'. Again, sed does not work properly because of the memory issues. Any idea? |
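Since the file is known to end with the two-byte suffix ", ", one option that never loads the huge line into memory is to chop those two bytes off in place and append a period. A sketch, assuming GNU coreutils truncate is available (demonstrated on a small file standing in for the real 1.9 GB one):

```shell
# build a small stand-in for the real output file
printf '2, 3, 5, ' > output.txt

truncate -s -2 output.txt   # drop the trailing ", " (2 bytes), in place
printf '.' >> output.txt    # append the period

cat output.txt
# -> 2, 3, 5.
```

Alternatively, a single awk pass such as `awk 'NR > 1 { printf ", " } { printf "%s", $0 } END { print "." }' input.txt` emits the separators between records and the final period in one go, so no second fix-up step is needed.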