Combine 10,000 small files into a big one
I have around 10,000 files in a folder, each containing around 20-30 lines.
Is there some quick way (probably a bash command) to combine them into one big file?
With this many files, 'cat *' fails with 'argument list too long'.
A Perl program is combining them in the meantime, but it's a bit slow for me.
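(In case it helps anyone hitting the same error: the limit that triggers it is the kernel's ARG_MAX, and you can print it with getconf.

getconf ARG_MAX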
This sort of thing?
*gets more coffee*
Assuming they're all in a single directory and there is nothing else there.
find . -maxdepth 1 -type f -print0 | grep -zv '<outputfile>' | xargs -0 cat >> <outputfile>

The -type f keeps the directory entry itself out of the list, and -print0 / -z / -0 keep filenames with spaces or newlines from breaking the pipeline.
Of course you'd want to make sure that wherever your output file lives has enough free space to hold the combined output.
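If you'd rather skip the pipe entirely, find can run cat itself and batch the arguments the same way xargs does. A minimal sketch, assuming an output name of combined.txt (substitute your own file name):

# -type f skips directory entries; ! -name excludes the output file itself
find . -maxdepth 1 -type f ! -name 'combined.txt' -exec cat {} + > combined.txt

The '{} +' form hands cat as many file names as fit per invocation, so it never trips the argument-list limit.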