Combine 10,000 small files into a big one
I have around 10,000 files in a folder, each containing around 20-30 lines.
Is there a quick way (probably a bash command) to combine them into one big file? PS: This many files causes 'cat *' to fail with 'argument list too long'. A Perl program is combining them anyway, but it's a bit slow for me. |
This sort of thing?
Code:
for i in *; do cat "$i" >> /tmp/combined; done  # *gets more coffee* |
Assuming they're all in a single directory and there is nothing else there.
cd <directory>
find . -maxdepth 1 -type f | grep -v <outputfile> | xargs cat >> <outputfile>
The -type f matters: without it, find also emits . and any subdirectories, and cat will complain about them. Of course you'd want to make sure wherever your outputfile is has enough space free to hold all your files' output. |
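A variant of the find/xargs approach above that also survives filenames containing spaces or newlines (plain xargs splits on whitespace; -print0/-0 avoids that). A minimal sketch, using throwaway mktemp paths as stand-ins for the real directory and output file; writing the output outside the scanned directory removes the need for the grep -v step entirely:

```shell
# Demo setup: a scratch directory with a few small files
# (stand-in for the 10,000-file folder).
dir=$(mktemp -d)
printf 'one\n' > "$dir/a.txt"
printf 'two\n' > "$dir/b file.txt"   # a name with a space, to show why -print0 matters

# Combine: the output lives OUTSIDE the scanned directory, so it can
# never be picked up by find and appended to itself; -type f skips
# subdirectories and the directory entry itself.
out=$(mktemp)
find "$dir" -maxdepth 1 -type f -print0 | xargs -0 cat >> "$out"

wc -l "$out"
```

Because find feeds xargs in batches, this never hits the 'argument list too long' limit no matter how many files there are.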