Multiple grep outputs appended to single row of CSV file
How can I take the outputs of a series of grep commands and append them as a single row of a CSV file?
I work in a Linux environment. The values from the grep output will be numeric. The output should look like: 1,3,4,5,7,0,5. Each of these values will be obtained from a separate grep command piped through wc -l. Is it possible to append such a row to a CSV file? If so, please help me with the command to redirect the output into the CSV file. |
Something like this?:
((grep a * | wc -l; grep b * | wc -l) | tr '\n' ','; echo) >> bash.rocks.csv |
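One wrinkle with tr '\n' ',' is that it also converts the final newline, leaving a trailing comma in the row. Joining the counts with paste avoids that; a minimal sketch, using invented file names (data.log, counts.csv) for the demonstration:

```shell
# Sketch: count matches for two patterns and append them as one CSV row.
# data.log and counts.csv are made-up names for this demonstration.
printf 'apple\nbanana\navocado\n' > data.log

{ grep -c 'a' data.log; grep -c 'b' data.log; } |
  paste -sd ',' - >> counts.csv   # join the counts with commas, no trailing comma

cat counts.csv
# prints: 3,1
```

paste -s reads all input lines and joins them on one output line, so the row ends cleanly in a newline rather than a comma.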
Quote:
With 'grep -c' you can drop the invocation of 'wc'. Code:
((grep -c a *; grep -c b *) | tr '\n' ','; echo) >> bash.rocks.csv
Tink |
Hi Tinkster! In this particular example (when grepping multiple files) your modification gives different results.
In general, though, "grep -c" is indeed better and should be preferred over "grep | wc -l". |
My bad - still waking up ... it would work for individual files.
With the * it won't, as 'grep -c' prints a count per file rather than a summed total. 'wc -l' is indeed the way to go :} Cheers, Tink |
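The per-file behaviour Tink describes is easy to see side by side; a small sketch, with f1 and f2 as invented sample files:

```shell
# Why 'grep -c pattern *' differs from 'grep pattern * | wc -l'
# when more than one file matches (f1 and f2 are made-up inputs).
printf 'alpha\napple\n' > f1
printf 'avocado\n'      > f2

grep -c a f1 f2        # per-file counts: "f1:2" and "f2:1", no grand total
grep a f1 f2 | wc -l   # one summed total across both files: 3
```

So with a wildcard, piping through wc -l is what yields the single number the original question needs.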
Or we could use a single awk:
Code:
awk '/a/{sum[0]++} /b/{sum[1]++} END{for(i=0;i<=1;i++) printf "%d%s", sum[i], (i<1 ? "," : "\n")}' file |
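A quick check of the single-pass awk approach, run against an invented sample file (sample.txt is made up for the demonstration):

```shell
# One awk pass counts lines matching each pattern and emits a CSV row.
# sample.txt is an invented input: one line matches /a/, two match /b/.
printf 'abc\nbcd\ncde\n' > sample.txt

awk '/a/{s[0]++} /b/{s[1]++}
     END{for(i=0;i<=1;i++) printf "%d%s", s[i], (i<1 ? "," : "\n")}' sample.txt
# prints: 1,2
```

Using "%d" as an explicit printf format also avoids treating the counts themselves as format strings, and prints 0 for a pattern that never matched.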