how to loop over text file lines within bash script for loop?
Hi.
I have a data file containing 4 rows of data, say:
Code:
X1 Y1 Z1
X2 Y2 Z2
X3 Y3 Z3
X4 Y4 Z4
Associated with each individual row is a data file containing a subset of additional data for that row. These associated data files are single-column files, but not all have the same number of lines. For example, the data file associated with row 1 might be of the form:
Code:
R1a
R1b
R1c
R1d
Is there a way that I can pass the data file above (R1a, etc.) to a for loop within a bash script such that it loops over the lines of the data file and outputs the following:
Code:
X1 Y1 Z1 R1a
X1 Y1 Z1 R1b
X1 Y1 Z1 R1c
X1 Y1 Z1 R1d
- and so on for each record of the main data file. I've seen examples with the individual files of a directory being passed into for loops, as in:
Code:
for i in *
do
........
done
But what is the equivalent code when you want to loop over text file lines? Thanks. |
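One way to produce the requested output is a nested read loop. This is a minimal sketch, not from the thread: the filenames `main.txt` and `rowN.txt` are assumptions, and demonstration data is created inline so the script is self-contained.

```shell
#!/bin/bash
# Demonstration data (filenames are illustrative, not from the thread):
printf 'X1 Y1 Z1\nX2 Y2 Z2\n' > main.txt
printf 'R1a\nR1b\n' > row1.txt
printf 'R2a\nR2b\nR2c\n' > row2.txt

# For each row of the main file, print it once for every line of the
# associated single-column file (row1.txt for row 1, and so on).
n=1
while read -r x y z; do
    while read -r r; do
        echo "$x $y $z $r"
    done < "row$n.txt"
    n=$((n + 1))
done < main.txt
```

With the sample data above this prints `X1 Y1 Z1 R1a`, `X1 Y1 Z1 R1b`, then the three `X2 Y2 Z2` lines. Redirecting the file into `done` (rather than piping `cat` into the loop) keeps the loop in the current shell.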
There are many possible solutions. Here's mine, using gawk inside a while loop:
Code:
#!/bin/sh
Hope this will help. |
Much easier
Set the internal field separator:
Code:
IFS=$'\n'; for line in `ls`; do echo $line; done |
this is an old thread.
first, no need to change IFS, and using ls is "useless".
Code:
for line in *
lastly, the tool to use could be a simple paste/join. |
Why not just use paste ?
Code:
paste file1 file2 > file3 |
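For context, a sketch of what `paste` does (filenames and contents here are illustrative): it joins line N of the first file with line N of the second, so it fits cases where the files have matching line counts; repeating one row against every line of another file, as in the original question, still needs a loop.

```shell
# Illustrative files; paste joins corresponding lines of each file.
printf 'X1 Y1 Z1\nX2 Y2 Z2\n' > file1
printf 'A\nB\n' > file2
# -d' ' uses a space instead of the default tab between columns.
paste -d' ' file1 file2 > file3
```

With this data, `file3` contains `X1 Y1 Z1 A` and `X2 Y2 Z2 B`.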
Try Something like this
Code:
exec 3<&0
exec 0< "$filename"
while read eachline
do
    echo "$eachline"
done
exec 0<&3
Cheers |
Use cat
Better late than never :)
I was looking for the same answer and saw this post. Eventually, I came up with this:
Code:
for line in `cat fileName.txt`
do
---
done |
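One caveat worth noting about the backtick-`cat` form: the shell splits the command substitution on any whitespace, not just newlines, so a line containing spaces is broken into separate words. A small demonstration (the filename is illustrative):

```shell
# Word-splitting pitfall: this file has 2 lines, but the loop runs 3 times.
printf 'one two\nthree\n' > sample.txt
for line in $(cat sample.txt); do
    echo "got: $line"
done
```

The loop iterates over `one`, `two`, and `three` rather than over the two lines, which is why `while read` is generally preferred for line-by-line processing.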
My favorite:
Code:
while read line; do
    echo "$line"
done < fileName.txt |
Nice work |
Quote:
All the last few posts have done is create some noise vaguely similar to what's required. This is possibly worse than going totally off-topic. |