LinuxQuestions.org
Old 01-22-2013, 04:06 PM   #1
Hoxygen232
Member
 
Registered: Jan 2013
Posts: 37

Rep: Reputation: Disabled
bash - print in a file on the same first line in a loop


I need to print $WORD into the file $FILE_OUTPUT, all on the same line, inside a while loop.

if I use
Code:
while read line;  
do 
  ......
  echo -n "$WORD" >> $FILE_OUTPUT
done < $FILE_INPUT
it doesn't work well, because the output file looks like this:

Code:
WORD1
WORD2
WORD3
instead I would like to see this:

Code:
WORD1 WORD2 WORD3
words separated by spaces


Thanks

Last edited by Hoxygen232; 01-23-2013 at 08:55 AM.
 
Old 01-22-2013, 04:21 PM   #2
antegallya
Member
 
Registered: Jun 2008
Location: Belgium
Distribution: Debian
Posts: 109

Rep: Reputation: 42
Hello,
it should work just fine if $WORD has no trailing newline. How do you construct it?
To be sure of it, try
Code:
echo -n "$WORD" | tr -d "\n" >> $FILE_OUTPUT
 
Old 01-22-2013, 04:34 PM   #3
Hoxygen232
Member
 
Registered: Jan 2013
Posts: 37

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by antegallya View Post
Hello,
it should work just fine if $WORD has no trailing newline. How do you construct it?
To be sure of it, try
Code:
echo -n "$WORD" | tr -d "\n" >> $FILE_OUTPUT
it still doesn't work, those words come from a .txt file (simple words dictionary)
 
Old 01-22-2013, 04:59 PM   #4
antegallya
Member
 
Registered: Jun 2008
Location: Belgium
Distribution: Debian
Posts: 109

Rep: Reputation: 42
Your $WORD variable is not the $line variable, so you must be processing the content of $line somehow to obtain $WORD. I was asking what that process was.
Anyway, since removing the end of line with tr doesn't work, and given the ".txt" extension, I suspect your input file is actually in DOS format, where line endings are not just a line feed "\n" but a carriage return followed by a line feed "\r\n".
Either convert your input to the unix format with
Code:
dos2unix input_file.txt
or trim the carriage return from your input line with something like:
Code:
while read line;  
do
  line=$(echo "$line" | tr -d "\r")
  ......
  echo -n "$WORD" >> $FILE_OUTPUT
done < $FILE_INPUT
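A quick way to check which line endings a file actually has (a sketch using throwaway files in /tmp as stand-ins for your input):

```shell
# Sketch: fake a DOS-format file, inspect it, then strip the carriage returns.
printf 'WORD1\r\nWORD2\r\n' > /tmp/lq_dos.txt      # simulated DOS-format file
od -c /tmp/lq_dos.txt | head -n 1                  # DOS files show \r \n pairs
tr -d '\r' < /tmp/lq_dos.txt > /tmp/lq_unix.txt    # strip carriage returns
od -c /tmp/lq_unix.txt | head -n 1                 # now only \n remains
```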
 
Old 01-23-2013, 03:56 AM   #5
Hoxygen232
Member
 
Registered: Jan 2013
Posts: 37

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by antegallya View Post
Your $WORD variable is not the $line variable, so you must be processing the content of $line somehow to obtain $WORD. I was asking what that process was.
Anyway, since removing the end of line with tr doesn't work, and given the ".txt" extension, I suspect your input file is actually in DOS format, where line endings are not just a line feed "\n" but a carriage return followed by a line feed "\r\n".
Either convert your input to the unix format with
Code:
dos2unix input_file.txt
or trim the carriage return from your input line with something like:
Code:
while read line;  
do
  line=$(echo "$line" | tr -d "\r")
  ......
  echo -n "$WORD" >> $FILE_OUTPUT
done < $FILE_INPUT


thanks, but it doesn't work either. I tried both commands you gave me.

this is my whole loop:

Code:
dos2unix $FILE_INPUT

while read line; do
  if [ -z "$line" ]; then
    echo "......."
    echo
  else
    echo "... $line"
    WORD=$(awk "NR==$line{print;exit}" $dictfile)   # look up line number $line in the dictionary
    echo -e "\t$WORD"
    echo
    echo -n "$WORD" >> $FILE_OUTPUT
  fi
done < $FILE_INPUT
it still prints the words one per line instead of all on the same line.
 
Old 01-23-2013, 05:08 AM   #6
antegallya
Member
 
Registered: Jun 2008
Location: Belgium
Distribution: Debian
Posts: 109

Rep: Reputation: 42
Okay, the file that actually needs the conversion is the file $WORD comes from, so in that case you have to convert your $dictfile. Or again, if you don't want to modify your file, you could trim the CR at the definition of $WORD like
Code:
WORD=$(awk "NR==$line{print;exit}" $dictfile | tr -d "\r")
Don't forget to add a space between your words when you echo them to your output file:
Code:
echo -n "$WORD " >> $FILE_OUTPUT
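Putting the two fixes together, the relevant lines of the loop would look something like this (a sketch using the OP's variable names, with throwaway files and made-up line numbers standing in for the real input):

```shell
# Sketch: strip the CR from the dictionary lookup and append a trailing
# space, so every word lands on the same output line.
dictfile=/tmp/lq_dict.txt; FILE_OUTPUT=/tmp/lq_out.txt        # stand-in files
printf 'alpha\r\nbeta\r\n' > "$dictfile"; : > "$FILE_OUTPUT"  # fake DOS-format dictionary
for line in 1 2; do                                           # pretend input line numbers
  WORD=$(awk "NR==$line{print;exit}" "$dictfile" | tr -d "\r")
  echo -n "$WORD " >> "$FILE_OUTPUT"
done
cat "$FILE_OUTPUT"    # alpha beta (on one line)
```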
 
1 members found this post helpful.
Old 01-23-2013, 07:28 AM   #7
Hoxygen232
Member
 
Registered: Jan 2013
Posts: 37

Original Poster
Rep: Reputation: Disabled
great, it works perfectly thanks
 
Old 01-23-2013, 07:38 AM   #8
jpollard
Senior Member
 
Registered: Dec 2012
Location: Washington DC area
Distribution: Fedora, CentOS, Slackware
Posts: 4,912

Rep: Reputation: 1513
The simple way would be to move the redirection from the "echo -n $WORD" line to the "done" line, where you already redirect stdin. The "done" would then look like "done < $FILE_INPUT > $FILE_OUTPUT".
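In other words, something like this sketch (hypothetical in.txt/out.txt names): the output file is opened once for the whole loop instead of once per echo, and the plain ">" also truncates it on each run.

```shell
# Sketch: redirect the whole loop's stdout on the done line.
cd "$(mktemp -d)"                       # throwaway working directory
printf 'WORD1\nWORD2\nWORD3\n' > in.txt # sample input
while read line; do
  echo -n "$line "
done < in.txt > out.txt
cat out.txt    # WORD1 WORD2 WORD3
```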
 
Old 01-23-2013, 11:06 PM   #9
cbtshare
Member
 
Registered: Jul 2009
Posts: 645

Rep: Reputation: 42
something as simple as
Quote:
while read line;
do
echo -n "$line " >>out.txt
done < in.txt
works fine
 
Old 01-24-2013, 01:13 AM   #10
konsolebox
Senior Member
 
Registered: Oct 2005
Distribution: Gentoo, Slackware, LFS
Posts: 2,248
Blog Entries: 8

Rep: Reputation: 235
I think it's more efficient if you just use sed:
Code:
WORD=$(exec sed -n "${line}{ s@\\r@@; p; q; }" "$dictfile")
By the way, next time please give all the details; it wasn't apparent that you were reading another file.

Last edited by konsolebox; 01-24-2013 at 01:17 AM.
 
Old 01-24-2013, 06:15 AM   #11
jpollard
Senior Member
 
Registered: Dec 2012
Location: Washington DC area
Distribution: Fedora, CentOS, Slackware
Posts: 4,912

Rep: Reputation: 1513
Quote:
Originally Posted by cbtshare View Post
something as simple as


works fine
Just don't run it twice... (the ">>" keeps appending to out.txt).
 
Old 01-27-2013, 07:43 AM   #12
David the H.
Bash Guru
 
Registered: Jun 2004
Location: Osaka, Japan
Distribution: Arch + Xfce
Posts: 6,852

Rep: Reputation: 2037
Code:
mapfile -t lines <infile.txt
echo "${lines[*]}" >outfile.txt
The mapfile command (bash v4+) loads the contents of the input file into an array, one line per entry.

The "*" expansion of the array prints the entire contents of it as a single unit, with the first character in your IFS variable separating the individual entries. By default this is the space character.
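For example (a small sketch; the array contents are made up):

```shell
# Sketch: "${arr[*]}" joins the elements with the first character of IFS.
lines=(WORD1 WORD2 WORD3)
echo "${lines[*]}"                 # WORD1 WORD2 WORD3 (default IFS: space)
( IFS=','; echo "${lines[*]}" )    # WORD1,WORD2,WORD3 (subshell keeps the IFS change local)
```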

You can also use mapfile with the output of other commands, BTW, by using a process substitution, or a here string in combination with a command substitution.

Code:
mapfile -t lines < <( mycommand )

mapfile -t lines <<<$( mycommand )
 
Old 01-27-2013, 08:24 AM   #13
jpollard
Senior Member
 
Registered: Dec 2012
Location: Washington DC area
Distribution: Fedora, CentOS, Slackware
Posts: 4,912

Rep: Reputation: 1513
Just remember that there is a limit on the number of lines you can use in that echo... it is fairly large (10,000 I believe), but anything larger will fail.
 
Old 01-27-2013, 08:47 AM   #14
David the H.
Bash Guru
 
Registered: Jun 2004
Location: Osaka, Japan
Distribution: Arch + Xfce
Posts: 6,852

Rep: Reputation: 2037
@jpollard. Yes, but I can't imagine that he's going to be searching for that many words at once. And actually, modern bash seems to be kind of smart when it comes to large array use. I just ran a test where I loaded up an array with more than 50k filenames, and it had no trouble printing them all out, either with echo or a for loop.


Anyway, after looking at the above code carefully, since I see that you want to do some on-screen printing as well, I think something like this might be a bit more appropriate.

Code:
dos2unix "$file_input"

while read line; do  

    if [[ -z "$line" ]]; then

        echo "......."
        echo 

    else               

        words+=( "$( awk -v ln="$line" -v RS='\r?\n' 'NR==ln{print;exit}' "$dictfile" )" )
        echo "... $line"
        echo
        echo "${words[-1]}"

    fi

done < "$file_input"

echo "${words[*]}" >> "$file_output"
We're still doing the same thing, but now we're just adding each entry to the array inside the loop, and waiting until the end to echo them all out to file together.

The negative index number in "${words[-1]}" prints only the final element of the array, and is new in bash 4.2. For earlier versions "${words[@]: -1}" will do the same job (the space before -1 matters).
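As a quick illustration (made-up array contents; negative indices need bash 4.2 or later):

```shell
# Sketch: two ways to get the last element of an array.
words=(alpha beta gamma)
echo "${words[-1]}"       # gamma (bash 4.2+)
echo "${words[@]: -1}"    # gamma (older bash too; note the space before -1)
```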

In the awk command, I imported the $line number into an awk variable first instead of inserting it directly, and I set the RS value to one that can handle both DOS- and unix-style line endings (treating a multi-character RS as a regex is a gawk feature). Although it would be better in the long run if you could just run dos2unix on the $dictfile instead.

Notice also my demonstrations of cleaned up formatting, double brackets, and quoted variables (very important!).

Last edited by David the H.; 01-27-2013 at 08:49 AM.
 
Old 01-27-2013, 08:50 AM   #15
jpollard
Senior Member
 
Registered: Dec 2012
Location: Washington DC area
Distribution: Fedora, CentOS, Slackware
Posts: 4,912

Rep: Reputation: 1513
It isn't the size of the data array, it is the number of arguments allowed for an external command. echo, being a shell builtin, can bypass that, but in the general case you are limited. That is why the loops using echo -n work on any data, not just small data.
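The limit in question can be checked with getconf (standard on POSIX systems); builtins never exec() an external program, so it does not apply to them.

```shell
# Sketch: ARG_MAX is the kernel's byte limit on the arguments plus environment
# passed to an exec'd (external) command; builtins like echo are not subject to it.
getconf ARG_MAX
```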
 
  

