[SOLVED] awk multiple lines of output - print one line
Awk has regex and conditional logic (amongst other features) - seems perfect for the job. The OP didn't show the code, but something like this might work
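A hedged guess at what such an awk-only approach could look like (the sample input line and the field positions are assumptions, since the original code and input were not shown in the thread):

```shell
# Sketch of an awk-only filter (no grep needed): match lines containing
# "bytes" case-insensitively and print the 7th and 8th fields.
# The sample line below is fabricated; adjust field numbers to your output.
printf '[  4]  0.0-10.0 sec  919 KBytes  42.1 Kbits/sec\n' |
    awk 'tolower($0) ~ /bytes/ { print $7, $8 }'
```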
syg00, your solution is the more correct one here, but this could easily be accomplished with a much smaller statement:
Code:
grep -i bytes | awk '{print $7" "$8}'
It really depends on how robust you want to make this. If you are just trying to pull out two numbers for yourself to see, the grep/awk combo makes sense. If you want to cron this or program it into a script, then syg00's awk is the more correct method.
It's always right to use the best tool for the job. Why type out all of the extra junk if it's for one-time use? And why would you not want to use a simple grep/awk pipe for a programmatic task that will be repeated often?
Answer: the pipe is much simpler when you have a static set of input that you know exactly and you need a quick one-off while sitting at the CLI. However, a simple grep/awk pipe is not reliable if your input changes.
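To illustrate the fragility (with fabricated lines, not real iperf output): if an extra space sneaks into an earlier field, every later field number shifts, and a hard-coded field pick silently grabs the wrong columns:

```shell
# With the expected spacing, $7 and $8 are the bandwidth figure and unit.
printf '[  4]  0.0-10.0 sec  919 KBytes  42.1 Kbits/sec\n' |
    grep -i bytes | awk '{print $7" "$8}'    # prints: 42.1 Kbits/sec

# One extra space splits "0.0-10.0" into two fields, shifting everything.
printf '[  4]  0.0- 10.0 sec  919 KBytes  42.1 Kbits/sec\n' |
    grep -i bytes | awk '{print $7" "$8}'    # prints: KBytes 42.1
```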
I think we need to clarify a few more things here.
1) Is the output static, or are you filtering a continuing stream?
2) What exactly is "host"? It doesn't appear in the input text as shown, so where does it come from? Should it only print once, or is it a stand-in for something that changes in (or depending on) the input?
3) Similarly, do you want all output to appear on a single line, or are there separate lines for different entries, or what?
Making the assumption that "host" is a fixed value, and that you want all "KBytes" values to appear after it, I'd probably use something like this:
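The code itself did not survive in the thread; a plausible reconstruction, assuming "host" is a fixed value passed in with -v and $7 holds the KBytes figure (the sample input lines are fabricated), might be:

```shell
# Hypothetical reconstruction (the original one-liner was not shown):
# print the fixed host value once, then $7 from every /KBytes/ line,
# all on a single output line.
printf '[  4]  0.0-10.0 sec  1.0 MBytes  919 KBytes/sec\n[  5]  0.0-10.0 sec  1.0 MBytes  42.1 KBytes/sec\n' |
    awk -v host="10.10.10.5" 'BEGIN { printf "%s", host }
                              /KBytes/ { printf " %s", $7 }
                              END { print "" }'
```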
The content is static; it is the output from a bi-directional iperf run.
"Host" will be the IP of the system that just ran the iperf. I have an array of IPs that run through a loop and write the output to a file, so the file will look something like:
Code:
10.10.10.5 919 42.1
So ideally, once the script finishes, the file will look like the above but with more entries and different values on separate lines.
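A rough sketch of the loop described above (the host list, result filename, and iperf options here are all assumptions, not the OP's actual script):

```shell
# Hypothetical loop: run a bi-directional iperf against each host and
# append one "<ip> <value> <value>" line to a results file.
hosts="10.10.10.5 10.10.10.6"      # assumed host list
for ip in $hosts; do
    iperf -c "$ip" -d 2>/dev/null |
        awk -v host="$ip" 'BEGIN { printf "%s", host }
                           /KBytes/ { printf " %s", $7 }
                           END { print "" }' >> results.txt
done
```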
awk: BEGIN{ printf "%s","19.239.211.30"" } /KBytes/ { printf "%s"," "$7} END{ print "" }
awk: ^ unterminated string
^CWaiting for server threads to complete. Interrupt again to force quit.
Yet it works from the shell outside of the bash script.
This is the problem, I believe:
Code:
"$7
---- Add ----
Also, the extra quote seems wrong:
Code:
"19.239.211.30""
Last edited by konsolebox; 07-08-2013 at 08:12 PM.