Linux - General
This Linux forum is for general Linux questions and discussion.
If it is Linux Related and doesn't seem to fit in any other forum then this is the place.
I am able to get a stream of log information from a firewall device using:
Code:
nc -lu -p 514 | tee mylogfile
I can redirect that output to a file, as shown, without problem. The stream has no newlines, but each new "event" begins with "<13". What I want to do is to sed the stream to insert newlines so I can do some downstream processing. If I cat the redirected file into a sed command:
Code:
cat mylogfile | sed -e $'s/<13.> */\\n&/g'
it works fine. What is not working is:
Code:
nc -lu -p 514 | sed -e $'s/<13.> */\\n&/g'
The command above produces absolutely nothing even after letting it run for 7 hours, whereas the nc command in the first example produces a continuous stream of data.
I've found other postings on the Internet that suggest using unbuffer or stdbuf (e.g. unbuffer nc -lu -p 514), but neither of those produced any output either.
I'm sure this must be related to the fact that there are no newlines in the stream, but even if the output buffer gets full, nothing comes out of the sed.
$ man sed
(...)
-u, --unbuffered
load minimal amounts of data from the input files and flush the output buffers more often
That didn't work either. This is the strangest thing I've ever seen. If I just do 'nc -lu -p 514', data streams immediately. If I do 'nc -lu -p 514 | tee myfile', data streams immediately AND goes to myfile. I can then post-process that file with 'sed -e $'s/<13.> */\\n&/g'' to add newlines. However, if I pipe the nc into sed ... ABSOLUTELY NOTHING! I've tried the -u option, which has now been running for about 5 hours -- no output. I've also tried unbuffer and stdbuf; nothing seems to work.
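The symptom can be reproduced without nc at all. A minimal sketch (assuming GNU sed): sed only emits a line once it sees a newline or end-of-file, so on a newline-less stream it holds everything back. printf closes the pipe when it finishes, which is why the substitution finally appears here -- nc never closes the stream, so the same sed shows nothing forever.

```shell
# Simulate a newline-less stream whose events start with "<13".
# printf exits and closes the pipe, so sed sees EOF and flushes;
# with nc as the producer there is no EOF, hence no output.
out=$(printf 'a<13>b<13>c' | sed -e 's/<13./\n&/g')
printf '%s\n' "$out"
```

In other words, -u, unbuffer, and stdbuf are fixing the wrong problem: the data is stuck on the input side (waiting for a line terminator), not in an output buffer.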
Port 514 is syslog.
Configure your syslogd to listen to it. Maybe configure it to send it to a dedicated file. This file can be processed with sed, grep, awk, etc.
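To sketch the "dedicated file" idea with rsyslog (a hedged example: the module names, file paths, and the firewall IP 192.0.2.1 are assumptions, not from the thread -- adjust to your setup):

```
# e.g. /etc/rsyslog.d/firewall.conf (hypothetical drop-in file)
module(load="imudp")            # enable UDP syslog reception
input(type="imudp" port="514")  # listen on the port nc was using
# route messages from the firewall's IP (hypothetical) to their own file
if $fromhost-ip == '192.0.2.1' then /var/log/firewall.log
& stop
```

rsyslog then handles the framing for you, and the resulting file has one event per line, ready for sed, grep, awk, etc.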
The sed expression is incorrect (I think). More to the point, if there are no newlines in the stream, sed cannot cope: it wants to read a full line, which here means reading the entire stream.
So it is better to use something else.
For example, in awk you can use "<13.>" as the input record separator and newline as the output record separator.
Quote:
Port 514 is syslog.
Configure your syslogd to listen to it. Maybe configure it to send it to a dedicated file. This file can be processed with sed, grep, awk, etc.
Yes, this is the syslog port. At the moment, I'm just trying to see what's coming in and what I can filter out or keep. I'll need to do some research on exactly how to get syslogd to listen and send to another file. That will possibly be the next step. Or maybe I don't even need to bother with syslogd if I can do what I want without it.
Quote:
Originally Posted by allend
If there are no newlines, then sed will wait until the stream is closed, e.g.
Code:
for j in {1..2}; do for i in {1..10}; do echo -n "<13i= $i j= $j"; sleep 1; done; done | sed 's/i/I/g'
but awk works if you use the record separator variable.
Code:
for j in {1..2}; do for i in {1..10}; do echo -n "<13i= $i j= $j"; sleep 1; done; done | awk '1{print}' RS="<13"
Output shows immediately and I can either send it to a file or pipe it to a script for further processing. I'll experiment with this a bit, but I think this might do the trick!
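Putting the pieces together, the approach above can be sketched on a static stream (a sketch, assuming GNU awk or mawk -- a multi-character RS is treated as a regex there, which plain POSIX awk does not guarantee; the "evt ..." payloads are made up for illustration):

```shell
# Simulate the newline-less firewall stream with printf, split records
# on the "<13" marker, re-attach the marker, and print one event per
# line. fflush() pushes each record out immediately so the output stays
# live when the real producer is nc instead of printf.
events=$(printf 'junk<13>evt one<13>evt two' \
  | awk 'BEGIN { RS = "<13" } NR > 1 { print "<13" $0; fflush() }')
printf '%s\n' "$events"
```

Against the live source this would be 'nc -lu -p 514 | awk ...' with the same awk program; NR > 1 just skips whatever partial data arrives before the first "<13" marker.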