Help with Bash script - email new error messages from several logs
Hi,
This is completely new for me (Linux and Bash noob), but I've been given the task of writing a script that monitors several logs for errors or anything that failed, and for every log emails its newly arrived errors/failures separately to a central administration email address.
For example, the log of the nginx web server.
I've got something like this (the same two lines I ask about further down):
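Code:
tail -n0 -F test.log | grep --line-buffered error >messagebody.txt
mail -s "subject of mail" email@adres.com <messagebody.txt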
I save this as test.sh and run it in a terminal with ./test.sh.
This shell script needs to be scheduled and kept running all the time, even after a reboot.
How can I accomplish this? I tried letting the function write to a file first and then cat the file to pass to the mail command, but it just hangs at the prompt. I am creating the script in a PuTTY window connected to the Linux machine.
There are a number of ways to approach this, depending on the specifics, so what precisely have you been asked to do, and in what context?
For example, if you're monitoring web applications, having a simple error handler within the application that logs to an API of separate bug-logging software may be preferable to scraping logs. In particular, it means you don't need to waste effort re-implementing logic such as "first time this error has occurred" or "existing error occurring more frequently", because it's already been written and tested, and it can provide a bunch of other useful features (especially if you have multiple applications/servers).
If you must do it as a Bash/shell script, I'd still look for existing solutions - this sort of task will have a bunch of edge cases (i.e. potential bugs) that an established script will have discovered and solved already.
(If/when you do write Bash scripts, ShellCheck is a useful utility - it can't report all errors, but does highlight some common mistakes.)
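For example, you can run it directly against the test.sh from the first post:
Code:
$ shellcheck test.sh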
You know there are programs (like the so-called ELK stack) that do this, right?
Anyway, you’d use cron to schedule it, you’d use “mail” or msmtp (depending on whether you need an smtp server) to send email, and you’d need to persist the last line you’d read in the file the previous time (probably by redirecting “wc” output to a file).
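A rough sketch of that approach (untested; the log path, state file and address below are just placeholders, not anything from your setup):
Code:
#!/bin/bash
# check_log.sh - mail any new "error" lines added to a log since the last run
LOG=/var/log/nginx/error.log          # log to watch (placeholder)
STATE=/var/tmp/check_log.lastcount    # remembers how many lines were already seen
RECIPIENT=admin@example.com           # central mailbox (placeholder)

last=$(cat "$STATE" 2>/dev/null)      # line count from the previous run
last=${last:-0}
total=$(wc -l < "$LOG")               # line count right now

# if the log was rotated or truncated, start over from the beginning
[ "$total" -lt "$last" ] && last=0

# look only at the lines added since the previous run
new_errors=$(tail -n +"$((last + 1))" "$LOG" | grep -i error)

if [ -n "$new_errors" ]; then
    printf '%s\n' "$new_errors" | mail -s "New errors in $LOG" "$RECIPIENT"
fi

echo "$total" > "$STATE"
Run it from cron (cron itself comes back after a reboot), e.g. every five minutes:
Code:
*/5 * * * * /path/to/check_log.sh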
Also be aware that you have the ability to monitor files for changes (using inotify and programs built on top of it).
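For example, with inotifywait from the inotify-tools package (the log path is a placeholder, and the script it calls is the kind of checker sketched above):
Code:
# wait for the log to change, then run the checking script
while inotifywait -qq -e modify /var/log/nginx/error.log ; do
    /path/to/check_log.sh
done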
Actually, we want to monitor some server/application logs for errors/warnings continuously.
As soon as an error/warning occurs in a log, the new errors/warnings from that log since the last email should be sent to a mailbox.
Things should be kept simple: a Bash script (maybe one per log) that monitors the corresponding log file, attached to a cron job. I am aware of existing tools, but we aren't using them in the company.
As mentioned before, I am testing the script in a terminal, but what I don't understand is why it keeps hanging at the prompt after the following statement (see the comment):
Code:
tail -n0 -F test.log | grep --line-buffered error >messagebody.txt ## <- why does it hang here??
mail -s "subject of mail" email@adres.com <messagebody.txt
I can see it creates the messagebody.txt file with content, but the prompt hangs in the terminal like it's waiting for input. Why does it not continue to the mail command?
Simple is using existing, proven and well-tested tools.
Quote:
why does it hang here??
...
I see it creates the messagebody.txt file with contents but the prompt hangs in the terminal like its waiting for input. Why does it not continue to mail command?
Code:
$ grep --line-buffered "38" <(ping www.google.com) |
while read ; do echo found 38 ; done
Well, it works: if grep matches "38" in the output of ping, echo prints "found 38". For email notification, I guess something like this may work:
Code:
$ grep --line-buffered "error" <(tail -n0 -F syslog.log) |
while read ; do mail ...... ; done
In place of the dots, put the correct form of the mail command. The problem is how to be sure the program is still running; you need to trap that situation. Of course, it is also good to think through why this works while your original version hangs (tail -F never exits, so that first pipeline never finishes and the script never gets past it to the mail command).
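Filled in, it might look something like this (untested; the log path, subject and address are placeholders):
Code:
$ grep --line-buffered "error" <(tail -n0 -F /var/log/nginx/error.log) |
while IFS= read -r line ; do
    # the here-string keeps mail's stdin separate from the pipe feeding the loop
    mail -s "new error in nginx log" admin@example.com <<< "$line"
done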
Edit: here is a fancier solution. I was always curious about the coproc command in Bash; now I have some glimpse of how it works.
Code:
$ coproc grep --line-buffered "38" <(ping www.linuxquestions.org)
[2] 29186
$ while read -u ${COPROC[0]} foo ; do echo $foo ; done
64 bytes from 104.24.136.8 (104.24.136.8): icmp_seq=38 ttl=52 time=55.7 ms
64 bytes from 104.24.136.8 (104.24.136.8): icmp_seq=81 ttl=52 time=1238 ms
Compared to the above, the leftmost part of the pipe is now executed in a subshell, but there is two-way communication through file descriptors: ${COPROC[0]} for the output of grep, and ${COPROC[1]} for its input, which in this case is redirected from <(ping www.linuxquestions.org). Another cool bit of functionality that is surely useful, but at the moment I don't have a good application in mind for it.