Programming: This forum is for all programming questions. The question does not have to be directly related to Linux, and any language is fair game.
Hello,
I've got a program that creates its own log file as a normal text file. What I would like to do is write another bash script that sorts through the log file and picks out the important lines, depending on the text string contained on each line. Rather like how swatch works, but I'd like my own independent script. It's not something that will run constantly or in a loop, just something the user can run every now and then; it pulls in the current log file and re-creates it showing only lines containing specific strings I've listed in another file, for example.
Does anyone know of an existing script that I could customise to my own needs, or will I need to start from scratch, probably using grep etc.?
Sounds like a job for Bash and Awk. If you are interested in a number of log files (one or more), list them in a selection file and, in a loop, have Awk create or append to an extract file of your choosing. Awk is a super choice for selecting lines from your logs based on pattern matches and for formatting the output. When the loop over the files completes, head up and tail the extract with the appropriate email information and pipe the lot into xmail or what-have-you, after which you can either archive or delete the extract file, ready for the next run. If you want, you could have it run each day or once a week using at, and just have it arrive in time for reading with your coffee. The possibilities are fairly endless, and this could save a lot of trawling through logs if you know what you are interested in and the logs have unique tags that allow particular types of event to be pulled out easily. Happy hacking.
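The loop described above might look roughly like this. A minimal sketch only: the file names (logs.list, extract.txt, app.log), the sample data, and the ERROR|WARN pattern are all assumptions, not from the original post.

```shell
#!/bin/sh
# Hypothetical sample data so the sketch is self-contained:
cat > app.log <<'EOF'
INFO started up
ERROR disk full
EOF
echo app.log > logs.list            # selection file: one log file per line

: > extract.txt                     # start with an empty extract file
while IFS= read -r logfile; do
    [ -f "$logfile" ] || continue   # skip logs that don't exist
    # Awk selects lines matching the (placeholder) pattern and
    # prefixes each with the name of the log it came from.
    awk -v src="$logfile" '/ERROR|WARN/ { print src ": " $0 }' "$logfile" >> extract.txt
done < logs.list

# "Head up and tail" the extract with mail framing before sending, e.g.:
# { echo "Subject: log digest"; echo; cat extract.txt; } | sendmail admin@example.com
cat extract.txt
```

An `at` or cron entry pointing at this script would cover the scheduled-run idea mentioned above.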
Yep, that's all rather along the lines of what I was thinking. Sorry, I haven't really taken a look at Perl yet (not had the need to), and I know of logwatch but don't require it. I just didn't know of a decent way to script it. I suppose it's a grep issue I have: I'd like to read each line of a text file, and if any string of text on that line matches any of the strings in a second file (the one to compare against), treat that as a match and copy the line to another file, then check the next line against the same set of strings, again printing it to the output file on a match.
Something like that would be amazing, but I'm not sure whether grep can handle it; might it need sed as well?
Can you please provide an example of your input and expected output?
#filename: log.txt
words and stuff problem found
blah blah un-interesting
blah
blah1
blah2
other words problem found error 2 string appears on line so should be copied to other log file
yep same again another error ahh haa
hey ho what you know blah blah
other log blah
error code 4 more problems
That's an example of the log file. Obviously not the exact one, but you get the idea.
Now, a file of text strings to match against.
So if any string of text on a line in log.txt matches any of the text strings from matchtext.txt, I'd like to take the matched line and copy it (probably via a redirect) to another output file (another log file) called sivedog.txt.
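For exactly this shape of problem, grep alone may be enough via its -f option. A minimal sketch using the file names from the post (log.txt, matchtext.txt, sivedog.txt); the sample contents of matchtext.txt here are made up, since that file wasn't shown.

```shell
#!/bin/sh
# Made-up sample data so the sketch runs on its own:
cat > log.txt <<'EOF'
words and stuff problem found
blah blah un-interesting
error code 4 more problems
EOF
cat > matchtext.txt <<'EOF'
problem found
error code
EOF

# -F treats each pattern as a fixed string (no regex surprises);
# -f reads the patterns from matchtext.txt, one per line.
# Every line of log.txt containing any of those strings is written
# to sivedog.txt; no sed needed for this part.
grep -F -f matchtext.txt log.txt > sivedog.txt
cat sivedog.txt
```

Drop the -F flag if the entries in matchtext.txt are meant to be regular expressions rather than literal strings.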