Bash script to put log files into single file and email
I am looking for a bash script that will take my log files, put them into a single file, and email that file to my email account. This is what I have so far, but it doesn't work, and besides I know there is a more effective way to do the same thing.
Ok, I admit I feel like a complete and utter moron for that one. But now for the next hump:
Code:
Collecting Logs
grep: 8: No such file or directory
grep: 8: No such file or directory
grep: 8: No such file or directory
Usage: grep [OPTION]... PATTERN [FILE]...
Try `grep --help' for more information.
grep: 8: No such file or directory
grep: 8: No such file or directory
grep: 8: No such file or directory
grep: 8: No such file or directory
Sending Mail
If I run with -v (for verbose) I get:
Code:
#!/bin/bash
# script to send logs to email
LOGDATE="Oct 8"
APACHEACDATE="08/Oct"
MYFILENAME="$HOME/systats"
GREP="/usr/bin/grep"
echo 'Collecting Logs'
Collecting Logs
echo '/var/log/secure' > $MYFILENAME
cat /var/log/secure | $GREP $LOGDATE >> $MYFILENAME
grep: 8: No such file or directory
wait
echo '/var/log/messages' >> $MYFILENAME
cat /var/log/messages | $GREP $LOGDATE >> $MYFILENAME
grep: 8: No such file or directory
wait
echo '/var/log/maillog' >> $MYFILENAME
cat /var/log/maillog | $GREP $LOGDATE >> $MYFILENAME
grep: 8: No such file or directory
wait
echo '/var/log/apache/access_log' >> $MYFILENAME
cat /var/log/apache/access_log | $GREP $APACHEACDATA >> $MYFILENAME
Usage: grep [OPTION]... PATTERN [FILE]...
Try `grep --help' for more information.
wait
echo '/var/log/apache/error_log' >> $MYFILENAME
cat /var/log/apache/error_log | $GREP $LOGDATE >> $MYFILENAME
grep: 8: No such file or directory
wait
echo 'dmesg' >> $MYFILENAME
dmesg >> $MYFILENAME
wait
echo '/var/log/secure' >> $MYFILENAME
cat /var/log/secure | $GREP $LOGDATE >> $MYFILENAME
grep: 8: No such file or directory
wait
echo '/var/log/syslog' >> $MYFILENAME
cat /var/log/syslog | $GREP $LOGDATE >> $MYFILENAME
grep: 8: No such file or directory
wait
echo '/var/log/vsftpd.log' >> $MYFILENAME
cat /var/log/vsftpd.log | $GREP $LOGDATE >> $MYFILENAME
grep: 8: No such file or directory
wait
echo 'Sending Mail'
Sending Mail
mail -s "Sys Stats for Today" user < $MYFILENAME
#rm $MYFILENAME
If that helps in figuring out why it won't work... I know it has to do with the number 8 in the date... but why does it work if you simply put it in quotes outside this script?
If you read this post before, I DID NOT fix the problem. I only eliminated the error messages by putting quotes around the $LOGDATE references. It still won't output the data to the file. Any ideas?
Piping the output from cat to the grep command is unnecessary, because grep itself is able to process the content of a file (not only the standard input, as the one coming from the cat command).
The errors you obtain are related to the interpretation of the LOGDATE and APACHEACDATE variables. As you already discovered, quotes are useful to let the shell interpret a string literally, despite the presence of special characters or blank spaces, but you should put every variable inside quotes when you use it. In other words, it is not enough to put quotes only in the variable assignments at the beginning of the script. Every time the shell encounters an unquoted variable, it expands it without preserving the surrounding quotes, and then splits the result on whitespace. So, the line
Code:
cat /var/log/apache/access_log | $GREP $APACHEACDATA >> $MYFILENAME
also fails for a second reason: you defined APACHEACDATE at the beginning, but the script references APACHEACDATA. The unassigned APACHEACDATA is expanded as a null string and the resulting command is grep without any pattern to search.
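A minimal sketch of the word-splitting difference (the count_args helper is just for illustration, not part of the original script):

```shell
#!/bin/sh
LOGDATE="Oct 8"

# Helper that simply reports how many arguments it received.
count_args() { echo $#; }

# Unquoted: the shell splits the value on the space, so the callee
# sees two arguments ("Oct" and "8") -- exactly why grep complained
# about a file named "8".
count_args $LOGDATE       # prints 2

# Quoted: the value is passed through as a single argument.
count_args "$LOGDATE"     # prints 1
```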
Just a note about debugging: executing the script with "bash -x" will give the execution trace, with all the variable expansions performed by the shell. The output can be difficult to read, but it gives much more debugging information. Another hint is to look for differences in the usage of single quotes ' and double quotes ".
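For example, tracing just the problematic command with -x makes the split visible (using /dev/null as a stand-in file):

```shell
# The trace line "+ grep Oct 8 /dev/null" shows the shell handing
# "Oct" and "8" to grep as two separate words.
bash -x -c 'LOGDATE="Oct 8"; grep $LOGDATE /dev/null' 2>&1 | grep 'grep'
```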
As stated above, you should run the script with bash -x scriptname.sh and you will see a lot of errors (if there are any). I used to think that the -x option didn't help me much... but now I use it all the time for debugging...
-custangro
P.S. I like using ksh... just my preference...
The wait statements in the OP are not necessary. wait will pause until background jobs have completed, but you are not running the cat commands in the background.
i.e. this might make sense:
Code:
command1 &
command2 &
wait
command3
...it will allow command1 and command2 to operate in parallel, and the wait statement will halt the progress of the script until they are both done, and only then proceed with command3.
In the case where you are appending files somewhere, running in parallel is probably not what you want to do, unless you want all the data mixed up.
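A runnable sketch of that background-then-wait pattern, with sleeps standing in for command1 and command2:

```shell
#!/bin/sh
# Stand-ins for command1 and command2 run in the background, in parallel:
sleep 1 &
sleep 1 &

wait    # halt here until both background jobs have completed

# Only now does the stand-in for command3 run:
echo "both done"
```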
Something else which may be of interest to you: the more command, when called with multiple input files named as arguments and redirected to a file, will prepend the content of each file with
Code:
::::::::::::::
filename
::::::::::::::
So, as long as you don't mind the ::::::: bits, you can accomplish much the same thing as the intent of the OP, just like this:
Code:
more /var/log/secure /var/log/messages /var/log/maillog ... > "$MYFILENAME"
The shell also allows for some flexibility in formatting your code - you can put a backslash (\) as the last character on a line, and the shell will ignore the line break. This can greatly help the readability and maintainability of your scripts. For example, the command above is equivalent to this:
Code:
more /var/log/secure \
     /var/log/messages \
     /var/log/maillog ... > "$MYFILENAME"
There should be double quotes around the $LOGDATE part.
Yup, Colucix suggested that before. Plus a preferable invocation (saving the waste of typing and forking 'cat').
DragonM15,
to make the script more generic, you may want to generate the date you're looking for, e.g.,
Code:
LOGDATE=`date -d yesterday '+%b %e'`
This way you shouldn't need to change it daily.
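For instance (note that -d yesterday is GNU date syntax, and %e space-pads single-digit days, which matches syslog's date column):

```shell
# Yesterday's date in syslog format, e.g. "Oct  7" or "Oct 17"
# (month abbreviation depends on the locale).
LOGDATE=$(date -d yesterday '+%b %e')
echo "$LOGDATE"
```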
Also beware that when the log is rolled, you might miss a part. E.g., my /var/log/messages gets rolled every Sunday at 4:02; presumably the FC2 default. In that case, some relevant messages may be in /var/log/messages.1. You could add that as another argument to grep, e.g.,
Code:
grep "$LOGDATE" /var/log/messages.1 /var/log/messages
I'd put it as the first file, right after the pattern, to preserve chronological order.
If you're willing to change the structure, you could reduce code duplication by using a loop (and curly braces):
Code:
for log in /var/log/{secure,messages,maillog,apache/error_log,syslog}; do
    echo "$log" >> "$MYFILE"
    grep -h "$LOGDATE" "$log"{.1,} >> "$MYFILE"
done
The apache access log and dmesg would have to be done separately.
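A sketch of handling those two separately (the sample access_log line and the temp files are stand-ins so this can run anywhere; in the real script you would use /var/log/apache/access_log and the $MYFILENAME variable from the original post):

```shell
#!/bin/sh
APACHEACDATE="08/Oct"

# Stand-in files so the sketch is self-contained:
ACCESS_LOG=$(mktemp)
MYFILE=$(mktemp)
printf '127.0.0.1 - - [08/Oct/2007:10:00:00 +0000] "GET / HTTP/1.0" 200 1\n' > "$ACCESS_LOG"

# Apache's access log uses its own date format (08/Oct, not "Oct 8"),
# so it needs a separate grep with the APACHEACDATE pattern:
echo '/var/log/apache/access_log' >> "$MYFILE"
grep "$APACHEACDATE" "$ACCESS_LOG" >> "$MYFILE"

# dmesg output carries no dates at all, so append it unfiltered:
echo 'dmesg' >> "$MYFILE"
dmesg >> "$MYFILE" 2>/dev/null || true   # may be empty if unprivileged

rm -f "$ACCESS_LOG"
```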
BTW, if you'd like a ready-made solution, look at LogWatch. It's running on my machine, and it does a decent job of summarizing all noteworthy log entries in a daily e-mail.
This wouldn't do what I want it to do... yes, it would give me the log files, but it would give me ALL of the log files... I just want it a day at a time. Isn't that an organized way of doing things?
Thanks,
DragonM15
Well, thanks for the appreciation. Awfully sorry, I haven't got time to sit there and work out a perfect solution, but it was a pointer to an easy way to mail log files. Surely you could work out how to limit it to today's log file only?