I have a script that I wrote that searches an error log file for known errors, counts them, and then displays statistics at the end. However, it runs as slow as molasses; I use grep and two loops to go through everything.
Here is an example of the file:
04/02/08:20:16:57 - y:\logs: 04/02/08 20:16:57.300 - No valid sum
04/03/08:05:04:38 - y:\logs: 04/03/08 05:04:38.759 - ID does not match
04/03/08:05:15:16 - y:\logs: 04/03/08 05:15:16.695 - Wrong Batch
04/03/08:05:26:41 - y:\logs: 04/03/08 05:26:41.461 - Unknown Exception
04/03/08:05:30:41 - y:\logs: 04/03/08 05:30:41.289 - I Am A Bad Error
04/03/08:06:00:58 - y:\logs: 04/03/08 06:00:58.633 - Wrong Batch
04/03/08:06:00:58 - y:\logs: 04/03/08 06:00:58.633 - Wrong Error
04/03/08:06:00:58 - y:\logs: 04/03/08 06:00:58.633 - Unknown Exception
04/03/08:06:00:58 - y:\logs: 04/03/08 06:00:58.633 - I Am A Bad Error
Now what I have is a list of acceptable errors:
"No valid sum"
"ID does not match"
When the script is run, I'd like an output file that looks something like this:
1 No valid sum
1 ID does not match
2 Wrong Batch
2 Unknown Exception
2 I Am A Bad Error
1 Wrong Error
The bad errors can be anything that is not in the okerror array. I just think that someone here could do something better than what I have, as it takes almost a second per line. I was thinking something along the lines of "grep -f", but I just can't come up with anything very elegant.
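For comparison, here is a minimal sketch of the single-pass idea: instead of one grep per known error, let awk split each line on " - " and tally the last field (the error message) in one pass. The file name errors.log is a placeholder; the sample data from above is written out inline just so the sketch is runnable as-is.

```shell
#!/bin/sh
# Sample data from the question, written out so the sketch runs standalone.
# (Quoted 'EOF' keeps the backslashes in y:\logs literal.)
cat > errors.log <<'EOF'
04/02/08:20:16:57 - y:\logs: 04/02/08 20:16:57.300 - No valid sum
04/03/08:05:04:38 - y:\logs: 04/03/08 05:04:38.759 - ID does not match
04/03/08:05:15:16 - y:\logs: 04/03/08 05:15:16.695 - Wrong Batch
04/03/08:05:26:41 - y:\logs: 04/03/08 05:26:41.461 - Unknown Exception
04/03/08:05:30:41 - y:\logs: 04/03/08 05:30:41.289 - I Am A Bad Error
04/03/08:06:00:58 - y:\logs: 04/03/08 06:00:58.633 - Wrong Batch
04/03/08:06:00:58 - y:\logs: 04/03/08 06:00:58.633 - Wrong Error
04/03/08:06:00:58 - y:\logs: 04/03/08 06:00:58.633 - Unknown Exception
04/03/08:06:00:58 - y:\logs: 04/03/08 06:00:58.633 - I Am A Bad Error
EOF

# One awk pass over the whole log: split on " - ", count the last field
# (the error message), then print "count message" for each distinct one.
awk -F' - ' '{ count[$NF]++ } END { for (m in count) print count[m], m }' errors.log
```

This reads the log exactly once, so runtime is linear in the number of lines rather than lines-times-patterns. To split known from unknown errors, the okerror list could be loaded first (e.g. an NR==FNR block over a file of acceptable messages) and each tallied message checked with `m in ok` before printing; awk's `for (m in count)` loop makes no ordering guarantee, so pipe through sort if the order matters.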