fgrep (the same as grep -F) would be helpful, and you could use this pattern to extract lines matching multiple keywords from multiple files:
Code:
fgrep -e keyword1 -e keyword2 -e keyword3 ... -- file1 file2 file3
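For example, with a few hypothetical keywords and log files:
Code:
fgrep -e error -e warning -e failed -- /var/log/syslog /var/log/boot.log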
In bash you could make use of read -a to read the keywords into an array:
Code:
#!/bin/bash
read -a KEYWORDS ## reads a line from standard input and splits it (on whitespace) into the KEYWORDS array
Then turn the contents of KEYWORDS into -e options that fgrep can use:
Code:
PATTERNS=()
for A in "${KEYWORDS[@]}"; do
PATTERNS=("${PATTERNS[@]}" -e "$A") ## PATTERNS+=(-e "$A") also works in newer shells, but is less compatible with older ones.
done
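Just to illustrate, if the keywords read were abc and def, PATTERNS would end up holding:
Code:
PATTERNS=(-e abc -e def)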
From there you can now call fgrep.
Code:
fgrep -H "${PATTERNS[@]}" -- "$@" ## "$@" holds the files passed as arguments to the script; -H makes sure the filename is shown even with a single file
And the complete script would be:
Code:
#!/bin/bash
read -a KEYWORDS
PATTERNS=()
for A in "${KEYWORDS[@]}"; do
PATTERNS=("${PATTERNS[@]}" -e "$A")
done
fgrep -H "${PATTERNS[@]}" -- "$@"
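The keywords come in through standard input and the files as arguments, so you could run it like this (assuming the script was saved as script.sh and the files exist):
Code:
echo "abc def ghi" | bash script.sh /var/log/x /var/log/y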
If you want, you could also let your script accept both keywords and files as arguments, separated by a -- marker.
Code:
#!/bin/bash
PATTERNS=()
FILES=()
while [[ $# -gt 0 ]]; do
    case "$1" in
    --)
        ## Everything after -- is a file.
        shift
        while [[ $# -gt 0 ]]; do
            FILES=("${FILES[@]}" "$1")
            shift
        done
        ;;
    *)
        ## Everything before -- is a keyword.
        PATTERNS=("${PATTERNS[@]}" -e "$1")
        ;;
    esac
    shift
done
if [[ ${#FILES[@]} -eq 0 ]]; then
    echo "No file was entered." >&2
    exit 1
elif [[ ${#PATTERNS[@]} -eq 0 ]]; then
    echo "No pattern was entered." >&2
    exit 1
fi
fgrep -H "${PATTERNS[@]}" -- "${FILES[@]}"
And you would run the script like:
Code:
[bash] script[.sh] abc def ghi jkl -- /var/log/x /var/log/y
Lastly, as a more advanced touch, you could use exec to call fgrep so that the command replaces the shell's process:
Code:
exec fgrep -H "${PATTERNS[@]}" -- "${FILES[@]}"
This assumes you wouldn't call the script with . or source, which is unlikely anyway.
If you find the output with long lines hard to read, you could use less -S:
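Code:
bash script.sh abc def ghi jkl -- /var/log/x /var/log/y | less -S ## -S makes less chop long lines instead of wrapping them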