So I'm trying to get conky to output the last 3 different IPs that requested pages from my web server.
It all sounds rather simple doesn't it?
Well, if I use the tail command built into conky, it appears I have no control over redirecting the output (at least I can't find anything showing me how, and I've tried a few different ways).
In the end I came up with the idea of simply grepping the log file for the lines I wanted, sorting them to get only unique IP entries, and then tailing the result to get the required number of lines (in my case I want 3 lines from each log).
I've ended up with:
Code:
TEXT
${exec grep GET /var/log/apache2/access_log | awk '{print $1, $4"]",$7}' | sort -un | tail -n 3 }
$hr
${exec grep WARNING /var/log/fail2ban.log | tail -n 3 | awk '{print $2,$6,$7}'}
The fail2ban code works perfectly: an IP appears once when it's banned and once when it's unbanned, so I didn't feel I needed to filter the output any further than this. The Apache log, however, doesn't work nearly as well.
Depending on whether I use the -n option with sort I get slightly different results, but either way I end up with the same IP coming up three times (the last IP to access the server, which will have requested /, /favicon.ico and other random pages as well). I just want to know that each IP hit the site, and the first thing it requested (normally just /).
I've learnt a lot so far but now it's getting late and I can't really see where to turn with this any more.
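Part of the problem, I think, is that sort -un only looks at the leading number of each line, so two completely different IPs can count as duplicates. A quick test with two made-up IPs (just for illustration):

```shell
# Two made-up IPs that both start "10.0": sort -n compares each line as
# the number 10.0, so they look equal and -u throws one of them away.
printf '10.0.3.4 /\n10.0.200.9 /robots.txt\n' | sort -un
# only one of the two lines comes out
```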
Code:
71.**.***.** - - [25/Apr/2010:21:44:57 +0100] "GET /images/tim.jpg HTTP/1.1" 200 20957
88.**.***.*** - - [25/Apr/2010:21:50:19 +0100] "GET / HTTP/1.1" 200 3386
88.**.***.*** - - [25/Apr/2010:21:50:19 +0100] "GET /default.css HTTP/1.1" 200 269
88.**.***.*** - - [25/Apr/2010:21:50:20 +0100] "GET /favicon.ico HTTP/1.1" 404 271
67.**.**.*** - - [25/Apr/2010:21:52:23 +0100] "GET /robots.txt HTTP/1.1" 404 271
If this is what appears in the access_log, then I should just get 3 lines:
Code:
71.**.***.** - [25/Apr/2010:21:44:57] - /images/tim.jpg
88.**.***.*** - [25/Apr/2010:21:50:19] - /
67.**.**.*** - [25/Apr/2010:21:52:23] - /robots.txt
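The nearest I can think of is something like this (an untested sketch: it leans on GNU tac, assumes the stock log format where $1 is the IP, $4 the timestamp and $7 the path, and keeps the same print format as my awk above; I've inlined the sample lines so it can be tried standalone, point $log at the real access_log to use it):

```shell
# Sketch: last 3 distinct IPs, each shown with its *first* request.
log=access_log.sample
cat > "$log" <<'EOF'
71.**.***.** - - [25/Apr/2010:21:44:57 +0100] "GET /images/tim.jpg HTTP/1.1" 200 20957
88.**.***.*** - - [25/Apr/2010:21:50:19 +0100] "GET / HTTP/1.1" 200 3386
88.**.***.*** - - [25/Apr/2010:21:50:19 +0100] "GET /default.css HTTP/1.1" 200 269
88.**.***.*** - - [25/Apr/2010:21:50:20 +0100] "GET /favicon.ico HTTP/1.1" 404 271
67.**.**.*** - - [25/Apr/2010:21:52:23 +0100] "GET /robots.txt HTTP/1.1" 404 271
EOF

# Scan newest-first, keep the first time each IP turns up (its most
# recent hit), stop after 3 distinct IPs, then restore the order.
tac "$log" | awk '!seen[$1]++ { print $1; if (++n == 3) exit }' | tac |
while read -r ip; do
  # first line matching that IP = the first page it requested
  awk -v ip="$ip" '$1 == ip { print $1, $4"]", $7; exit }' "$log"
done
# gives:
# 71.**.***.** [25/Apr/2010:21:44:57] /images/tim.jpg
# 88.**.***.*** [25/Apr/2010:21:50:19] /
# 67.**.**.*** [25/Apr/2010:21:52:23] /robots.txt
```

That's too much to cram into a single ${exec }, so it would presumably have to live in a small script that conky calls. Can anyone confirm this is sane, or show a better way?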
HELP!