Linux - General
This Linux forum is for general Linux questions and discussion. If it is Linux related and doesn't seem to fit in any other forum, then this is the place.
I want to use Analog to analyze the volume of traffic on a remote Apache server. The problem is that the log file (access_log.07-25-03 in this case) is too big & takes too long to download - & it's the smallest log file in there at 16.9MB; the other log files (access_log & access_log.1 through 4) are up to 1.99GB, so they must be cumulative monthly files.
On the Win2K servers, I run Analog on files that are created every half hour, like ex03072422.log - the retrievals are fast, & I can spot & block abusers fairly quickly. I need the same efficiency on the remote Linux box. I was thinking I could set up an hourly cron job to use logrotate to write hourly files; currently it looks like the shortest interval logrotate supports on its own is daily.
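Something along these lines is what I had in mind - an hourly cron entry that forces a rotation from its own small config file. The paths below are guesses on my part, since I don't know how our developer laid that box out:

    # /etc/crontab - force a rotation at the top of every hour
    0 * * * *  root  /usr/sbin/logrotate -f /etc/logrotate-httpd-hourly.conf

    # /etc/logrotate-httpd-hourly.conf - keep the last 48 rotated files
    /usr/local/apache/logs/access_log {
        rotate 48
        missingok
        postrotate
            /usr/local/apache/bin/apachectl graceful
        endscript
    }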
The other question is... once I am able to analyze abusers on the Linux box, how do I block them? On the Win2K servers we block specific user IPs, etc. with BlackICE. Not sure if the Linux box would require additional software or if there's something built in.
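(There is something built in, assuming a reasonably recent 2.4-series kernel: the iptables packet filter ships with it, and once an abusive address has been identified it can be dropped with a one-liner. The address below is only a placeholder:

    /sbin/iptables -I INPUT -s 10.11.12.13 -j DROP   # drop all traffic from this address
)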
In any case, our developer who set up that box is too busy, & as you can tell I'm pretty clueless as to how to implement what should be some fairly simple changes/additions.
Any help would be much appreciated. Thanx in advance - John
I'm not sure what would be the right answer for you, but I'll tell you what I do, and maybe it'll spark an idea or two.
My Apache logs grow to about 130MB daily. That's right, daily! I rotate them each night and run webalizer against them. This works pretty well for giving me fairly nice reports based on many different criteria.
One problem I have is a lot of GET and POST parameters in my URLs, and we also use a sessionID for tracking a user while logged in to the site. Those lines appear to webalizer as unique URLs, so I have to parse the files before running them through webalizer; otherwise it builds a really huge "current" file and consumes all memory as it runs daily.
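(A rough sketch of that kind of pre-parsing - not necessarily the exact script in use here - is to chop the query string off the request field before webalizer sees the file. The assumption is that the first "?" on each log line belongs to the request field:

    # drop "?sessionID=..." and any other query string from the request field
    sed 's/?[^ "]*//' access_log > access_log.stripped
)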
I'm not sure what you could use to dynamically nab IPs and block abusers, but as far as generating usage reports goes, give webalizer a shot. It's easy to configure and run through cron.
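Apache can also write half-hour chunks itself with the rotatelogs program that ships with it, via a piped-log directive in httpd.conf - presumably a line along these lines (both paths here are only examples):

    TransferLog "|/usr/local/apache/bin/rotatelogs /usr/local/apache/logs/access_log 1800"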
You need the quotation marks.
That's the pipe symbol at the start, right after the opening quotation mark.
The path to the rotatelogs command may differ on your system.
And obviously change the path to wherever you want to store the log.
1800 is the number of seconds between rotations - 30 minutes.
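Each time the interval rolls over, rotatelogs starts a new file with the start time (seconds since the epoch) appended to the name, so grabbing a single half-hour chunk for Analog stays quick - for example (hostname and timestamp below are placeholders):

    scp you@webserver:/usr/local/apache/logs/access_log.1059091200 .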