LinuxQuestions.org
Old 01-12-2015, 07:36 AM   #1
cristi32
LQ Newbie
 
Registered: Feb 2007
Posts: 8

Rep: Reputation: 0
How to make a script for a log file


Hello everybody,

I have a goal and I don't know how to write the script. I have a very, very big /var/log/auth.log containing all the login attempts on my server.

I want to build a file, starting from the log file, with only one line for every IP that attacks my server. In the log file there are hundreds or thousands of lines of attempts from the same IP, but the lines differ because the attacker tries to log in to my server with different users or passwords.

I want to end up with a file containing only one complete line from each IP that tries to log in to my server.

Please help.
 
Old 01-12-2015, 07:53 AM   #2
veerain
Senior Member
 
Registered: Mar 2005
Location: Earth bound to Helios
Distribution: Custom
Posts: 2,524

Rep: Reputation: 319
Can you identify a common pattern shared by multiple lines, so that the log can be reduced? Please show us a sample of your log.
 
Old 01-12-2015, 08:29 AM   #3
cristi32
LQ Newbie
 
Registered: Feb 2007
Posts: 8

Original Poster
Rep: Reputation: 0
I can't post my log; it's 380 MB. Of course there are common patterns across the lines; the pattern is the IP address. I don't understand what you don't understand; I hope I've been clear in my explanation.

For example, I want to produce a file that keeps only one line for IP 144.0.0.34, and it doesn't matter which line:

...............
Dec 31 15:07:33 ftp1 sshd[16148]: Failed password for root from 144.0.0.34 port 36383 ssh2
Dec 31 15:07:33 ftp1 sshd[16148]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=144.0.0.34 user=root
Dec 31 15:07:33 ftp1 sshd[16148]: PAM service(sshd) ignoring max retries; 6 > 3
Dec 31 15:07:34 ftp1 sshd[16150]: Failed password for root from 144.0.0.34 port 38532 ssh2
Dec 31 15:07:35 ftp1 sshd[16154]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=144.0.0.34 user=root
Dec 31 15:07:36 ftp1 sshd[16152]: Failed password for root from 144.0.0.34 port 39251 ssh2
Dec 31 15:07:37 ftp1 sshd[16150]: Failed password for root from 144.0.0.34 port 38532 ssh2
Dec 31 15:07:38 ftp1 sshd[16154]: Failed password for root from 144.0.0.34 port 41230 ssh2
Dec 31 15:07:38 ftp1 sshd[16158]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=144.0.0.34 user=root
Dec 31 15:07:38 ftp1 sshd[16152]: Failed password for root from 144.0.0.34 port 39251 ssh2
Dec 31 15:07:39 ftp1 sshd[16150]: Failed password for root from 144.0.0.34 port 38532 ssh2
Dec 31 15:07:40 ftp1 sshd[16154]: Failed password for root from 144.0.0.34 port 41230 ssh2
Dec 31 15:07:40 ftp1 sshd[16158]: Failed password for root from 144.0.0.34 port 44098 ssh2
Dec 31 15:07:41 ftp1 sshd[16150]: Failed password for root from 144.0.0.34 port 38532 ssh2
Dec 31 15:07:41 ftp1 sshd[16150]: PAM 5 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=144.0.0.34 user=root
Dec 31 15:07:41 ftp1 sshd[16150]: PAM service(sshd) ignoring max retries; 6 > 3
Dec 31 15:07:42 ftp1 sshd[16154]: Failed password for root from 144.0.0.34 port 41230 ssh2
Dec 31 15:07:42 ftp1 sshd[16158]: Failed password for root from 144.0.0.34 port 44098 ssh2
Dec 31 15:07:44 ftp1 sshd[16152]: Failed password for root from 144.0.0.34 port 39251 ssh2
....................
 
Old 01-12-2015, 08:50 AM   #4
TenTenths
Senior Member
 
Registered: Aug 2011
Location: Dublin
Distribution: Centos 5 / 6 / 7
Posts: 3,475

Rep: Reputation: 1553
Ok, so what have you tried so far?
 
Old 01-12-2015, 09:00 AM   #5
cristi32
LQ Newbie
 
Registered: Feb 2007
Posts: 8

Original Poster
Rep: Reputation: 0
Nothing. I don't know how to drop all the lines except one, no matter which.
 
Old 01-12-2015, 09:03 AM   #6
TenTenths
Senior Member
 
Registered: Aug 2011
Location: Dublin
Distribution: Centos 5 / 6 / 7
Posts: 3,475

Rep: Reputation: 1553
You can do what you are trying to achieve with a combination of:

Code:
grep
awk
sort
uniq
Do some reading on those.

Hints:
grep - to find just the lines with failed logins.
awk - to extract just the IP address from each line.
sort - to sort the IP addresses into order.
uniq - to keep only the unique values in a sorted list.
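As a minimal sketch of how those four commands chain together, assuming the auth.log format in the OP's sample (where the IP is the field after the word "from"; the function name `unique_ips` is made up for illustration):

```shell
# unique_ips FILE — print one line per unique attacker IP.
# Assumes the sshd "Failed password ... from <IP> port ..." format
# shown in the sample; adjust the grep pattern for other log formats.
unique_ips() {
  grep 'Failed password' "$1" \
    | awk '{ for (i = 1; i <= NF; i++) if ($i == "from") print $(i + 1) }' \
    | sort \
    | uniq
}
```

Run as `unique_ips /var/log/auth.log > attackers.txt`; this is a single streaming pass, so even a 380 MB file should be quick.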
 
1 member found this post helpful.
Old 01-12-2015, 09:24 AM   #7
cristi32
LQ Newbie
 
Registered: Feb 2007
Posts: 8

Original Poster
Rep: Reputation: 0
I don't know how.
I want a script that parses the file from the beginning; when it sees an IP, it keeps the first line with that IP and deletes all other lines with the same IP, then does the same for the next IP, and so on.

Please help.
 
Old 01-12-2015, 09:30 AM   #8
TB0ne
LQ Guru
 
Registered: Jul 2003
Location: Birmingham, Alabama
Distribution: SuSE, RedHat, Slack,CentOS
Posts: 26,636

Rep: Reputation: 7965
Quote:
Originally Posted by cristi32 View Post
Nothing. I don't know how to drop all the lines except one, no matter which.
Read the man pages for the sort, uniq, and grep commands. If you're looking for only ONE IP address, use grep with the -e flag to specify that address as a pattern and return ONLY the lines that contain it.

If you want to pull out one line per unique IP address, sort the file first by the IP field. From there, run it through uniq, which will get you one of each line. If you're new to scripting, there are many good bash tutorials that can help you get started, especially if you use the man pages for the above commands too. You don't say how often this needs to run, but personally, I'd use logrotate to prune that file every so often to keep it at a manageable size, which will make the script easier to run. I'd also probably 'chain' the commands together to produce a temporary file that your script can operate on. Something like:
Code:
grep -e "11.22.33.44" /some/log/file > /some/output/file
...or...
Code:
grep -e "Failed password for" /some/log/file | sort --key 10 | uniq -f 10 -w 15 > /some/output/file
The man pages for the commands will tell you what those options do. That will give you an output file containing ONLY one line per unique IP address, which is MUCH smaller than your initial log file.
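If the sort/uniq field counting gets fiddly, the same "one full line per IP" result can be sketched as a single awk pass over the log using an associative array. This is an alternative approach, not the pipeline above; the function name and the "field after 'from'" parsing are assumptions based on the OP's sample lines:

```shell
# first_line_per_ip FILE — keep only the first "Failed password" line
# seen for each attacking IP, preserving the original line order.
first_line_per_ip() {
  awk '/Failed password/ {
    for (i = 1; i <= NF; i++)      # locate the IP: the field after "from"
      if ($i == "from") ip = $(i + 1)
    if (!seen[ip]++) print         # print only the first line per IP
  }' "$1"
}
```

Because awk only stores one array entry per distinct IP, memory use grows with the number of unique attackers, not with the number of lines in the 380 MB log.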
Quote:
Originally Posted by veerain
Can you make a common pattern in multiple lines? So that the log can be reduced. Please show your log?
Again, are you not reading the questions?? The OP clearly asked HOW TO DO THIS.
 
Old 01-13-2015, 06:59 AM   #9
rtmistler
Moderator
 
Registered: Mar 2011
Location: USA
Distribution: MINT Debian, Angstrom, SUSE, Ubuntu, Debian
Posts: 9,882
Blog Entries: 13

Rep: Reputation: 4930
Quote:
Originally Posted by cristi32 View Post
I don't know how.
I want a script that parses the file from the beginning; when it sees an IP, it keeps the first line with that IP and deletes all other lines with the same IP, then does the same for the next IP, and so on.

Please help.
@cristi32, you realize that you're asking someone to just jump up and do the work for you.

That's not the idea of LQ or really any forum like this. People are here of their own accord, not paid, and not obligated to take on work, especially not when you repeatedly write "please help". (And if that phrase is part of your signature, remove it.)

People will, however, be very detailed and very helpful if you start on your own and demonstrate that you're willing to participate in your own answer, rather than asking someone to do all the work for you.

My read on this is that either (1) you don't know how to start at all, or (2) you're unwilling to start and would rather just have someone do it all for you.

As far as option #2 goes, TenTenths' point is highly applicable here:
Quote:
Originally Posted by TenTenths View Post
Ok, so what have you tried so far?
Assuming you're willing to give option #1 a go, my recommendation is ultimately to write a bash script.

First, follow the advice of TenTenths and TB0ne: determine which commands you'd use on your log file to selectively find the information you want, try those commands out, and try some variations on them.

Once you've determined how to do this on the command line, you can write those commands into a script.

There are many resources available on how to write scripts. I have a blog entry that covers some of this as well, and an important point I make in that entry is that everything you can do on a regular command line, you can write in a bash script. Further, if you check the links in my signature, and probably many others', you'll see reference links for bash scripting.

As far as needing help with that goes, if/when you post things to the effect of "I've tried <command example> on <example data>, and I got <some-result>, but I really wanted <desired-result>", people will be much more willing to assist you.

Another part of all this is that the inquirer may not always know exactly what they want, or once they get some of what they want, their need for further refinement grows. For example: you say "I need to find all occurrences of the letter 'e'." Someone responds, and it's done. Next you say "I need to find all occurrences of the letter 'e', but only when followed by 'th', only on even-numbered lines, and only in files created on the third Tuesday of a month in a leap year." An exaggeration, yes, but that is sometimes what ends up happening.

Therefore it's best that you participate in your own solution, so that when it invariably comes time to modify it, you understand how it works and can intelligently steer it toward the changes that best suit you.
 
1 member found this post helpful.
  

