Linux - Security
This forum is for all security related questions. Questions, tips, system compromises, firewalls, etc. are all included here.
I have 101 files in /var/log.
That number may include logrotated ones, not all used logs, right?
I can't keep track of them all. What would you recommend?
Something like Logwatch. Configure and cronjob it.
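For example, a minimal cron entry might look like this (a sketch: the logwatch path and the mail address are placeholders, and paths vary by distro):
Code:
# Daily Logwatch summary mailed to you; add via "crontab -e" as root.
# /usr/sbin/logwatch and you@example.com are placeholders for your setup.
0 6 * * * /usr/sbin/logwatch --output mail --mailto you@example.com --detail med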
I am looking for failed logins. What's the best way of searching these logs?
For SSH implement one of http://www.linuxquestions.org/questi...d.php?t=340366 and you won't have to worry about that. PAM-enabled services log to syslog (also see there's pam_tally), Sudo logs to syslog as well and then there's last/lastb for looking at wtmp (login records).
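For instance (the auth log path is a Debian-style assumption; Red Hat-style systems log to /var/log/secure instead):
Code:
# Failed logins for existing accounts, as logged by sshd via syslog
grep 'Failed password' /var/log/auth.log
# Bad login records from btmp; needs root
lastb | head -50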
Nicely formatted answer!
What exactly do you mean by saying that? My formatting-fu rules major?
And what valuable insights does this add to answer the OP's questions?
BTW:
Quote:
That number may include logrotated ones, not all used logs, right?
You would find out by doing something like:
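Code:
# Count only the live logs, not rotated copies
# (assumption: rotated files end in a digit or .gz)
ls /var/log | grep -vc -e '\.[0-9]' -e '\.gz$'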
I found that this IP, 62.193.230.48, tried to log in a lot of times, so I can not block him.
That does not compute.
Even if you're on the same subnet you could block it: depends on if what you're running needs to be publicly accessable. If it does not you could use a more restrictive "deny from all, allow from some_nets" ruleset. In any case please spend some time hardening your box and see my previous note about SSH.
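A minimal iptables sketch of that ruleset; 192.168.1.0/24 stands in for your trusted nets, so adjust before use:
Code:
# Allow SSH from a trusted net only, drop everything else
iptables -A INPUT -p tcp --dport 22 -s 192.168.1.0/24 -j ACCEPT
iptables -A INPUT -p tcp --dport 22 -j DROP
# Or simply drop the single offending address
iptables -A INPUT -s 62.193.230.48 -j DROP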
As for logs, should I clear them after I email them to myself, or do you not recommend this?
Unnecessary, unless you didn't install logrotate, configured logrotate not to delete logs, or keep a nonsensical number of rotations.
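A typical per-log logrotate snippet looks like this (the "myapp" name and path are just placeholders):
Code:
# /etc/logrotate.d/myapp
/var/log/myapp.log {
    weekly
    rotate 4        # keep four old copies, then delete
    compress
    missingok
    notifempty
}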
A few follow-up questions that have arisen from viewing these logs:
Is it common to have around 30 illegal users a day try to log in to your servers?
Also, what's the difference between "Failed logins" and "Illegal users"?
Quote:
Failed logins from these:
asdf/keyboard-interactive/pam from ::ffff:83.70.232.2: 1 Time(s)
helloworld/keyboard-interactive/pam from ::ffff:83.70.232.2: 1 Time(s)
Illegal users from these:
adm/none from ::ffff:83.227.4.103: 1 Time(s)
admin/none from ::ffff:61.90.197.55: 6 Time(s)
admin/none from ::ffff:83.227.4.103: 2 Time(s)
.... lots more
Also, commands are being run that I know nothing about:
Quote:
Commands Run:
User amavis:
... Command ...
But I have deleted amavis, so how do I stop this from running?
Is it common to have around 30 illegal users a day try to log in to your servers?
Some nets are more exposed than others, so yeah, not uncommon.
Also, what's the difference between "Failed logins" and "Illegal users"?
"Failed" is like wrong pass, "illegal" is like unknown or disabled accountname.
Also, commands are being run that I know nothing about.
Well, then familiarise yourself with what accounts are available on the machine.
Typing "getent passwd amavis" should show it's an inert system account for running Amavis AV.
But I have deleted amavis, so how do I stop this from running?
If you deleted the account w/o uninstalling the software then you made an error. And what exactly is running?
Might be a little off topic, but I created this backup script a few years ago that might come in handy to back up all those log files to a CD-ROM. It may just need some tweaking to get it right; I haven't used it for quite a while.
Code:
#!/bin/sh
###########################################################################
Backup_Dirs="/var/log"
Backup_Dest_Path="/tmp/backup"
Backup_Date=`date +%a%b-%d-%Y-%R`
Backup_Name="Hostname-domainname"
Speed="8" # Use best speed for CD-R/RW disks on YOUR system
MAILTO="someone@somewhere.com"
###########################################################################
# Check to see if the backup directory exists; if not, create it
function makeBackupDir {
if [ ! -d $Backup_Dest_Path ]; then
mkdir $Backup_Dest_Path
chmod 700 $Backup_Dest_Path
fi
}
makeBackupDir
###########################################################################
# Create tar file with today's Month Day Year prepended for easy identification, and also create a log file
function tarBackupDir {
echo "Start creating backups including log files"
tar -cvzpf $Backup_Dest_Path/$Backup_Name-$Backup_Date.tar.gz $Backup_Dirs > $Backup_Dest_Path/$Backup_Name-$Backup_Date.log
echo "Finished creating backups including log files"
}
tarBackupDir
###########################################################################
# Create a image that can be written to writeable media
function makeImage {
echo "Making image file"
#mkisofs -R -o /tmp/$Backup_Name-$Backup_Date.img $Backup_Dest_Path
mkisofs -l -r -J -V $Backup_Name-$Backup_Date -o /tmp/$Backup_Name-$Backup_Date.iso $Backup_Dest_Path
echo "Finished making iso file"
}
makeImage
###########################################################################
# Check size of backup image before we burn it
function checkSize {
echo "Check size of file before burning to disc"
SIZE=`du -m /tmp/$Backup_Name-$Backup_Date.iso | awk '{print $1}'`
if [ "$SIZE" -lt "700" ]; then
echo "Size OK to burn"
else
echo "Size too big to burn"
exit 1
fi
echo "Finished size checking"
}
checkSize
###########################################################################
# Burn to disc
function burnImage {
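# Auto-detect the burner's bus,target,lun triplet from cdrecord's bus scan
# (fragile: assumes a single ATAPI drive shows up in the scan)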
BURNER=`cdrecord dev=ATAPI -scanbus | grep "'" | awk '{print $1}' | grep "0"`
echo "Burning back-up to disc."
cdrecord dev=ATAPI:$BURNER -v blank=fast -eject fs=64M driveropts=burnproof speed=$Speed -sao /tmp/$Backup_Name-$Backup_Date.iso
echo "Successfully burnt: $Backup_Name-$Backup_Date.iso to disc"
}
burnImage
###########################################################################
# Mail a backup notice to someone
function mailTo {
echo "Mail sent to $MAILTO"
mail -s "$Backup_Name-$Backup_Date: $SIZE MB : Backup Complete" $MAILTO < /dev/null
}
mailTo
###########################################################################
# Lets clear out some old backups that are older than 7 days
function cleanUp {
find $Backup_Dirs -type f -mtime +7 -exec rm -f '{}' \; #Delete logfiles older than 7 days (After backup)
find $Backup_Dest_Path -type f -exec rm -f '{}' \; #Delete files in backup directory
}
cleanUp
###########################################################################
exit 0
###########################################################################
But I have deleted amavis, so how do I stop this from running?
If you deleted the account w/o uninstalling the software then you made an error. And what exactly is running?
This is what is running:
Quote:
test -e /usr/bin/sa-learn && test -e /usr/sbin/amavisd-new && /usr/bin/sa-learn --rebuild >/dev/null 2>&1: 8 Time(s)
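(That looks like a leftover cron job; one way to find where it lives, with cron paths that are Debian-ish assumptions and vary by distro:)
Code:
# Search the usual cron locations for the sa-learn entry
grep -r 'sa-learn' /etc/crontab /etc/cron.* /var/spool/cron 2>/dev/null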
Also, is it common that I should get this, and should I be worried?
Quote:
backup : 3 Time(s)
bin : 1 Time(s)
bind : 2 Time(s)
... a lot of names tried; they are all blocked from logging in ...
root : 453 Time(s)
spam : 3 Time(s)
sshd : 4 Time(s)
sys : 2 Time(s)
www-data : 3 Time(s)
And is there any way that I can find out what passwords these people are trying?