LinuxQuestions.org
Linux - Security: This forum is for all security-related questions. Questions, tips, system compromises, firewalls, etc. are all included here.

Old 11-19-2010, 04:00 PM   #1
VipX1
Member
 
Registered: Jun 2009
Location: Dublin, IRL
Distribution: Arch
Posts: 35
Blog Entries: 1

Rep: Reputation: 15
Question: Robots looking for phpmyadmin


My phpMyAdmin worked yesterday; today it doesn't.
My Apache logs show lots and lots of failed requests for incorrect incarnations of http://my.domain.com/phpmyadmin. None of them are anywhere near my alias for the index.php, and yet phpMyAdmin is broken.

Is there a way I can mess up robots like this? Maybe send IPs that generate multiple bad page requests on my server back to their own address? I would then just set thresholds to decide how strict to be.

I did try Fail2ban before, but I found it cryptic. I don't have it on this particular server.
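The threshold idea described above can be sketched with standard tools. This is a minimal sketch, not tied to any particular ban tool: the sample log below is fabricated for illustration (in practice you would point the awk command at your real Apache access log, e.g. /var/log/apache2/access.log), and it prints the source IPs that generated more than two 404s.

```shell
# Sketch: list source IPs with a suspicious number of 404 responses.
# The sample log is fabricated for illustration only.
cat > /tmp/sample_access.log <<'EOF'
1.2.3.4 - - [19/Nov/2010:12:00:01 -0500] "GET /phpmyadmin/ HTTP/1.1" 404 512
1.2.3.4 - - [19/Nov/2010:12:00:02 -0500] "GET /phpMyAdmin/ HTTP/1.1" 404 512
1.2.3.4 - - [19/Nov/2010:12:00:03 -0500] "GET /pma/ HTTP/1.1" 404 512
5.6.7.8 - - [19/Nov/2010:12:00:04 -0500] "GET /index.php HTTP/1.1" 200 1024
EOF
# Field 9 of the combined log format is the HTTP status code.
awk '$9 == 404 { print $1 }' /tmp/sample_access.log \
  | sort | uniq -c \
  | awk '$1 > 2 { print $2 }'
# prints: 1.2.3.4
```

A tool like Fail2ban automates exactly this loop: count offending log lines per IP, then ban past a threshold.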
 
Old 11-20-2010, 05:15 AM   #2
Noway2
Senior Member
 
Registered: Jul 2007
Distribution: Ubuntu 10.10, Slackware 64-current
Posts: 2,124

Rep: Reputation: 776
At first I thought this was a duplicate of your other post.

Dealing with scan robots is like pulling weeds in the garden: no matter what you do, they just keep coming back. I understand the frustration you feel, but retaliation is more likely to cause problems for you, as the targets will probably complain and your ISP will take action.

Fail2ban, which you referenced in your other thread, is a good way to go. If you find that a particular IP is doing this, you can block it (or its ISP's range), but this can have side effects and, like the weeds, the scans will come up somewhere else.
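For this particular symptom, Fail2ban's stock `apache-noscript` filter already matches requests for scripts that don't exist on the server. A minimal jail fragment might look like the following; the log path is a Debian-style assumption and the thresholds are illustrative, not recommendations:

```ini
# /etc/fail2ban/jail.local -- sketch only; adjust logpath for your distro.
[apache-noscript]
enabled  = true
port     = http,https
filter   = apache-noscript
logpath  = /var/log/apache2/error.log
maxretry = 3
bantime  = 600
```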

I use OSSEC (and Snort), which will temporarily block an IP that requests too many invalid web pages (about three). It watches the Apache error log and takes action accordingly.
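The OSSEC behaviour described above is driven by an active-response block in ossec.conf. A sketch of that fragment, with illustrative values (the command name must match one defined in the `<command>` section of your install):

```xml
<!-- ossec.conf fragment -- sketch only; level and timeout are
     illustrative, and firewall-drop must be a defined command. -->
<active-response>
  <command>firewall-drop</command>
  <location>local</location>
  <level>6</level>
  <timeout>600</timeout>
</active-response>
```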

You might also look at rate limiting in iptables, which prevents a client from establishing too many connections in too short a time, but again, this may interfere with your desired operation.
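One common way to do this uses the iptables `recent` match. The rules below are a sketch (run as root) with illustrative thresholds; note that aggressive limits can also lock out legitimate clients behind a shared NAT address:

```shell
# Sketch only: track new HTTP connections per source IP and drop a
# source that opens more than 10 of them within 60 seconds.
# Thresholds are illustrative -- tune them before using.
iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
         -m recent --name HTTP --set
iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
         -m recent --name HTTP --update --seconds 60 --hitcount 10 -j DROP
```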
 
2 members found this post helpful.
Old 11-20-2010, 10:49 AM   #3
barriehie
Member
 
Registered: Nov 2010
Distribution: Debian Lenny
Posts: 136
Blog Entries: 1

Rep: Reputation: 23
Have you considered using a robots.txt file?

Code:
User-agent: *
Disallow: /
 
  



Tags
phpmyadmin, robots

