Apache2 Server attacked all day long and can't block it
Hello.
Lately, our server has been experiencing some sort of attack from Google's IP addresses. I suspect this might be a SYN flood or DDoS attack. I can't control it, and it's getting out of hand: thousands and thousands of requests daily, freezing up the server with CPU usage over 100%.
We have a Squid "accelerator" on our web server, and looking at the log I see a LOT of these:
Quote:
1443712417.300 157 66.249.75.191 TCP_MISS/200 8518 GET <OURDOMAIN> - FIRST_UP_PARENT/myAccel text/html
The attack "originates" from the 66.249.75.0 subnet, which belongs to Google. Obviously the IP addresses are being spoofed.
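Whether those addresses are really Google's crawlers (rather than spoofed) can be checked with a reverse-plus-forward DNS lookup; genuine Googlebot IPs reverse-resolve to names under googlebot.com. A sketch (the helper function and the sample names are illustrative, not from the thread):

```shell
# Live checks (require DNS; run on any box):
#   host 66.249.75.191         # reverse lookup -> a *.googlebot.com name
#   host <name-from-above>     # forward lookup should return the same IP
# Illustrative helper: does a PTR name fall under Google's crawler domains?
is_googlebot_ptr() {
    case "$1" in
        *.googlebot.com|*.google.com) echo yes ;;
        *) echo no ;;
    esac
}
is_googlebot_ptr "crawl-66-249-75-191.googlebot.com"   # prints: yes
is_googlebot_ptr "fake-bot.example.net"                # prints: no
```

If the reverse name is not under googlebot.com, or the forward lookup does not return the original IP, then the traffic is indeed spoofed and blocking it is safe.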
I've tried some "solutions", but they have not worked properly:
- Decreased Apache's Timeout to 10
- Set MaxKeepAliveRequests to 200
- Set KeepAliveTimeout to 6
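For reference, those directives as they would sit in Apache's config on Debian (a sketch; note that KeepAlive itself must be On for the last two to have any effect):

```apache
# /etc/apache2/apache2.conf (Debian) -- values from the post above
Timeout 10
KeepAlive On
MaxKeepAliveRequests 200
KeepAliveTimeout 6
```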
I've even tried blocking SYN floods to port 80, but it drops connections from real users:
$iptables -N no-syn-flood
$iptables -A no-syn-flood -m limit --limit 1/s --limit-burst 5 -j RETURN
$iptables -A no-syn-flood -j DROP
$iptables -I INPUT -p tcp -i $EXT_IFACE --dport 80 --syn -m state --state NEW -j no-syn-flood
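One reason that chain hits real users is that the limit above is global: every new SYN to port 80 shares one bucket. A per-source variant with the hashlimit match throttles each client IP separately, so one aggressive source is limited without dropping everyone else. A sketch (thresholds are illustrative; requires the xt_hashlimit module and root):

```shell
# Per-source-IP rate limit on new HTTP connections (illustrative numbers).
iptables -I INPUT -p tcp --dport 80 --syn \
    -m hashlimit --hashlimit-name http-syn \
    --hashlimit-mode srcip \
    --hashlimit-above 10/sec --hashlimit-burst 20 \
    -j DROP
```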
Temporarily, I've dropped all traffic from that subnet and things are back to normal:
$iptables -A INPUT -s 66.249.75.0/255.255.255.0 -j DROP
What other measures do you recommend?
Thanks in advance for your time and help.
Last edited by landysaccount; 10-01-2015 at 11:08 AM.
That's really low. Loading a single web page will often involve 7-9 requests.
Have you tried debugging it with something like the below and seeing how easy it is to trigger?
Code:
$ipt -A no-syn-flood -p tcp -j LOG --log-prefix "OVER_USAGE "
That adds a token slightly less often than once every 2 seconds and hasn't affected existing users, but when I try to spam it, it slows things down a whole lot after the initial burst of 120.
What kind of content are you serving? Static? PHP? I might suggest moving to nginx instead of Apache. After I switched from Apache to nginx + php5-fpm, I could handle a lot more traffic with fewer resources.
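A minimal sketch of the nginx + php5-fpm setup being suggested, assuming Debian's default php5-fpm Unix socket (the server name and paths are placeholders):

```nginx
server {
    listen 80;
    server_name your.server.example;    # placeholder
    root /var/www/site;                 # placeholder docroot

    location / {
        try_files $uri $uri/ /index.php?$args;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php5-fpm.sock;  # Debian php5-fpm default
    }
}
```

The big win over prefork Apache is that nginx serves static files from a handful of event-driven workers instead of spawning a heavyweight process per connection, and PHP runs in a separate, independently sized php5-fpm pool.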
Google has, for some time now, been able to execute JavaScript as a client, so they are probably interested in your site and doing more than just crawling static content. Also, I forget how many blocks of IPs Google has, but it is a lot. So don't be surprised if Googlebot comes around again soon.
This barely touches the performance of my server, but it does make quite a few requests.
Also, even with thousands and thousands of requests a day, that still only amounts to maybe 166/hr (at 4k/day), which definitely should not be enough to lock up your system. You should still be able to use your site even with 10 or so concurrent users making connections (albeit more slowly). Have you used something like ab to stress test your server?
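For scale, the arithmetic behind that figure, plus a hypothetical ab invocation (the URL and counts are placeholders):

```shell
# Sanity check on the numbers above: 4000 requests/day averaged per hour.
echo $((4000 / 24))        # integer division: prints 166
# Stress-test sketch with ApacheBench (from apache2-utils); adjust the URL
# and counts for your own server:
#   ab -n 1000 -c 10 http://your.server.example/
```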
Just out of curiosity, what is your ram usage like? How much do you have and how much is being used with "higher levels of traffic"?
It's possible you're running out of RAM and swapping excessively.
Just speculation at this point, though; there's too little info to go on.
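Swapping is easy to rule in or out. A sketch of the usual checks, with the swap arithmetic reproduced on sample /proc/meminfo-style values (the numbers are made up for illustration):

```shell
# Live checks (run on the server itself):
#   free -m       # look at the "Swap: used" column
#   vmstat 1 5    # nonzero si/so columns mean active swap traffic
# The swap-used figure is just SwapTotal - SwapFree; shown here on sample
# values (kB) so the arithmetic is visible:
printf 'SwapTotal: 2097148 kB\nSwapFree:  2097148 kB\n' |
    awk '/SwapTotal/ {t=$2} /SwapFree/ {f=$2} END {print "swap used:", t-f, "kB"}'
# prints: swap used: 0 kB
```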
We have 8G of RAM on that server and no, there's no swapping going on; the most I've seen memory usage reach is 2.5G. It's just high CPU usage by Apache and MySQL, since there are a lot of requests for the website's index page.
So, blocking those addresses will do more harm, because our websites won't show up on Google?
Last edited by landysaccount; 10-01-2015 at 05:20 PM.
If you block google spiders, how does google know what is on the site?
This is your server and you're asking how to block them, and suggestions have been made... but my thinking is that a config issue is causing the high CPU usage past a certain threshold of traffic, and that is what warrants your attention and investigation. Since you have plenty of RAM, my next guess would be the processes handling the requests (mpm-worker and the like).
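If it is the request-handling processes, the knobs live in the MPM section of the Apache config. A sketch for the prefork MPM on Apache 2.2, the version on Wheezy (the numbers are illustrative starting points, not recommendations; tune MaxClients so that MaxClients times per-process memory stays well under available RAM):

```apache
# Apache 2.2 prefork tuning sketch -- illustrative values only
<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients          150
    MaxRequestsPerChild 500
</IfModule>
```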
Or maybe just search it -> https://duckduckgo.com/?q=apache+hig...age&t=lm&ia=qa
... but my thinking is that a config issue is causing the high CPU usage past a certain threshold of traffic, and that is what warrants your attention and investigation.
Absolutely.
In post #1, the OP mentions CPU usage; in #4 we are shown loadavg. So which is it that the OP is worried about? These are not the same, especially in Linux, and more especially where Apache is involved.
I appreciate the time all of you have taken to assist me with my server's problem.
Answering questions:
The server is running Debian 7.8 (Wheezy) and everything is up to date as of yesterday. This is a dedicated server we contracted two years ago for hosting our websites only. As mentioned before, we have several websites running Joomla, but one of these has a lot of traffic, with over 20,000 articles.
We do not have any panel or forum software installed.
I believe there might be some misconfiguration of Apache and MySQL, since I've seen those processes at 99% CPU usage.
The server also has Squid installed as an accelerator.