Apache2 Server attacked all day long and can't block it
Hello.
Lately, our server has been experiencing what looks like an attack from Google's IP addresses. I suspect it might be a SYN or DDoS attack, and it's getting out of hand: thousands and thousands of requests daily, freezing up the server with CPU usage over 100%. We have a Squid "accelerator" in front of our web server, and looking at the log I see a LOT of these: Quote:
I've tried some "solutions", but they have not worked properly:
- Decreased Apache's Timeout to 10
- Set MaxKeepAliveRequests 200
- Set KeepAliveTimeout 6
I've even tried blocking SYN floods to port 80, but it drops connections from real users: Code:
$iptables -N no-syn-flood
$iptables -A no-syn-flood -m limit --limit 1/s --limit-burst 5 -j RETURN
$iptables -A no-syn-flood -j DROP
$iptables -I INPUT -p tcp -i $EXT_IFACE --dport 80 --syn -m state --state NEW -j no-syn-flood
Temporarily, I've dropped all traffic from that subnet and things are back to normal: Code:
$iptables -A INPUT -s 66.249.75.0/255.255.255.0 -j DROP
What other measures do you recommend? Thanks in advance for your time and help. |
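One reason the global SYN limit above drops real users is that it counts all sources together. iptables' hashlimit match can apply the limit per source IP instead, so one aggressive client is throttled without starving everyone else. A minimal sketch (the chain name and thresholds are illustrative, not tuned values, and the rules require root):

```shell
# Rate-limit NEW port-80 connections per source IP rather than globally.
# "per-ip-limit" and the 5/sec + 20 burst thresholds are made-up examples.
iptables -N per-ip-limit
iptables -A per-ip-limit -m hashlimit --hashlimit-name http-limit \
    --hashlimit-mode srcip --hashlimit-upto 5/sec --hashlimit-burst 20 \
    -j RETURN
iptables -A per-ip-limit -j DROP
iptables -I INPUT -p tcp --dport 80 --syn -m state --state NEW -j per-ip-limit
```

With `--hashlimit-mode srcip`, each client IP gets its own token bucket, so a well-behaved visitor is unaffected by a noisy crawler.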
Code:
$iptables -A no-syn-flood -m limit --limit 1/s --limit-burst 5 -j RETURN Have you tried debugging it with something like the below and seeing how easy it is to trigger? Code:
$ipt -A no-syn-flood -p tcp -j LOG --log-prefix "OVER_USAGE " Code:
$ipt -A WEBSERVER -p tcp -m limit --limit 25/minute --limit-burst 120 -j ACCEPT What kind of content are you serving? Static? PHP? I might suggest moving to nginx instead of Apache. After I switched from Apache to nginx + php5-fpm I could handle a lot more traffic with fewer resources. |
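If you do try nginx, its built-in per-IP request limiting covers similar ground to the iptables rules, but at the application layer. A minimal sketch (the zone name, size, and rates are illustrative assumptions, not recommendations):

```nginx
http {
    # One shared 10MB zone tracking clients by IP, 5 requests/second each
    limit_req_zone $binary_remote_addr zone=perip:10m rate=5r/s;

    server {
        listen 80;
        location / {
            # Allow short bursts of up to 20 queued requests before rejecting
            limit_req zone=perip burst=20;
        }
    }
}
```

Requests over the limit get a 503 from nginx instead of tying up a PHP worker, which is usually what protects the CPU.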
Quote:
Code:
host 66.249.75.191 |
Thanks for replying.
I have a server hosting 8 websites under Joomla, but only one of these gets a lot of traffic. After blocking: Quote:
Quote:
I have nginx installed and was thinking of giving it a try; I guess that would be a good option. |
Google has been able, for some time now, to execute JavaScript as a client, so they are probably interested in your site and doing more than just crawling static content. Also, I forget how many blocks of IPs Google has, but it is a lot. So don't be surprised if Googlebot comes around again soon.
|
This may also be useful to manage crawl rates.
https://support.google.com/webmasters/answer/48620

After reflecting, this is what crawlers do; otherwise your content will not show up on Google. I grabbed my most recent ones to display: http://pastebin.com/xx63dq9x This barely touches the performance of my server, but it does make quite a few requests.

Also, even with thousands and thousands of requests a day, that still only amounts to maybe 166/hr (at 4k/day), which definitely should not be enough to cause your system to lock up. You should still be able to use your site even with 10-ish concurrent users making connections (albeit more slowly). Use something like ab to stress-test your server?

Just out of curiosity, what is your RAM usage like? How much do you have, and how much is being used at "higher levels of traffic"? It's possible you're running out of RAM and going into excessive swapping. Just speculation at this point, though; too little info to go on. |
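To check whether the traffic really is only in the thousands-per-day range, counting requests per client IP straight from the access log is quick. A sketch, using a made-up sample file in place of the real squid/apache log path:

```shell
# Build a tiny stand-in for the real access log (the path and the
# entries are invented for illustration only)
cat > /tmp/access.sample <<'EOF'
66.249.75.191 - - [10/Mar/2015:10:00:01 -0500] "GET /index.php HTTP/1.1" 200
66.249.75.191 - - [10/Mar/2015:10:00:02 -0500] "GET /index.php HTTP/1.1" 200
192.0.2.10 - - [10/Mar/2015:10:00:03 -0500] "GET /index.php HTTP/1.1" 200
EOF

# Requests per client IP, busiest first
awk '{print $1}' /tmp/access.sample | sort | uniq -c | sort -rn
```

Run against the real log, the top few lines tell you immediately whether one crawler dominates or the load is spread out.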
Quote:
Code:
nslookup -q=TXT _netblocks.google.com 8.8.8.8 Code:
ip4:64.18.0.0/20 ip4:64.233.160.0/19 ip4:66.102.0.0/20 ip4:66.249.80.0/20 ip4:72.14.192.0/18 ip4:74.125.0.0/16 ip4:108.177.8.0/21 ip4:173.194.0.0/16 ip4:207.126.144.0/20 ip4:209.85.128.0/17 ip4:216.58.192.0/19 ip4:216.239.32.0/19 |
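For what it's worth, that TXT record can be turned into plain CIDR blocks mechanically, e.g. to feed an iptables drop list; though, as others in the thread point out, dropping Google wholesale keeps the sites out of its index. A sketch (the record string is copied verbatim from the nslookup output, not re-resolved, so it may be stale):

```shell
# SPF record contents as quoted in the thread; Google may publish
# different netblocks by the time you read this
spf='ip4:64.18.0.0/20 ip4:64.233.160.0/19 ip4:66.102.0.0/20 ip4:66.249.80.0/20 ip4:72.14.192.0/18 ip4:74.125.0.0/16 ip4:108.177.8.0/21 ip4:173.194.0.0/16 ip4:207.126.144.0/20 ip4:209.85.128.0/17 ip4:216.58.192.0/19 ip4:216.239.32.0/19'

# Strip the "ip4:" prefix, printing one CIDR per line
for entry in $spf; do
  echo "${entry#ip4:}"
done
```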
We have 8G of RAM on that server, and no, there's no swapping going on; the most I've seen memory usage reach is 2.5G. It's just high CPU usage by apache and mysql, since there are a lot of requests for the websites' index pages.
So blocking those addresses will do more harm than good, because our websites won't show up on Google? |
Quote:
What you do with it is up to you. |
Quote:
|
If you block google spiders, how does google know what is on the site?
This is your server, you're asking how to block them, and suggestions have been made... but my thinking is that there is a config issue causing high CPU usage past a certain threshold of traffic, and that is what should be getting your attention and investigation. Since you have plenty of RAM, my next guess would be related to the processes handling the requests (mpm-worker and such). Or maybe just search it -> https://duckduckgo.com/?q=apache+hig...age&t=lm&ia=qa |
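If the worker MPM is the culprit, the knobs to look at are the thread/process limits. A sketch of the relevant block, assuming the Apache 2.2-era worker MPM that Debian Wheezy ships (the numbers are illustrative defaults to tune from, not recommendations for this server):

```apache
# In apache2.conf (or an included conf file); values are examples only
<IfModule mpm_worker_module>
    StartServers          2
    MinSpareThreads      25
    MaxSpareThreads      75
    ThreadLimit          64
    ThreadsPerChild      25
    MaxClients          150
    MaxRequestsPerChild 1000
</IfModule>
```

Too high a MaxClients lets Apache fork itself into CPU/RAM exhaustion under a crawl; too low and legitimate requests queue up.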
Quote:
In post #1 the OP mentions CPU usage; in #4 we are shown loadavg. So which is it that the OP is worried about? These are not the same, especially in Linux, and more especially where Apache is involved. |
8GB of RAM is not a lot for a server running apache/mysql and serving images. In your case more RAM would be money well spent.
What OS are you running, and is it up to date? Have you grabbed performance data from the mysql db? See what's hot. Is it possible you have something misconfigured? How well do you know apache/mysql? Are you using SSD or HDD? What filesystem? What mount options? What forum software are you running? Is it up to date? Can you check your apache logs to see what the googlebot was doing? It's good to block the IPs for now so you can keep the system going, but like others have said, you need to find the cause. |
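One concrete way to grab that MySQL performance data is the slow query log, which shows exactly which queries burn the CPU when the index pages are hammered. A minimal sketch for my.cnf (the log path and the 1-second threshold are assumptions; supported in the MySQL 5.5 that Wheezy ships):

```ini
# [mysqld] section of my.cnf — log queries taking longer than 1 second
slow_query_log      = 1
slow_query_log_file = /var/log/mysql/mysql-slow.log
long_query_time     = 1
```

After some traffic has passed, `mysqldumpslow` on that file summarizes the worst offenders; a missing index on a hot Joomla table would show up there.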
Quote:
Code:
Crawl-delay
End Transmission. |
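For completeness, Crawl-delay goes in robots.txt at the web root. One caveat worth knowing: Googlebot ignores Crawl-delay, so for Google specifically the crawl-rate setting in Webmaster Tools (the link earlier in the thread) is the way; other bots such as Bing's do honor it. A sketch (the 10-second value is an arbitrary example):

```
# robots.txt at the site root
User-agent: *
Crawl-delay: 10
```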
Hello.
I appreciate the time all of you have taken to assist me with my server's problem. Answering questions: The server is running Debian 7.8 (Wheezy), and everything is up to date as of yesterday. This is a dedicated server we contracted two years ago for hosting our websites only. As mentioned before, we have several websites running Joomla, but one of these has a lot of traffic, with over 20000 articles. We do not have any panel or forum software installed. I believe there might be some misconfiguration of apache and mysql, since I've seen these processes at 99% CPU usage. The server also has squid installed as an accelerator. Here's apache2.conf doing virtual hosting: Quote:
Quote:
Quote:
|