Filter a single website using iptables?
Hi,
I just set up a Slackware gateway/firewall/proxy/server in a school. It has two NICs and filters all traffic using Squid and Squidguard. So far, the filter works very nicely; I have only one minor problem left. Squidguard currently filters only HTTP requests, since setting up HTTPS filtering is a real PITA. Now I have a single website left to filter: https://www.facebook.com. I thought the best thing would be to block it entirely using iptables. How would I go about that? (If you wonder why I would block Facebook for students: http://www.facebook.com is still available to all students, but only outside class hours...) Cheers from the sunny South of France. |
Don't. You don't need to set up HTTPS filtering to block by IP address, which is what you'd be doing in iptables anyway. Just use a dst ACL in Squid. You just need to find all the IP addresses Facebook uses.
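A dst ACL in squid.conf would look roughly like this (a sketch only; the ACL name is made up, and the address range is just one prefix that appears in addresses later in this thread, so you'd substitute whatever Facebook actually resolves to):

```
# hypothetical squid.conf fragment: block by destination IP
acl facebook_ips dst 69.171.224.0/19
http_access deny facebook_ips
```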
|
Quote:
I could explain why, but it would take a while. So let me restate my question: how can I block requests to https://www.facebook.com using iptables? |
Well, that just sounds daft. If that's the case, then you need to explain why. Arbitrary restrictions are awful.
Anyway, you just block all the relevant IPs, just like you would in Squid. You can't block it by name. |
Anybody able to answer a simple question without making me jump through flaming hoops?
|
I'm not sure that you can block by name from iptables, although I could easily be wrong. But I was under the impression that squidguard could do that, and also do it only inside certain time periods. You say you are in fact using squidguard, so why won't you let it block that particular site?
(Note: I might be confusing squidguard with dansguardian. I have exactly what you want running on my home server, but it's going to be another 8 hours before I get home to check!) |
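Time-based blocking is indeed something squidGuard can do by itself for plain HTTP. A sketch of a squidGuard.conf fragment, assuming a blacklist category called "social" and school hours of 08:00-17:00 (the category name and the hours here are invented placeholders):

```
# hypothetical squidGuard.conf fragment
time classhours {
    weekly mtwhf 08:00-17:00
}
dest social {
    domainlist social/domains
}
acl {
    default within classhours {
        pass !social all
    } else {
        pass all
    }
}
```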
iptables -A INPUT -d 1.2.3.4 -j REJECT
etc., one rule for each IP address that Facebook uses (on a gateway, the FORWARD chain is what the clients' traffic actually traverses). But really, use Squid for this. |
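The per-address approach can be scripted so it's less tedious. A rough sketch (the two addresses are just the examples that appear later in this thread; in practice you'd feed it whatever `dig +short www.facebook.com` returns at the time, and re-run it periodically since the addresses change):

```shell
#!/bin/sh
# Sketch: emit one REJECT rule per known Facebook address.
# ADDRS is a placeholder list; replace with freshly resolved addresses.
ADDRS="69.171.239.10 69.171.255.10"
for ip in $ADDRS; do
    echo iptables -A FORWARD -d "$ip" -j REJECT
done
```

Printing the rules first makes them easy to review; piping the output through `sh` would actually apply them.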
Quote:
Code:
$ dig www.facebook.com
Code:
;; ANSWER SECTION:
@acid_kewpie: yeah, I guess I'll follow your advice and use Squid for that. I just figured out that my downloaded blacklist database relies not only on domain names but also on IP addresses. My mistake. |
It'll use loads of IPs, but it'll also be globally load balanced to each data center.
Code:
$ dig www.facebook.com @ns1.facebook.com

; <<>> DiG 9.8.0-P4-RedHat-9.8.0-7.P4.fc15 <<>> www.facebook.com @ns1.facebook.com
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 24499
;; flags: qr rd; QUERY: 1, ANSWER: 0, AUTHORITY: 2, ADDITIONAL: 2
;; WARNING: recursion requested but not available

;; QUESTION SECTION:
;www.facebook.com.              IN      A

;; AUTHORITY SECTION:
www.facebook.com.       86400   IN      NS      glb2.facebook.com.
www.facebook.com.       86400   IN      NS      glb1.facebook.com.

;; ADDITIONAL SECTION:
glb2.facebook.com.      3600    IN      A       69.171.255.10
glb1.facebook.com.      3600    IN      A       69.171.239.10

;; Query time: 74 msec
;; SERVER: 204.74.66.132#53(204.74.66.132)
;; WHEN: Tue Sep 6 09:18:57 2011
;; MSG SIZE  rcvd: 104
You can see there that this dig says you need to go to these GLBs to get an IP address for Facebook suitable for your geographical location. You can *never* know you have all the IP addresses. As far as your blacklist goes, check how it is being used: it would need to be used against a "dst" acl, not a "dstdomain" acl. But then, going from what you've said, what are you doing with HTTPS connections? If they are already just using a CONNECT via the proxy, then you can filter on the domain. It's when you're doing transparent stuff that you're SOL. |
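If the clients do send CONNECT requests through the proxy, blocking the HTTPS site by name in squid.conf is straightforward. A sketch (the "facebook" ACL name is invented here):

```
# hypothetical squid.conf fragment for proxied (non-transparent) HTTPS
acl facebook dstdomain .facebook.com
acl CONNECT method CONNECT
http_access deny CONNECT facebook
```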
Yes, you can block FQDNs with iptables.
But don't think that I filter FQDNs this way myself. :-) Something like this: Code:
-A OUTPUT -p tcp -m string --string "facebook.com" --algo kmp -j DROP |
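For the OP's gateway, a variant of the above restricted to forwarded HTTPS traffic might look like this (a sketch, not tested here; the string match can catch the server name sent in clear text during the TLS handshake, and on a gateway it's the FORWARD chain, not OUTPUT, that sees the clients' packets):

```
# hypothetical iptables rules-file fragment for the gateway's filter table
-A FORWARD -p tcp --dport 443 -m string --string "facebook.com" --algo kmp -j DROP
```

Limiting it to port 443 leaves plain HTTP subject to Squidguard's existing time-based rules.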