iptables rules to block https://www.facebook.com
1 Attachment(s)
Hi everybody,
I am using squid-3.1.14 as a transparent proxy on Ubuntu 11.10. Everything is working fine and I have blocked Facebook, but users can still access it over HTTPS. I searched Google for a way to block HTTPS with squid, but found nothing useful. I thought this is the best site to discuss it, because many of my problems have been solved here; hats off to every member of this site. Please find the attachment, which contains my current iptables rules. I found a link with a thread marked as solved, so I tried Code:
iptables -t nat -I PREROUTING -m tcp -p tcp -d www.facebook.com --dport 443 -j DROP
but it only gave an error from Code:
iptables v1.4.10:
In my area Facebook has the IP addresses shown below. Code:
nslookup facebook.com
Code:
dig facebook.com
Then I added this rule: Code:
iptables -t nat -A PREROUTING -i eth0 --dst facebook.com -p tcp --dport 443 -j DNAT --to 192.168.0.1:3128
Now https://facebook.com is blocked and gives the error "secure connection failed" even when users try it multiple times, although it was accessible earlier at that same URL. But the site is accessible again if they add "www", i.e. https://www.facebook.com. So please help me to block https://www.facebook.com. |
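A general observation, not from the attachment: a transparent squid that only intercepts port 80 never sees HTTPS connections, and DNAT-ing port 443 into squid's HTTP port just breaks the TLS handshake. Name-based rules are also unreliable, because `-d www.facebook.com` is resolved to whatever addresses the name has at the moment the rule is loaded. A sketch that blocks by address range instead (the CIDR ranges are assumptions from public 2011-era data; re-check them for your region, e.g. with `whois 66.220.144.0`):

```shell
#!/bin/sh
# Reject forwarded HTTPS traffic to Facebook's address ranges.
# The ranges below are illustrative assumptions -- verify them yourself.
for net in 66.220.144.0/20 69.63.176.0/20 69.171.224.0/19; do
    iptables -I FORWARD -p tcp -d "$net" --dport 443 -j REJECT --reject-with tcp-reset
done
```

A REJECT with a TCP reset makes browsers fail immediately instead of hanging until the connection times out, which is friendlier for the users being filtered.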
Make a fake facebook.com dns entry in your DNS server :p
|
Hello,
I guess this might help. I just found this bash script via Google (Facebook IP range). Code:
iptables -N FACEBOOK |
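For reference, a script of that sort usually continues along these lines: a minimal sketch with a dedicated chain, assuming the same 2011-era Facebook ranges mentioned above (verify them with `whois` before using):

```shell
#!/bin/sh
# Sketch: send web traffic through a dedicated FACEBOOK chain and
# reject anything destined for Facebook's ranges (assumed values).
iptables -N FACEBOOK 2>/dev/null        # create the chain (ignore if it exists)
iptables -F FACEBOOK                    # start from a clean chain
for net in 66.220.144.0/20 69.63.176.0/20 69.171.224.0/19; do
    iptables -A FACEBOOK -d "$net" -j REJECT
done
iptables -A FACEBOOK -j RETURN          # everything else continues normally
# Route forwarded HTTP/HTTPS through the chain
iptables -I FORWARD -p tcp -m multiport --dports 80,443 -j FACEBOOK
```

Using a separate chain keeps the FORWARD chain readable and makes it easy to flush and reload just the Facebook rules when the IP ranges change.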
Why not just remap facebook.com to 127.0.0.1 in the hosts file?
That will block all access. |
Why don't you try to block Facebook in squid?
Create an ACL for facebook.com: Code:
acl badsite dstdomain .facebook.com
Code:
http_reply_access deny badsite
More details: go to http://servercomputing.blogspot.com/...xy-server.html |
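Put together, the squid side of this usually looks like the snippet below (a sketch; squid normally denies requests with `http_access` rather than `http_reply_access`, and the deny rule must appear before any `http_access allow` lines):

```shell
# /etc/squid/squid.conf -- sketch of a domain-deny ACL
# .facebook.com (leading dot) matches the domain and every subdomain (www, m, ...)
acl badsite dstdomain .facebook.com
http_access deny badsite     # must come before the allow rules
http_access allow localnet
http_access deny all
# Note: in transparent interception mode squid only sees port-80 traffic,
# so this blocks HTTP but not HTTPS -- HTTPS has to be handled at the firewall.
```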
Quote:
Code:
cat /etc/bind/db.facebook.com |
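For anyone who wants to try the fake-zone approach, a db file of that sort typically looks like the sketch below. The hostnames and serial are placeholders, and BIND's `named.conf.local` must also declare facebook.com as a master zone pointing at this file:

```text
; /etc/bind/db.facebook.com -- sketch of a fake zone that sinkholes facebook.com
$TTL    86400
@       IN      SOA     ns1.example.local. admin.example.local. (
                        2012010101      ; serial (placeholder)
                        3600            ; refresh
                        900             ; retry
                        604800          ; expire
                        86400 )         ; minimum TTL
@       IN      NS      ns1.example.local.
@       IN      A       127.0.0.1
*       IN      A       127.0.0.1       ; catches www, m, and any other subdomain
```

The wildcard record is what covers www.facebook.com, which the /etc/hosts approach misses unless every variant is listed explicitly.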
Code:
acl badsite dstdomain .facebook.com
I don't know what is wrong, but it is not working for me in transparent mode. |
Quote:
I tried it, and now https://www.facebook.com is also blocked. But it is not enough to do this only in the server's /etc/hosts file; we would have to do it in each user machine's /etc/hosts file, and only then does Facebook redirect to 127.0.0.1 on their PC. And since every user has superuser permission, they can easily remove the entry from /etc/hosts. So may I expect some iptables rules from you to block https://www.facebook.com? |
Thanks coolsg5636,
I tried the iptables rules you provided. https://www.facebook.com is still accessible, but when users log in it redirects by default to http://www.facebook.com and they get a proxy error. However, if they put an "s" after "http" in the same URL, it works again. So I think I have not completely blocked https://facebook.com, since users can reach it by adding the "s" whenever they get the error. Please see the IP addresses in my area that I already posted, and help me to block https://www.facebook.com with iptables rules. |
Quote:
just double check it again... |
Thanks for the reply linuxmen,
I checked it many times using Code:
acl badsite dstdomain .facebook.com |
Thank you
hi all
Thanks, guys. It's nice to meet you here in the Linux forum. I hope we can share our knowledge. God bless us. |
If you use DNS poisoning (which is what other people suggested by adding a fake DNS entry for facebook.com), then users can simply set their DNS manually to Google Public DNS or to OpenDNS. That way the computers you want to filter bypass the filtering altogether. The only drawback for them (maybe) is that they lose the advantages of caching from your local squid.
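That hole can be closed at the gateway by redirecting DNS traffic itself, so manually configured public resolvers are silently overridden. A sketch, assuming the gateway runs its own resolver on 192.168.0.1 and eth0 faces the LAN:

```shell
# Force all client DNS queries (UDP and TCP port 53) through the local
# resolver, defeating manually configured 8.8.8.8 / OpenDNS settings.
iptables -t nat -A PREROUTING -i eth0 -p udp --dport 53 -j DNAT --to 192.168.0.1:53
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 53 -j DNAT --to 192.168.0.1:53
```

This does not stop tunneling tools, but it means an ordinary DNS-settings change on the client no longer bypasses the fake zone.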
|
Hi, I'm new to Linux. I also want to know: is blocking https://facebook.com possible?
|
Keep in mind that tools and programs like UltraSurf, which let users route traffic through another DNS, are a workaround that bypasses a bad DNS entry entirely.
|