Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
I'm curious about this, and being fairly new to Squid it would help me to clear something up in my head. I just can't quite understand how this would block only denied_users from bad_sites, rather than denying everyone access to bad_sites?
Can you check the above squid.conf file? If any modification is required, please make it and send it back to me.
I am sorry, I won't be able to do that.
I can help you here with troubleshooting, but I can't do the work on your behalf, and I believe that will also be better for a newbie to learn from.
Insert the following rules in your squid.conf:
Code:
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
acl denied_users src 192.168.1.4
acl bad_sites dstdomain .facebook.com .hi5.com .orkut.com
http_access deny CONNECT bad_sites
http_access allow denied_users
As per this configuration, 172.16.1.50 and 172.16.1.51 won't be allowed to access the internet at all, as you are denying them from doing so.
Code:
http_access deny CONNECT denied_users badsite
As I personally tested on my CentOS 6 workstation. How about you give it a try?
I did, and it didn't work as intended on my Squid 2.7.STABLE, hence why I asked if it was correct. I only wanted to block 172.16.1.50 and .51 from accessing Facebook, not the entire internet. If I wanted that, surely I'd just do:
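For illustration, a blanket deny of that sort would presumably just reuse the client ACL on its own, with no site ACL involved at all (a sketch, not taken from the thread):

Code:
# deny these clients all HTTP access outright
acl denied_users src 172.16.1.50 172.16.1.51
http_access deny denied_users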
Really? At the end of my various other ACL stanzas, which explicitly knock out or allow sites and clients and restrict access by time, I've just got a default allow rule which works for us:
Code:
http_access allow all
But I guess there is more than one way to achieve the same result, and I've probably got a typo or something from when I pasted your example in at the top of my ACLs for testing.
Last edited by leslie_jones; 03-05-2012 at 01:32 AM.
I just need to block some IP addresses from accessing Facebook and Orkut. In future I will have to add more IP addresses and websites to the configuration.
Actually deep27ak, the more I dig into this, the more I realise that my ACLs are not doing entirely what I intended as far as Facebook is concerned. I know that Squid won't block HTTPS (for obvious reasons), but I seem to be getting unpredictable blocking of the HTTP version by client IP (or MAC). Looks like I've got a job to do this morning.
Glad I read your post or I would never have looked into this. Thanks!
In that case I'd not need to mention any sites or URLs, because I'd be blocking the client completely from HTTP.
Cut and paste typo.
Initially, in any case, you will have to specify the range of networks in which you want your Squid to be working; apart from that, if you want individual restrictions per IP, you will have to specify those addresses as well.
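For example, the usual squid.conf idiom is a src ACL covering the local network range, allowed before the final deny (the network range below is an assumption for illustration; adjust it to your own LAN):

Code:
# assume the LAN is 192.168.1.0/24; adjust to your own range
acl localnet src 192.168.1.0/24
http_access allow localnet
http_access deny all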
One little gotcha: the period before .facebook.com. I knew it meant 'match subdomains', but I overlooked that www is effectively a subdomain, so to match www.facebook.com the leading period is needed. Where a user tries to circumvent this with http://facebook.com, a second rule is needed, so it is best dealt with using lists.
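For instance, Squid lets a dstdomain ACL read its domains from a file, one entry per line (the path and filename here are illustrative, not from the thread):

Code:
# /etc/squid/bad_sites.txt holds one domain per line, e.g. .facebook.com
acl bad_sites dstdomain "/etc/squid/bad_sites.txt"
http_access deny denied_users bad_sites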
I think you got your answer in reply #25 from leslie_jones
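To recap the working approach discussed above: Squid ANDs all the ACL names listed on a single http_access line, so a rule naming both the client ACL and the site ACL denies only that combination (a sketch assuming the ACL names and addresses used earlier in the thread):

Code:
acl denied_users src 172.16.1.50 172.16.1.51
acl bad_sites dstdomain .facebook.com .orkut.com
# denied only when the source matches denied_users AND the destination matches bad_sites
http_access deny denied_users bad_sites
http_access allow all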