Linux - Networking
This forum is for any issue related to networks or networking: routing, network cards, OSI, etc. Anything is fair game.
07-31-2007, 09:07 AM | #1 | Member | Registered: May 2006 | Posts: 141
Squid to block all the sites except 1 or 2 sites
Hi everyone,
I am using Ubuntu Linux and have configured Squid on it. Right now Squid is allowing all sites, but what I need is to block every site except one or two, for example Google and the company website.
Any help with this issue would be greatly appreciated.
07-31-2007, 10:25 AM | #2 | Member | Registered: Sep 2004 | Location: Dubai, UAE | Distribution: RHL | Posts: 350
Get squidGuard, add the source IPs, and in the ACL give access to Google and your company website, then select "none" for everything else. squidGuard configuration is much easier if you use it with Webmin.
Good luck.
08-01-2007, 08:58 AM | #3 | Member | Registered: May 2006 | Posts: 141 | Original Poster
Hi, thanks for the reply.
But I am actually using only Squid. What should I do to get squidGuard? And is there no way to do this with plain Squid?
08-02-2007, 02:31 AM | #4 | LQ Newbie | Registered: Jul 2007 | Posts: 12
No, you cannot use Squid alone for content filtering. You have to pair it with something like squidGuard or DansGuardian for that.
There are also other content-filtering proxies, such as SafeSquid (http://freshmeat.net/projects/safesquid/), which can act as proxy + web cache + content filter in one. It also has a GUI for management, and you can forward requests from SafeSquid on to Squid.
08-02-2007, 04:22 AM | #5 | Member | Registered: May 2006 | Posts: 141 | Original Poster
How can I install squidGuard? Is it an add-on that has to be set up alongside Squid? Please help me out with this.
08-02-2007, 04:57 AM | #6 | Moderator | Registered: Jun 2001 | Location: UK | Distribution: Gentoo, RHEL, Fedora, Centos | Posts: 43,417
Whoever asked for content filtering? You appear to just be advertising your own project, not answering the question at hand.
winxandlinx, you can do exactly what you want very simply with Squid; there is no need whatsoever to use other tools in conjunction with it. You just want a dstdomain whitelist: allow everything in that whitelist, then follow it with a blanket deny.
acl whitelist dstdomain .google.com .mycompany.com
acl all src 0.0.0.0/0.0.0.0
http_access allow whitelist
http_access deny all
Last edited by acid_kewpie; 08-02-2007 at 04:59 AM.
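[Editor's note: for context, a sketch using common defaults rather than details from the thread. These lines belong in squid.conf ahead of Squid's own deny rules, because Squid acts on the first http_access line that matches; the path and port below are the usual defaults, not taken from the thread.]
# /etc/squid/squid.conf (default location on many distributions)
http_port 3128
acl whitelist dstdomain .google.com .mycompany.com
acl all src 0.0.0.0/0.0.0.0
http_access allow whitelist
http_access deny all
After editing, the running proxy can pick up the change without a restart via "squid -k reconfigure".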
11-13-2009, 11:35 AM | #7 | LQ Newbie | Registered: Aug 2009 | Posts: 1
Thank you!
Quote:
Originally Posted by acid_kewpie
acl whitelist dstdomain .google.com .mycompany.com
acl all src 0.0.0.0/0.0.0.0
http_access allow whitelist
http_access deny all
This is working; I implemented it in my network. It also works on RHEL 5.4.
12-17-2009, 04:04 AM | #8 | Member | Registered: Dec 2009 | Posts: 36
Can you also do redirection like this with Squid?
I seem to remember you could...
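[Editor's note: yes — Squid can redirect or rewrite URLs through an external helper program, configured with the redirect_program directive (url_rewrite_program in later Squid versions). Below is a hypothetical helper sketch, not from the thread: the domain names are placeholders, and it speaks the classic protocol where Squid sends one request per line on stdin and expects one reply line (the replacement URL, or an empty line for "no change") on stdout.]

```python
import sys

# Hypothetical rewrite rule: given one line of the redirector protocol
# ("URL client-ip/fqdn ident method ..."), return the replacement URL,
# or "" to leave the request untouched. The two domains are placeholders.
def rewrite(line):
    parts = line.split()
    if not parts:
        return ""
    url = parts[0]
    if "blocked.example.com" in url:
        return "http://intranet.example.com/blocked.html"
    return ""

if __name__ == "__main__":
    # Squid feeds one request per line and reads one reply line per
    # request; flushing after every answer is essential, or Squid hangs.
    for line in sys.stdin:
        sys.stdout.write(rewrite(line) + "\n")
        sys.stdout.flush()
```

The helper is then wired in with a line such as `redirect_program /usr/local/bin/rewrite.py` in squid.conf (path hypothetical).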
10-27-2010, 03:53 AM | #9 | LQ Newbie | Registered: Oct 2010 | Distribution: CentOS, Ubuntu | Posts: 6
Quote:
Originally Posted by acid_kewpie
acl whitelist dstdomain .google.com .mycompany.com
acl all src 0.0.0.0/0.0.0.0
http_access allow whitelist
http_access deny all
Yes, but anybody could still visit https://www.facebook.com and other HTTPS sites. How can I prevent this? I can't force users to use proxy settings, because I want all control/management to be server-side only.
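[Editor's note: one common server-side approach, sketched here with assumptions not taken from the thread: the Squid box is also the LAN's gateway, the LAN is 192.168.1.0/24, and Squid listens on port 3128 with "http_port 3128 transparent" set in squid.conf. The firewall then enforces the policy: outbound HTTP is transparently redirected into the proxy, and direct HTTPS is refused outright, since plain Squid cannot transparently filter it.]
# force LAN HTTP through Squid (addresses and port are placeholders)
iptables -t nat -A PREROUTING -s 192.168.1.0/24 -p tcp --dport 80 -j REDIRECT --to-port 3128
# no transparent interception for HTTPS, so refuse direct port 443
iptables -A FORWARD -s 192.168.1.0/24 -p tcp --dport 443 -j REJECT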