Linux - Security
This forum is for all security related questions.
Questions, tips, system compromises, firewalls, etc. are all included here.
I have been working at enabling a timed ACL (allow users to surf only during certain times) and allowing only certain sites (a whitelist), but I cannot get either working...
My squid.conf file is...
Code:
http_port 127.0.0.1:3128 transparent
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
access_log /var/log/squid/access.log
debug_options ALL,1
#debug_options ALL,1,33,2
dns_nameservers 208.67.222.222 208.67.220.220
hosts_file /etc/hosts
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/basic.passwd
auth_param basic realm Q3AIT Proxy Authentication
auth_param basic children 10
auth_param basic credentialsttl 3 hours
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
# added by knichel 20090927 to require authentication
acl myAuth proxy_auth REQUIRED
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 # https
acl SSL_ports port 563 # snews
acl SSL_ports port 873 # rsync
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Safe_ports port 631 # cups
acl Safe_ports port 873 # rsync
acl Safe_ports port 901 # SWAT
acl purge method PURGE
acl CONNECT method CONNECT
acl our_nets src 192.168.6.0/24
# added by knichel 20091002 Trying to get more restrictive
acl AM_safe_times time MTWHF 08:00-11:00
acl PM_safe_times time MTWHF 11:45-14:00
acl good_sites dstdomain "/etc/squid/good_sites.txt"
acl bad_sites dstdomain "/etc/squid/bad_sites.txt"
acl GMail browser google.com/a
http_access deny manager
http_access allow purge localhost
http_access deny !Safe_ports
http_access deny purge
http_access deny CONNECT !SSL_ports
http_access allow myAuth good_sites
http_access deny all
http_reply_access allow all
icp_access allow all
cache_effective_group proxy
visible_hostname localhost
coredump_dir /var/spool/squid
My /etc/squid/good_sites.txt is working fine.
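One thing that stands out: the config above defines AM_safe_times and PM_safe_times but never references them in any http_access rule, so the time restriction would never take effect. A possible wiring, reusing the ACL names from the posted config (an untested sketch, not a drop-in fix):
Code:
# ACLs on a single http_access line are ANDed together: allow
# authenticated users to reach whitelisted sites only during
# the permitted windows, deny everything else
http_access allow myAuth good_sites AM_safe_times
http_access allow myAuth good_sites PM_safe_times
http_access deny all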
I have a successful whitelist working. However, I use google apps and want to allow mail.google.com/a/q3ait.org while blocking mail.google.com (regular gmail).
I am currently catching google.com/mail with dansguardian.
How do I block google.com/mail while allowing google.com/a/q3ait.org?
Can someone help?
From the testing I did, it seems that mail.google.com is a forwarder to www.google.com/mail. So if someone entered www.google.com/mail, they would get through if I blocked mail.google.com.
Are you sure that the https regex won't work? If I use this in a whitelist (which I am) then if it didn't work, they wouldn't be able to get to mail via https, but they can.
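Given the forwarder behaviour described above, a deny rule for plain-HTTP traffic would have to cover both spellings of the address (gmail_any is a made-up ACL name, sketched here only to illustrate the pattern):
Code:
# Match either hostname form of Gmail over plain HTTP
acl gmail_any url_regex -i ^http://(mail\.google\.com|www\.google\.com/mail)
http_access deny gmail_any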
Quote:
From the testing I did, it seems that mail.google.com is a forwarder to www.google.com/mail. So if someone entered www.google.com/mail, they would get through if I blocked mail.google.com.
FWIW, my tests seem to confirm that Gmail won't work if access to mail.google.com is denied.
Quote:
Are you sure that the https regex won't work? If I use this in a whitelist (which I am) then if it didn't work, they wouldn't be able to get to mail via https, but they can.
For HTTPS addresses, you can only do regex matching on the domain part. Everything after the port number is inside the SSL connection, so Squid can't see it. I do realize I'm contradicting some erroneous advice I previously gave you (I will add a note to that thread to make this clear). My apologies for that; I'm not sure what my state of mind was that day. That said, I do believe you can reach your present goal by simply blocking access to mail.google.com.
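To illustrate why: an HTTPS request reaches Squid only as a CONNECT to host:port, so a path-based regex has nothing to match against. The ACL names here are hypothetical:
Code:
# What Squid sees for HTTPS is just "CONNECT mail.google.com:443",
# so a path regex like this can never match an SSL request:
acl gmail_path url_regex -i mail\.google\.com/a/q3ait\.org
# Only host-level matching is possible for SSL traffic:
acl gmail_host dstdomain mail.google.com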
EDIT: I just reread the OP and it seems the gist of it is you want to allow mail.google.com/a/q3ait.org while denying Gmail. In that case, simply blocking mail.google.com wouldn't be an option. I can't think of any suggestions right now.
I'll definitely be looking at this again later today.
I've been thinking about your quandary and I'm unable to come up with a solution. I'm still convinced there's no way you could (without a MITM attack) use Squid to treat URLs on the same SSL host differently. Assuming I'm right, you should start looking for workarounds instead. Maybe you could deny the CONNECT method for mail.google.com, forcing all communication with that host to take place via HTTP. This way you could use regular expressions to your heart's content.
Since it's likely obligatory for the login phase to take place over SSL, though, this would only work if the login is done with a host other than mail.google.com. Additionally, even if this would technically work, I'm not sure that having only the login phase be encrypted would be acceptable to you (FWIW, it wouldn't be acceptable to me).
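A sketch of that workaround, with made-up ACL names. Rule order matters in squid.conf (rules are evaluated top to bottom), and whether Google's login flow tolerates plain HTTP is untested:
Code:
# Refuse SSL tunnels to mail.google.com, forcing plain HTTP...
acl gmail_host dstdomain mail.google.com
http_access deny CONNECT gmail_host
# ...so path-level regexes can then distinguish the Apps domain
acl gapps_mail url_regex -i ^http://mail\.google\.com/a/q3ait\.org
http_access allow gapps_mail
http_access deny gmail_host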