LinuxQuestions.org > Linux - Security > Squid Problem
(https://www.linuxquestions.org/questions/linux-security-4/squid-problem-367240/)

vivekthemind 09-26-2005 01:51 PM

Squid Problem
 
Hi guys,
I am using Red Hat 9.0 as a proxy. It was working fine, but suddenly every URL takes a very long time to resolve. I really have no idea how it suddenly became like this. Earlier it was working fine for 20 users; now it cannot even handle requests from a single user. I am using squidGuard with Squid, but even if I disable squidGuard the same thing happens.
Does anybody have any idea about this? Please help me out.
I have done a lot of R&D, but nothing has worked so far...

unSpawn 09-26-2005 06:55 PM

If you've done some research into this, then please post a list of what you looked at and the relevant parts of the logs (or the URI of those). It might help anyone looking into this.

chrisfirestar 09-26-2005 07:49 PM

Try clearing and rebuilding the Squid cache; perhaps your cache is full. The other possibility is that the DNS server you're pointing to is having problems.

If you do a lookup FROM your gateway, does it take a while to resolve the DNS name?
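For example, something along these lines; this assumes the stock Red Hat 9 init scripts, the default cache_dir of /var/spool/squid (your config doesn't set one), and that bind-utils is installed for host/dig:

Code:

# wipe and rebuild the cache (squid must be stopped first)
service squid stop
rm -rf /var/spool/squid/*
squid -z              # recreates the cache directory structure
service squid start

# time a lookup from the gateway itself
time host www.google.com
# or query your upstream nameserver directly (use the IP from /etc/resolv.conf)
time dig @192.0.2.1 www.google.com

If the lookup against the upstream server is slow too, the problem is DNS rather than Squid.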

vivekthemind 09-27-2005 06:08 AM

hello friends ,
ya like i have installed twice squid and reconfigured it. few days it works fine then
it becomes slow which generally sound problem with cache even i have removed the cache
using this command "echo " " > /path/to/squid/caches/swap.state " nothing happens
much. (is there anything to remove cache)
i have increase cache_mem size till 150 mb .
If i remove proxy then it works prefectly ... with great speed....

win32sux 09-27-2005 12:01 PM

Quote:

Originally posted by vivekthemind
Hello friends,
Yes, I have installed Squid twice and reconfigured it. For a few days it works fine, then it becomes slow, which generally sounds like a cache problem. I have even tried removing the cache using this command: "echo " " > /path/to/squid/caches/swap.state", but nothing changes much. (Is there a proper way to remove the cache?)
I have increased the cache_mem size up to 150 MB.
If I remove the proxy then it works perfectly... with great speed...

could you post your squid.conf?? how much RAM does your box have??
Code:

cat /etc/squid/squid.conf | grep -v ^# | grep -v ^$

chrisfirestar 09-27-2005 07:18 PM

If you have Webmin installed, it has a Squid module with a function to clean and rebuild the cache. I'm not actually sure what the underlying commands are though, sorry.

vivekthemind 09-28-2005 12:21 AM

Hi,
My system configuration is:
Celeron 1.2 GHz
512 MB RAM
20 GB hard disk
Here is my squid.conf, have a look...

http_port 8080
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
cache_mem 100 MB
redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
acl snip src 192.99.10.0/24
http_access allow snip
http_access allow localhost
http_access deny all
http_reply_access allow all
icp_access allow all
coredump_dir /var/spool/squid

win32sux 09-28-2005 04:25 PM

Quote:

Originally posted by vivekthemind
Hi,
My system configuration is:
Celeron 1.2 GHz
512 MB RAM
20 GB hard disk
Here is my squid.conf, have a look...
[squid.conf snipped; see the previous post]

I think the core of your issue might be that you aren't specifying which replacement policy to use for either your memory or disk cache... also, you seem to be using the (crappy) default size limits, etc... try this instead; the part I edited is the cache_dir / replacement-policy block near the top:
Code:

http_port 8080
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
# Make sure the following line points to the proper directory:
cache_dir ufs /var/lib/squid/cache 1024 64 512
cache_replacement_policy heap LFUDA
maximum_object_size 64 MB
cache_mem 64 MB
maximum_object_size_in_memory 16 KB
memory_replacement_policy heap LFUDA

redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours
refresh_pattern ^ftp:          1440    20%    10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .              0      20%    4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443 563    # https, snews
acl Safe_ports port 70          # gopher
acl Safe_ports port 210        # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280        # http-mgmt
acl Safe_ports port 488        # gss-http
acl Safe_ports port 591        # filemaker
acl Safe_ports port 777        # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
acl  snip src 192.99.10.0/24
http_access allow snip
http_access allow localhost
http_access deny all
http_reply_access allow all
icp_access allow all
coredump_dir /var/spool/squid

i brought your cache memory size down to 64MB because it lessens the chance of anything spilling into swap space, while still being a very good amount of RAM for the memory cache... BTW, keep in mind squid doesn't only use memory for the cached objects...
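(If I remember the Squid FAQ rule of thumb correctly, 32-bit Squid needs roughly 10 MB of RAM per 1 GB of cache_dir for the in-memory index, so with the 1024 MB disk cache above you'd be looking at about 64 MB of cache_mem plus ~10 MB of index plus process overhead, which fits comfortably in your 512 MB.)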

you might also wanna disable logging if you don't need it (to get better performance at high loads):

Code:

#cache_log none
cache_store_log none
cache_access_log none


just my two cents...


vivekthemind 09-30-2005 01:58 AM

YEP,
you are right my friend. I have recently started using Linux and I don't know much about it yet, but I have started learning. I have edited the file once again with your suggestion.
One thing I didn't get: could you explain what the last three arguments are in the line below?
cache_dir ufs /var/lib/squid/cache 1024 64 512
One more thing: I have put a block list in my squidGuard which contains 15 lakh (1.5 million) URLs. If I use all of them my box becomes damn slow. Right now I am using it with only about 50k URLs which I am blocking.
Do I have to upgrade my system, or is there something else I can do?

And one more thing: if I copy anything from my Linux machine to a Windows machine using Samba it takes a very long time, which was not happening earlier... any suggestions?

chrisfirestar 09-30-2005 06:16 AM

I find that slow SMB transfers can be due to the firewall; just as a test, disable your firewall and try again. If it speeds up, you know your problem ;)
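For example, on Red Hat 9 with the stock iptables firewall, the test would be something like:

Code:

# temporarily stop the firewall (flushes the rules)
service iptables stop
# ...re-run the Samba copy and see if it is faster...
# then bring the firewall back up
service iptables start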

I don't know too much about SMB, but from my experience it seemed to use multiple ports for its transfers, so even after I opened all the required ports it would still not run fast. This was a long time ago so I am not sure; I haven't used SMB for a long time now.

win32sux 09-30-2005 01:06 PM

Quote:

Originally posted by vivekthemind
YEP,
you are right my friend. I have recently started using Linux and I don't know much about it yet, but I have started learning.

great!!! you sound like the type that learns fast, though... :)

Quote:

I have edited the file once again with your suggestion.
One thing I didn't get: could you explain what the last three arguments are in the line below?
cache_dir ufs /var/lib/squid/cache 1024 64 512

Remember to shut down Squid and run "squid -z" to re-create the cache after updating your config with my suggestion...

The last three arguments are the size of the disk cache (in MB), the number of first-level directories, and the number of second-level directories... from squid.conf.default:
Quote:

# cache_dir ufs Directory-Name Mbytes L1 L2 [options]
#
# 'Mbytes' is the amount of disk space (MB) to use under this
# directory. The default is 100 MB. Change this to suit your
# configuration. Do NOT put the size of your disk drive here.
# Instead, if you want Squid to use the entire disk drive,
# subtract 20% and use that value.
#
# 'Level-1' is the number of first-level subdirectories which
# will be created under the 'Directory'. The default is 16.
#
# 'Level-2' is the number of second-level subdirectories which
# will be created under each first-level directory. The default
# is 256.

#Default:
# cache_dir ufs /var/lib/squid/cache 100 16 256

Take a look at your squid.conf.default for more info...
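So after switching to the new cache_dir, the sequence would be roughly this (assuming the Red Hat init script, the /var/lib/squid/cache path from above, and that the RPM's cache_effective_user is "squid"):

Code:

service squid stop
mkdir -p /var/lib/squid/cache
chown squid:squid /var/lib/squid/cache
squid -z            # builds the 64 x 512 directory tree under the new cache_dir
service squid start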

Quote:

One more thing: I have put a block list in my squidGuard which contains 15 lakh (1.5 million) URLs. If I use all of them my box becomes damn slow. Right now I am using it with only about 50k URLs which I am blocking.
Do I have to upgrade my system, or is there something else I can do?

I don't fully understand what you mean, sorry... :(

Quote:

And one more thing: if I copy anything from my Linux machine to a Windows machine using Samba it takes a very long time, which was not happening earlier... any suggestions?

I would suggest searching LQ, because I'm pretty sure I've seen a few threads about this... if you can't find anything then start a new thread for your Samba question, since it's not related to this Squid issue...

chrisfirestar 09-30-2005 07:02 PM

Quote:

One more thing: I have put a block list in my squidGuard which contains 15 lakh (1.5 million) URLs. If I use all of them my box becomes damn slow. Right now I am using it with only about 50k URLs which I am blocking.
Do I have to upgrade my system, or is there something else I can do?

I think he's saying he has a large file containing a list of banned sites. I did this once for blocking ads/banners etc. and found it did slow everything down; I guess it may be because every request has to be checked against the whole list to make sure it is allowed.
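If I remember right, squidGuard can pre-compile its plain-text domain/URL lists into Berkeley DB files, which is supposed to make huge lists much faster to look up. Roughly like this (check the squidGuard docs, I'm going from memory, and the db path is just an example to adjust to your dbhome):

Code:

# compile all configured lists into .db files
squidGuard -c /etc/squid/squidGuard.conf -C all
# make sure squid's effective user can read the generated files
chown -R squid:squid /var/lib/squidGuard/db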

m4dj4ck 10-03-2005 12:05 AM

I don't think it is a squidGuard problem. It would be best if you could paste the relevant parts of the cache.log file. Maybe you don't have enough file descriptors available to your Squid; the default limit is 1024. If you build from source, you can specify the number of file descriptors before compiling it. In my case here, I run with 5120 file descriptors for about 200 users. You might as well also check the access.log file; if it grows beyond 1 GB, it will slow things down. Just my 2 cents. ;)
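To see how many descriptors your running Squid has (and how many it is using), something like this should work, since your config already allows the manager ACL from localhost:

Code:

squidclient -p 8080 mgr:info | grep -i 'file desc'   # -p matches your http_port
ls -lh /var/log/squid/access.log

If you rebuild from source to raise the limit, one common approach is to raise the shell's file descriptor limit first so configure picks it up (I don't recall the exact configure flag for 2.5, so this is just the generic route):

Code:

ulimit -HSn 8192
./configure    # plus whatever options you used before
make && make install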

vivekthemind 10-03-2005 02:23 AM

Dear friends,
First of all I would like to say thanks to all of you.
After all this, my Squid is running fine; the only thing remaining is that I have to keep an eye on it for a few more days.

To clear up my point about squidGuard: earlier I was using a block list which had more than 15 lakh (1.5 million) domain addresses and URLs. When I ran Squid, my hard disk was maxed out, squidGuard was using my whole CPU, and eventually my box used to hang. Then I reduced the block list, brought it down to around 50k, and now it is working fine.
(Somewhere I read: don't use a heavy block list, it will make your box slow...)

Anyway, now I want to limit my bandwidth, and I heard that in Squid we have delay pools for this.
Any good docs about delay pools, or how to implement them?
Thanks in advance...

m4dj4ck 10-03-2005 03:23 AM

Check out the Squid user's guide:

http://squid-docs.sourceforge.net/la...tml/book1.html

It should be easy to set up if you follow the guide closely.
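For instance, a minimal class-2 delay pool that caps each client on your LAN at roughly 16 KB/s would look something like this in squid.conf (the acl name and byte values are just placeholders to adjust, and this assumes your Squid was built with --enable-delay-pools):

Code:

acl lan src 192.99.10.0/24
delay_pools 1
delay_class 1 2
delay_access 1 allow lan
delay_access 1 deny all
# aggregate unlimited (-1/-1); each client gets 16000 bytes/sec
# with a 64000-byte burst bucket
delay_parameters 1 -1/-1 16000/64000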

