LinuxQuestions.org > Forums > Linux Forums > Linux - Security
Old 09-26-2005, 02:51 PM   #1
vivekthemind
LQ Newbie
 
Registered: Sep 2005
Location: Bangalore, India
Posts: 12

Rep: Reputation: 0
Squid Problem


Hi guys,
I am using Red Hat 9.0 as a proxy. It was working fine, but suddenly every URL takes a very long time to resolve. I really have no idea how it suddenly became like this. Earlier it worked fine for 20 users; now it cannot handle requests from even a single user. I am using squidGuard with Squid, but even if I disable squidGuard the same thing happens.
Does anybody have any idea about this? Please help me out.
I have done a lot of R&D; nothing has worked so far...
 
Old 09-26-2005, 07:55 PM   #2
unSpawn
Moderator
 
Registered: May 2001
Posts: 27,779
Blog Entries: 54

Rep: Reputation: 2978
If you've done some research into this, then please post a list of what you looked at and the relevant parts of your logs (or a URI to them). It might help anyone looking into this.
 
Old 09-26-2005, 08:49 PM   #3
chrisfirestar
Member
 
Registered: Sep 2003
Location: Adelaide, Australia
Distribution: Fedora/RH
Posts: 231

Rep: Reputation: 30
try clearing and rebuilding the squid cache; perhaps your cache is full. The other possibility is that the DNS server you're pointing to is having problems.

if you do a lookup FROM your gateway, does it take a while to resolve the DNS name?
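A quick way to run that check from the gateway itself (a sketch; the hostname is just an example):

```shell
# Time a DNS lookup from the proxy box; if this is slow,
# the resolver is the bottleneck rather than Squid itself.
time nslookup www.example.com

# Check which nameservers the box is actually using:
cat /etc/resolv.conf
```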
 
Old 09-27-2005, 07:08 AM   #4
vivekthemind
LQ Newbie
 
Registered: Sep 2005
Location: Bangalore, India
Posts: 12

Original Poster
Rep: Reputation: 0
hello friends,
Yes, I have installed Squid twice and reconfigured it. For a few days it works fine, then it becomes slow, which generally sounds like a cache problem. I have even removed the cache using the command "echo " " > /path/to/squid/caches/swap.state", but nothing changes much. (Is there a proper way to remove the cache?)
I have increased the cache_mem size up to 150 MB.
If I remove the proxy, then it works perfectly... with great speed...
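For what it's worth, overwriting swap.state by hand is not a reliable way to clear the cache. The usual procedure is roughly the following (a sketch; the cache path must match the cache_dir line in your squid.conf, and the init-script name assumes a Red Hat-style system):

```shell
# Stop Squid, wipe the on-disk cache, and let Squid rebuild
# its directory structure before starting again.
service squid stop
rm -rf /var/spool/squid/*   # path must match cache_dir in squid.conf
squid -z                    # re-create the cache directory tree
service squid start
```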
 
Old 09-27-2005, 01:01 PM   #5
win32sux
Guru
 
Registered: Jul 2003
Location: Los Angeles
Distribution: Ubuntu
Posts: 9,870

Rep: Reputation: 371
Quote:
Originally posted by vivekthemind
[post #4 quoted in full -- snipped]
could you post your squid.conf? how much RAM does your box have?
Code:
grep -v -e '^#' -e '^$' /etc/squid/squid.conf
 
Old 09-27-2005, 08:18 PM   #6
chrisfirestar
Member
 
Registered: Sep 2003
Location: Adelaide, Australia
Distribution: Fedora/RH
Posts: 231

Rep: Reputation: 30
if you have webmin installed, it has a squid module with functions to clean and rebuild the cache. I'm not actually sure what the underlying commands are, though, sorry.
 
Old 09-28-2005, 01:21 AM   #7
vivekthemind
LQ Newbie
 
Registered: Sep 2005
Location: Bangalore, India
Posts: 12

Original Poster
Rep: Reputation: 0
hi
My system configuration is:
Celeron 1.2 GHz
512 MB RAM
20 GB hard disk
I am posting my squid.conf here; have a look:

http_port 8080
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
cache_mem 100 MB
redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
acl snip src 192.99.10.0/24
http_access allow snip
http_access allow localhost
http_access deny all
http_reply_access allow all
icp_access allow all
coredump_dir /var/spool/squid
 
Old 09-28-2005, 05:25 PM   #8
win32sux
Guru
 
Registered: Jul 2003
Location: Los Angeles
Distribution: Ubuntu
Posts: 9,870

Rep: Reputation: 371
Quote:
Originally posted by vivekthemind
[system specs and squid.conf from post #7 quoted in full -- snipped]
i think the core of your issue might be that you aren't specifying which replacement policy to use for either your memory or disk cache... also, you seem to be using the (crappy) default size limits, etc... try this instead; the part i edited is the cache_dir / cache_mem / replacement-policy block near the top:
Code:
http_port 8080
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
# Make sure the following line points to the proper directory:
cache_dir ufs /var/lib/squid/cache 1024 64 512
cache_replacement_policy heap LFUDA
maximum_object_size 64 MB
cache_mem 64 MB
maximum_object_size_in_memory 16 KB
memory_replacement_policy heap LFUDA
redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443 563     # https, snews
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
acl  snip src 192.99.10.0/24
http_access allow snip
http_access allow localhost
http_access deny all
http_reply_access allow all
icp_access allow all
coredump_dir /var/spool/squid
i brought your cache memory size down to 64MB because it lessens the chance of anything spilling into swap space, while still being a very good amount of RAM for the memory cache... BTW, keep in mind squid doesn't only use memory for the cached objects...

you might also wanna disable logging if you don't need it (to get better performance at high loads):

Code:
#cache_log none
cache_store_log none
cache_access_log none

just my 2 cents...


Last edited by win32sux; 09-28-2005 at 11:01 PM.
 
Old 09-30-2005, 02:58 AM   #9
vivekthemind
LQ Newbie
 
Registered: Sep 2005
Location: Bangalore, India
Posts: 12

Original Poster
Rep: Reputation: 0
YEP,
you are right, my friend. I have recently started using Linux and don't know much about it yet, but I have started learning. I have edited the file once again with your suggestion.
One thing I didn't get: could you explain the last three arguments in the line below?
cache_dir ufs /var/lib/squid/cache 1024 64 512
One more thing: I have put a block list in my squidGuard which contains 1.5 million (15 lakh) URLs. If I use all of them, my box becomes very slow; right now I am blocking only about 50k URLs.
Do I have to upgrade my system, or is it something else?

And one more thing: if I copy anything from my Linux machine to a Windows machine using Samba, it takes a very long time, which was not happening earlier. Any suggestions?
 
Old 09-30-2005, 07:16 AM   #10
chrisfirestar
Member
 
Registered: Sep 2003
Location: Adelaide, Australia
Distribution: Fedora/RH
Posts: 231

Rep: Reputation: 30
I find slow SMB transfers can be due to the firewall. Just as a test, disable your firewall and try again; if it speeds up, you know your problem.

I don't know too much about SMB, but from my experience it seemed to use multiple ports for its transfers, so even after I opened all the required ports it would still not be fast. This was a long time ago, so I'm not sure; I haven't used SMB for a long time now.
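On Red Hat 9 that firewall test could look like this (a sketch; remember to turn the firewall back on afterwards):

```shell
# Temporarily stop the firewall to rule it out
# (Red Hat-style init script assumed; 'iptables -F' also works):
service iptables stop

# ...retry the SMB copy here, then restore the rules:
service iptables start
```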
 
Old 09-30-2005, 02:06 PM   #11
win32sux
Guru
 
Registered: Jul 2003
Location: Los Angeles
Distribution: Ubuntu
Posts: 9,870

Rep: Reputation: 371
Quote:
Originally posted by vivekthemind
YEP,
you are right, my friend. I have recently started using Linux and don't know much
about it yet, but I have started learning.
great!!! you sound like the type that learns fast, though...

Quote:
i have edited the file once again with your suggestion.
one thing i didn't get: could you explain the last three arguments in the line below?
cache_dir ufs /var/lib/squid/cache 1024 64 512
remember to shut down squid and run "squid -z" to re-create the cache after updating your config with my suggestion...

the last three arguments are the size of the disk cache (in MB), the number of first-level directories, and the number of second-level directories... from squid.conf.default:
Quote:
# cache_dir ufs Directory-Name Mbytes L1 L2 [options]
#
# 'Mbytes' is the amount of disk space (MB) to use under this
# directory. The default is 100 MB. Change this to suit your
# configuration. Do NOT put the size of your disk drive here.
# Instead, if you want Squid to use the entire disk drive,
# subtract 20% and use that value.
#
# 'Level-1' is the number of first-level subdirectories which
# will be created under the 'Directory'. The default is 16.
#
# 'Level-2' is the number of second-level subdirectories which
# will be created under each first-level directory. The default
# is 256.

#Default:
# cache_dir ufs /var/lib/squid/cache 100 16 256
take a look at your squid.conf.default for more info...
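As a worked example of those three arguments (the sizing commentary here is my own illustration, not from the thread), the suggested line creates a 1 GB disk cache spread across 64 x 512 = 32768 subdirectories:

```
# cache_dir <type> <path> <Mbytes> <L1> <L2>
# 1024 MB of disk cache, 64 first-level and 512 second-level
# subdirectories (64 * 512 = 32768 directories in total):
cache_dir ufs /var/lib/squid/cache 1024 64 512
```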

Quote:
one more thing: i have put a block list in my squidGuard which contains 1.5 million (15 lakh) urls.
if i use all of them, my box becomes very slow; right now i am blocking only about 50k urls.
do i have to upgrade my system, or is it something else?
i don't fully understand what you mean, sorry...

Quote:
and one more thing: if i copy anything from my linux machine to a windows machine using samba,
it takes a very long time, which was not happening earlier... any suggestions?
i would suggest searching LQ cuz i'm pretty sure i've seen a few threads about this... if you can't find anything then start a new thread for your samba question, cuz it's not related to this squid issue...

Last edited by win32sux; 09-30-2005 at 02:17 PM.
 
Old 09-30-2005, 08:02 PM   #12
chrisfirestar
Member
 
Registered: Sep 2003
Location: Adelaide, Australia
Distribution: Fedora/RH
Posts: 231

Rep: Reputation: 30
Quote:
one more thing: i have put a block list in my squidGuard which contains 1.5 million (15 lakh) urls.
if i use all of them, my box becomes very slow; right now i am blocking only about 50k urls.
do i have to upgrade my system, or is it something else?
I think he's saying he has a large file containing a list of banned sites. I did this once for blocking ads/banners etc. and found it slowed everything down; I guess it MAY be because every request must be checked against the list to make sure it's allowed.
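One mitigation worth noting (my suggestion, not something tried in the thread): squidGuard can precompile plain-text domain/url lists into Berkeley DB files, which keeps lookups fast even for large lists. A sketch, with illustrative paths that must match your squidGuard.conf:

```shell
# Compile every list referenced in squidGuard.conf into .db files:
squidGuard -C all -c /etc/squid/squidGuard.conf

# Make sure the squid user can read the compiled databases
# (the db path here is illustrative):
chown -R squid:squid /var/lib/squidGuard/db

# Tell Squid to reload its configuration:
squid -k reconfigure
```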
 
Old 10-03-2005, 01:05 AM   #13
m4dj4ck
Member
 
Registered: Aug 2004
Location: the coven
Distribution: slackies
Posts: 55

Rep: Reputation: 15
I don't think it is a squidGuard problem. It would be best if you could paste the relevant parts of your cache.log file. Maybe you don't have enough file descriptors available to your Squid; the default is 1024. If you build from source, you can specify the number of file descriptors before compiling. In my case here, I run 5120 with about 200 users. You might as well check the access.log file too: if its size goes beyond 1 GB, it will slow things down. Just my 2 cents.
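Both of those are quick to check (a sketch; log locations vary between distributions):

```shell
# File-descriptor limit the Squid process will inherit:
ulimit -n

# Size of the Squid logs -- a multi-GB access.log can hurt
# (paths assume a Red Hat-style layout):
ls -lh /var/log/squid/access.log /var/log/squid/cache.log
```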
 
Old 10-03-2005, 03:23 AM   #14
vivekthemind
LQ Newbie
 
Registered: Sep 2005
Location: Bangalore, India
Posts: 12

Original Poster
Rep: Reputation: 0
Dear friends,
First of all I would like to say thanks to all of you.
After all this, my Squid is running fine; the only thing remaining is to watch it for a few more days.

To clarify my point about squidGuard: earlier I was using a block list which had more than 1.5 million (15 lakh) domains and URLs. When I ran Squid, my hard disk was working flat out and squidGuard was using my whole CPU, and eventually my box would hang. Then I reduced the block list to around 50k, and now it's working fine.
(Somewhere I read not to use a heavy block list, as it will make your box slow.)
Anyway, now I want to limit my bandwidth, and I heard Squid has delay pools for that.
Any good docs about delay pools or how to implement them?
Thanks in advance...
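For reference, a minimal delay-pool sketch (the numbers are examples only; the "snip" ACL is the one already defined in the squid.conf above, and a class-1 pool throttles aggregate bandwidth for everyone matching the ACL):

```
# One class-1 pool: a single aggregate bucket for all matching clients.
delay_pools 1
delay_class 1 1
# Refill at ~64 KB/s with a 256 KB burst allowance (restore/max, in bytes):
delay_parameters 1 65536/262144
# Apply the pool to the LAN ACL:
delay_access 1 allow snip
delay_access 1 deny all
```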
 
Old 10-03-2005, 04:23 AM   #15
m4dj4ck
Member
 
Registered: Aug 2004
Location: the coven
Distribution: slackies
Posts: 55

Rep: Reputation: 15
check out Squid's user guide:

http://squid-docs.sourceforge.net/la...tml/book1.html

It should be easy to set up if you follow the guide closely.
 
  

