LinuxQuestions.org
Old 04-18-2008, 12:16 AM   #1
cooljai
Member
 
Registered: May 2007
Location: /dev/random
Distribution: CentOS, Fedora, RHEL, SuSE
Posts: 62

Rep: Reputation: 15
Squid: one site too slow, cache problem also


Dear members,

We have two squid servers (2.5.STABLE12), the first on RHEL 4 and the other on FreeBSD 6.1.

Users are connected to Squid Servers, and Squid Servers are connected to Gateway.

My problem is that our own website, say mysite.com, is loading very slowly, and every user is complaining.

If we try to access our site after changing our PCs' gateway to bypass squid, it works fine.

We also need to block any caching of our website. No pages or images from our website should be cached at all, because it is a dynamic site and we need to see the latest content quickly after uploads.

Here is my squid.conf (from the BSD squid, though both are identical); please have a look and advise:
Code:
http_port 3128
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY
cache_mem 512 MB
cache_dir ufs /usr/local/squid/cache 15360 16 256
cache_access_log /usr/local/squid/logs/access.log
cache_log /dev/null
cache_store_log none
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours
auth_param basic casesensitive off
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern .               0       20%     4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl CONNECT method CONNECT
acl ournet src 192.168.0.0/255.255.0.0
acl except src 192.168.50.199/255.255.255.255
acl vpn src 10.0.0.0/255.0.0.0
acl block_url url_regex "/etc/sites.txt"
acl nocachedomains dstdomain .mysite.com
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow except
http_access deny ournet block_url
http_access allow ournet
http_access allow vpn
http_access deny all
always_direct allow nocachedomains
no_cache deny nocachedomains
http_reply_access allow all
icp_access allow all
httpd_accel_port 80
httpd_accel_host virtual
httpd_accel_with_proxy on
httpd_accel_uses_host_header on
logfile_rotate 7
coredump_dir /usr/local/squid/cache
 
Old 04-18-2008, 04:39 AM   #2
salasi
Senior Member
 
Registered: Jul 2007
Location: Directly above centre of the earth, UK
Distribution: SuSE, plus some hopping
Posts: 3,901

Rep: Reputation: 775
Quote:
My problem is that our own website, say mysite.com, is loading very slowly, and every user is complaining.
Sorry, I am unclear: are you saying that this is a problem that only happens on your internal site, or on many sites, one of which happens to be hosted/owned by you?

If it is internal-site-specific, it is probably a nameserving problem, with nameserving incorrectly configured on the squid machines (e.g., nameserving is configured correctly somewhere else in the system, but the machines running squid are configured to use a nameservice that is not present, so the squid machines wait until the 'bad' nameservice times out before using the 'good' one).

Quote:
We also need to block any caching of our website. No pages or images from our website should be cached at all, because it is a dynamic site and we need to see the latest content quickly after uploads.
Really, you don't. If it is your own site and you have control of the code, the HTML can mark the appropriate sections as not to be cached. That is the correct way of handling this problem.
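For illustration (a sketch, not from the thread): the standard way to mark content as uncacheable is with HTTP response headers like these; the header names are standard HTTP/1.1 and HTTP/1.0, but how you set them depends on the site's server or framework:

```
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
```

`Cache-Control` is honoured by HTTP/1.1 caches such as squid; `Pragma` and `Expires` cover older HTTP/1.0 clients and caches.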

If you need a work-around, because for some reason you can't or won't write the code correctly, you can probably force squid not to cache specific sites, you can set 'no proxy' in the browser for specific sites, or you can force refreshes from the browser. But, as I say, these are work-arounds for not doing the coding correctly and are not the preferred solution (IMHO).
 
Old 04-18-2008, 09:10 AM   #3
cooljai
Member
 
Registered: May 2007
Location: /dev/random
Distribution: CentOS, Fedora, RHEL, SuSE
Posts: 62

Original Poster
Rep: Reputation: 15
Thanks a lot for your kind reply.

The site is hosted somewhere else, and what users complain about repeatedly is that they upload new content but their browser displays the old version (probably from cache); sometimes it displays fine. I need a way to make sure that squid fetches the latest/updated content, or does not cache this particular website at all.

I have added a new acl (as in the config in my first post) and inserted a new "no_cache deny acl" rule to avoid caching the site. I can also see from the squid access logs that all hits for that site are "DIRECT", but users still sometimes see old content (though it is always fine if we bypass squid).
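One way to check whether squid is really bypassing the cache for the site is to count TCP_HIT entries for its domain in access.log. A minimal sketch, where the sample log lines, the /tmp path, and mysite.com are all made up for illustration; with a real setup you would grep /usr/local/squid/logs/access.log directly:

```shell
# Two fabricated access.log lines: a cache miss fetched DIRECT,
# and a cache hit served from the local store.
cat > /tmp/access.sample <<'EOF'
1208500000.123    45 192.168.0.10 TCP_MISS/200 5120 GET http://mysite.com/index.php - DIRECT/203.0.113.7 text/html
1208500001.456     2 192.168.0.11 TCP_HIT/200 20480 GET http://mysite.com/logo.gif - NONE/- image/gif
EOF
# Any TCP_HIT for the domain means an object was served from cache
# despite the no_cache deny rule:
grep 'mysite\.com' /tmp/access.sample | grep -c 'TCP_HIT'
```

A count of zero would confirm the no_cache rule is working; here the sample data deliberately contains one hit.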

About the delay: it might be a DNS issue. Can you please suggest a good solution for this? Should I run a DNS service on the squid server itself, or point it to the nearest DNS server?

Thanks in advance...
 
Old 04-21-2008, 03:10 AM   #4
salasi
Senior Member
 
Registered: Jul 2007
Location: Directly above centre of the earth, UK
Distribution: SuSE, plus some hopping
Posts: 3,901

Rep: Reputation: 775
Quote:
Originally Posted by cooljai View Post
The site is hosted somewhere else, and what users complain about repeatedly is that they upload new content but their browser displays the old version (probably from cache); sometimes it displays fine. I need a way to make sure that squid fetches the latest/updated content, or does not cache this particular website at all.
For most browsers, there is a way to force a reload of the page (Ctrl+R, for example). In this case you should get the 'clean', uncached page. Does this help your users?

As I previously commented, you can write HTML that instructs the browser and squid not to cache the page, so with correctly written HTML this shouldn't be an issue. In my experience squid, at least, respects these no-caching instructions. But that's only with correctly written HTML... and it may be that some browsers behave in ways that violate the HTML standards. I have no real experience with this specific aspect of how browsers perform, but somehow, while I was writing that, the word "Microsoft" kept coming into my mind.

I suppose once you get a reputation as an abuser of standards, people may jump to the conclusion that it is your fault (just because it usually is)... Ho, hum.

Quote:
I have added a new acl (as in the config in my first post) and inserted a new "no_cache deny acl" rule to avoid caching the site. I can also see from the squid access logs that all hits for that site are "DIRECT", but users still sometimes see old content (though it is always fine if we bypass squid).
That's a bit of a mystery to me. I was expecting it still to be 'bad' some of the time because I was expecting the remaining problems to be down to browser caching, so I was expecting occasional problems even with squid bypassed.

Quote:
About the delay: it might be a DNS issue. Can you please suggest a good solution for this? Should I run a DNS service on the squid server itself, or point it to the nearest DNS server?
More or less all I know about squid, I know from squid.conf. From what I've figured out there, you can't directly control where squid gets name lookups from; it uses the standard system setup for that, so you have to configure the standard networking pieces (although there is an implication that squid can cache the name lookups it gets from the standard services; whether this is better or worse than a local dedicated name cache is unclear to me).

So you need to use the standard tools (primarily dig) to look at your nameserving performance, and you need to check your hosts file and nsswitch.conf to make sure the order of nameservices is not wrong (the most common error seems to be having a non-existent or unconfigured service first in the list; the resolver tries that service first, times out when it doesn't respond, and only then uses the 'good' service; as you can imagine, this really slows down every name lookup).
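The failure mode described above can be sketched with a made-up resolv.conf (the addresses and the /tmp path are assumptions for illustration): the resolver tries nameservers in the order listed, so a dead server in the first slot adds its full timeout to every single lookup before the working server is consulted.

```shell
# Fabricated resolv.conf illustrating a dead first nameserver.
cat > /tmp/resolv.conf.sample <<'EOF'
# dead/unreachable server: every lookup stalls here first
nameserver 10.0.0.99
# working server, consulted only after the first one times out
nameserver 192.168.0.1
options timeout:2 attempts:1
EOF
# Resolvers use nameserver lines in listed order; the first entry
# (the one every lookup waits on) is:
grep '^nameserver' /tmp/resolv.conf.sample | head -n1
```

With the ordering above and `timeout:2`, every lookup would pay roughly a two-second penalty; swapping the two `nameserver` lines removes it.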

As 'internal' (to your own network) hosts are often handled differently from hosts out on the wider internet, that might point to where a misconfiguration lies, particularly if this site was once hosted internally and has since been moved.
 
Old 08-25-2009, 04:04 AM   #5
liviun
LQ Newbie
 
Registered: Nov 2008
Posts: 1

Rep: Reputation: 0
Put internal IP

So, to solve the problem, enter your local (internal) IP address in the http_port option in your squid.conf, like this:

Code:
http_port 192.168.1.1:3128
 
  

