Linux - Networking
This forum is for any issue related to networks or networking. Routing, network cards, OSI, etc. Anything is fair game.
Is there any way I can check the list of all visited web pages in some file? If anyone wants me to post my squid.conf file, please let me know.
Note: one strange point. When I try to delete the old cache from Webmin, it takes a very long time and almost hangs; I have to cancel it manually. So is there any way I can clear the old cache from the command line (console)?
When I try to start and stop the service, it shows FAIL while I am not connected via dial-up. Once I connect and try to start and stop the service, it shows OK. What could be the problem?
Please help me...
-Hitesh
Last edited by hitesh_linux; 09-16-2004 at 11:43 PM.
I don't know of an online monitoring tool for Squid, but I know a log analyzer called sarg. It reads Squid's access.log and generates web pages, so if you have Apache you can configure it to publish the Squid logs. You will then be able to see detailed information such as who surfed where, how much data they pulled from the net, etc. http://sarg.sourceforge.net
Squid stores its cache in /var/spool/squid/ by default on Red Hat-like distros. There can be lots of directories and subdirectories; it depends on what the "cache_dir" line in squid.conf says.
Check /var/log/messages when Squid fails to restart. Maybe it is a DNS issue: when the internet connection is offline, Squid cannot reach your ISP's DNS server. It is just an idea.
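One way to do that check is a small helper like the sketch below; it assumes Red Hat-style logging to /var/log/messages, and the function name is mine:

```shell
# Sketch: pull recent squid-related lines out of the syslog so the failure
# reason (e.g. a DNS test timing out at startup) is visible.
squid_syslog() {
    # $1: syslog file to search (defaults to /var/log/messages)
    grep -i squid "${1:-/var/log/messages}" | tail -n 20
}
```

Run it as `squid_syslog` right after a failed `service squid start` and look for DNS-related errors.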
Regarding your second question about start/stop: when you are not connected, the problem is probably that Squid is configured to do a DNS lookup on startup; disabling this should solve your problem. I don't know what distro you are using or how you start Squid.
If you are on RH or Fedora and installed Squid from an .rpm, then I'm sorry, because I have always preferred to install Squid from source. However, on RH/FC you can do this:
1) chkconfig squid off
2) then in the rc.local file put a line:
path/squid/sbin/squid -D
Replace "path" with the path where the Squid binaries are installed; -D tells Squid to start without doing the DNS test on startup.
3) Why do you delete your cache so frequently? Strange, huh? Squid is all about caching, and you periodically "delete" your cache, why? Webmin hangs because the cache directory is rather large, containing thousands of files, and it definitely takes some time to delete all those files.
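Rolled together, the steps above amount to something like this; the install path is a guess for a source install, so adjust it to wherever your Squid binary actually lives:

```shell
# run once as root: stop the init script from launching squid at boot
chkconfig squid off

# line to append to /etc/rc.d/rc.local (path is an assumption):
/usr/local/squid/sbin/squid -D   # -D = skip the DNS tests at startup
```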
I have tried sarg. I downloaded it from the site and tried to configure it. Here it is.....
Configuring sarg:
--------------------
SARG: (language) Cannot open language file: /usr/local/sarg/languages/English
[root@mail sarg-1.4.1]# ./config.status
creating Makefile
[root@mail sarg-1.4.1]# ./configure
loading cache ./config.cache
checking host system type... i686-pc-linux-gnu
checking for gcc... (cached) gcc
checking whether the C compiler (gcc -g) works... yes
checking whether the C compiler (gcc -g) is a cross-compiler... no
checking whether we are using GNU C... (cached) yes
checking whether gcc accepts -g... (cached) yes
checking for dirent.h that defines DIR... (cached) yes
checking for opendir in -ldir... (cached) no
checking how to run the C preprocessor... (cached) gcc -E
checking for ANSI C header files... (cached) yes
checking for stdio.h... (cached) yes
checking for stdlib.h... (cached) yes
checking for string.h... (cached) yes
checking for strings.h... (cached) yes
checking for sys/time.h... (cached) yes
checking for time.h... (cached) yes
checking for unistd.h... (cached) yes
checking for sys/dirent.h... (cached) no
checking for dirent.h... (cached) yes
checking for sys/socket.h... (cached) yes
checking for netdb.h... (cached) yes
checking for arpa/inet.h... (cached) yes
checking for sys/types.h... (cached) yes
checking for netinet/in.h... (cached) yes
checking for sys/stat.h... (cached) yes
checking for ctype.h... (cached) yes
checking for working const... (cached) yes
checking whether struct tm is in sys/time.h or time.h... (cached) time.h
using /usr/bin as the SARG binary
using /usr/local/man/man1 as the SARG man page
using /usr/local/sarg as the SARG configuration dir
creating ./config.status
creating Makefile
[root@mail sarg-1.4.1]#
Make:
--------
[root@mail sarg-1.4.1]# make
make: Nothing to be done for `all'.
[root@mail sarg-1.4.1]# make install
creating /usr/local/man/man1
mkdir: cannot create directory `/usr/local/man/man1': No such file or directory
make: [install] Error 1 (ignored)
cp sarg /usr/bin/sarg
chmod 755 /usr/bin/sarg
cp sarg.1 /usr/local/man/man1/sarg.1
cp: cannot create regular file `/usr/local/man/man1/sarg.1': No such file or directory
make: *** [install] Error 1
But when I click on any one of the directories, I get the following error.
Object not found!
The requested URL was not found on this server. The link on the referring page seems to be wrong or outdated. Please inform the author of that page about the error.
If you think this is a server error, please contact the webmaster
Error 404
mail.hylix.com
Wed 22 Sep 2004 08:02:41 AM IST
Apache/2.0.40 (Red Hat Linux)
And when I type 'sarg' at the console, it shows the following messages.
[root@mail root]# sarg
SARG: No records found
SARG: End
[root@mail root]#
Now, what could be the problem? Do I have to configure something in the sarg.conf file? If so, what changes should I make?
Make sure that Apache is working and publishing the Squid logs.
If you want Apache to ask for a password before allowing access to the Squid reports:
httpd.conf :
<Directory "/usr/local/apache2/htdocs/squid-reports">
deny from all
AllowOverride AuthConfig
Order deny,allow
</Directory>
and here is my .htaccess file in /usr/local/apache2/htdocs/squid-reports directory
# cat /usr/local/apache2/htdocs/squid-reports/.htaccess
AuthType Basic
AuthUserFile /usr/local/apache2/conf/.htpasswd
AuthName Squid-logs
require valid-user
satisfy any
You also need a squid.domain.com record in your DNS server; it must point to the IP of your Apache server. If not, try the IP of the Apache server instead of http://squid.domain.com.
Good luck.
Yes, it takes some time to clear the cache, whether you clear it from Webmin, the console, or anywhere else. If you want faster file caching or cache clearing, you could use the ReiserFS filesystem, as it is much more specialized in accessing and deleting small files.
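For clearing the cache from the console, a rough sketch, assuming a Red Hat-style layout; the squid.conf path and the helper name are assumptions:

```shell
# Read the cache directory out of squid.conf so the right path gets cleared.
# A cache_dir line looks like: cache_dir ufs /var/spool/squid 100 16 256
squid_cache_dir() {
    awk '$1 == "cache_dir" { print $3; exit }' "$1"
}

# Typical console procedure (as root, with squid stopped):
#   service squid stop
#   rm -rf "$(squid_cache_dir /etc/squid/squid.conf)"/*
#   squid -z          # recreate the empty swap directories
#   service squid start
```

Deleting thousands of small files will still take a while, but from the console at least nothing hangs a web UI.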
access_log /var/log/squid/access.log [this option is uncommented]
About the <output_dir /usr/local/apache2/htdocs/squid-reports> line, it shows me this:
# TAG: output_dir
# The reports will be saved in that directory
# sarg -o dir
#
#output_dir /var/www/html/squid-reports
output_dir /var/www/sarg/ONE-SHOT
So I tried <mail.hylix.com/sarg/index.html> in a web browser and it shows up, but when I click any HTML link it shows the same error. Here it is..
------------------------------------------------------------------------------------------------
Squid User's Access Report
DIRECTORY DESCRIPTION
ONE-SHOT One shot reports
daily Daily reports
weekly Weekly reports
monthly Monthly reports
Not Found
The requested URL /sarg/ONE-SHOT/index.html was not found on this server.
---------------------------------------------------------------------------------------------------
And when I check under the /usr/local path, the <apache2> directory does not exist. What should I do now? [/usr/local/apache2/htdocs/squid-reports]
I forgot to tell you, I am using Fedora Core 1....
Please help me...
-/Hitesh
Last edited by hitesh_linux; 10-08-2004 at 12:02 AM.
I use /usr/local/apache2/htdocs/squid-reports because I installed Apache 2 from source, and that is where it went by default. You can go with /var/www/sarg.
I forgot to tell you: sarg must be run first, so that it creates the web pages from the Squid access logs; then you can check them through a browser. I added a cron job so sarg runs every day and creates the web pages, and I can check the logs by day.
As I told you before, sarg is not an online monitoring tool. When you run sarg, it reads Squid's access.log and creates nice web pages for analyzing the logs; then sarg ends. You can add it as a cron job, so it will generate the web pages at the time you specify.
sarg doesn't work as a service; it just works as a program.
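A cron job for that could look like the fragment below; the report directory and run time are assumptions, so adjust them to match your sarg.conf:

```shell
# /etc/crontab fragment: run sarg nightly at 00:05 as root,
# writing the report pages where Apache can serve them
5 0 * * * root /usr/bin/sarg -o /var/www/html/squid-reports
```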
Quote:
[root@mail squid]# /usr/bin/sarg
SARG: No records found
SARG: End
It seems that Squid's access.log is empty. Check /var/log/squid/access.log.
You can read the sarg man page. Make sure that you defined where the access.log is located in sarg.conf:
access_log /path/to/squid/access.log
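A quick sanity check for the empty-log case might look like this sketch (the function name is mine; point it at whatever path your sarg.conf uses):

```shell
# Report whether there is anything in squid's access.log for sarg to read.
access_log_records() {
    # $1: path to access.log
    if [ -s "$1" ]; then
        # count the lines; each line is one logged request
        echo "records: $(wc -l < "$1" | tr -d ' ')"
    else
        echo "empty or missing: $1"
    fi
}
```

For example, `access_log_records /var/log/squid/access.log`. If it reports empty, Squid is not logging there and sarg will keep printing "No records found".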