LinuxQuestions.org

-   Linux - Newbie (https://www.linuxquestions.org/questions/linux-newbie-8/)
-   -   help with script (https://www.linuxquestions.org/questions/linux-newbie-8/help-with-script-621797/)

nightcat 02-17-2008 10:41 AM

help with script
 
Looking for a pre-written script that I could edit to suit my exact needs. I'm trying to keep a log of the sites visited by the other computers on the network and have it emailed to me at the end of the day. Does anyone know where I can look? I've searched everywhere. Thanks.

pixellany 02-17-2008 11:22 AM

Sites visited with your browser? That will be in the history file.

For logging of more general network traffic, take a look here:
http://www.google.com/search?q=log+n...L_enUS177US235

PS: You might also try searching in the Advanced Bash Scripting Guide (free at http://tldp.org)

bigrigdriver 02-17-2008 11:24 AM

Sounds like you would need the browser history from each browser on each machine. So your script will have to cycle through each machine, and through each browser's hidden folder for each user on each machine.

nightcat 02-17-2008 11:27 AM

Thanks for the help, I'll take a look.

Hey bigrigdriver, I did also think about that issue, which would make the script a lot more difficult, but I was hoping there was another way. Thanks.

bigrigdriver 02-17-2008 11:36 AM

There might be an easier way. Assuming the network accesses the internet through only one port in the firewall, you could monitor outbound traffic, watching for URLs, and record them.
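A minimal sketch of that idea, assuming a Linux gateway where tcpdump is available (the interface name, log path, and the helper function's name are all invented for illustration): capture the day's outbound HTTP traffic in ASCII, then reduce it to the hostnames seen in `Host:` headers.

```shell
# On the gateway (needs root), capture ASCII of outbound HTTP for the day:
#   tcpdump -i eth0 -l -A 'tcp dst port 80' > /var/log/http-capture.txt
# Then reduce the capture to the sites visited:
#   extract_hosts < /var/log/http-capture.txt > /var/log/visited.log

# Filter that turns raw ASCII capture output into a sorted,
# de-duplicated list of the hostnames from HTTP Host: headers.
extract_hosts() {
  grep '^Host: ' | awk '{print $2}' | tr -d '\r' | sort -u
}
```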

nightcat 02-17-2008 03:48 PM

Hmmm, good idea. I guess I can start there because that's the exact same setup that I have. I'll worry about separating the logs later.

btmiller 02-17-2008 04:09 PM

You might also want to look at forcing all Web traffic through a Squid proxy (this can be done via firewall rules, or simply by blocking port 80 outbound except from the proxy machine and requiring all workstations to connect through it). I haven't ever used Squid, but from what I've heard it has pretty good logging capabilities. Relying on browser histories isn't a good idea because they can be turned off or tampered with by the user.
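For example, firewall rules along these lines on the gateway would do the blocking btmiller describes (the proxy's address, 192.168.1.10, and the use of the FORWARD chain are assumptions about the setup, not a tested ruleset):

```shell
# Let only the Squid box reach the outside world on port 80;
# everything else must go through the proxy or be refused.
iptables -A FORWARD -p tcp --dport 80 -s 192.168.1.10 -j ACCEPT
iptables -A FORWARD -p tcp --dport 80 -j REJECT
```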

homey 02-17-2008 04:31 PM

We use Squid with the DansGuardian helper, which means the client machines are pointed to port 8080 instead of 3128, which Squid uses.
Both Squid and DansGuardian create an access.log, which is nice but not very easy to read.
For that reason, I use a Perl script to read the access.log and output an HTML file. Make that executable and run it from a crontab (once an hour for me).
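Not the script mentioned above, but a minimal shell/awk sketch of the same idea, assuming Squid's default native log format (field 3 is the client address, field 7 the URL):

```shell
# Turn a Squid access.log on stdin into a simple HTML table on stdout.
access_log_to_html() {
  echo '<table border="1"><tr><th>Client</th><th>URL</th></tr>'
  awk '{ printf "<tr><td>%s</td><td>%s</td></tr>\n", $3, $7 }'
  echo '</table>'
}

# e.g. hourly from cron (paths are examples):
# 0 * * * * access_log_to_html < /var/log/squid/access.log > /var/www/html/report.html
```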
There is a nice sample in the DansGuardian add-on section.
I also recommend Webmin for thingies like Squid and DansGuardian. On my Fedora box, the paths were incorrect, which is easy to fix.

# Also, it needed Zlib for the DansGuardian logfiles in Webmin ...
smart install perl-Compress-Zlib

# I also download a bigger blacklist from
http://squidguard.shalla.de/shallalist.html

# Automate shallalist with this script
shalla_update script

The tampering part is disabled via Active Directory group policy, which you can fine-tune quite nicely.

To email myself the HTML file, I use mutt on the command line, as it is nice for attachments (-a) and blind carbon copies (-b).
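A hypothetical crontab entry putting that together (the schedule, paths, subject, and addresses are invented for illustration):

```shell
# At 23:55 each day, mail the report as an attachment (-a) with a blind
# carbon copy (-b); "--" separates mutt's options from the recipients.
55 23 * * * echo "Daily web log attached" | mutt -s "Web report" -a /var/www/html/report.html -b backup@example.com -- admin@example.com
```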


All times are GMT -5.