Hi,
I've written a cron script to monitor a website for downtime. Essentially it wgets the website, greps the result for a known string, and emails me if the string doesn't show up. It also checks that it has been at least an hour since the last alert, so my mailbox doesn't fill up:
Code:
#!/bin/sh
# Remember when this run happened
date > runlast

y="The following services on OpenMRS.org are broken! Oh noes! "
z=0

# Fetch each page and grep it for a string that should always be there.
# If the string is missing, add that site to the message and set the flag.
wget demo.openmrs.org -O openmrs1.html
x=`grep '/openmrs/dwr/interface/DWRAlertService.js' openmrs1.html`
if [ -z "$x" ]
then
    y=$y"demo.openmrs.org "
    z=1
fi

wget openmrs.org -O openmrs2.html
x=`grep 'Meeting,Community,Developers,Developers' openmrs2.html`
if [ -z "$x" ]
then
    y=$y"openmrs.org "
    z=1
fi

wget dev.openmrs.org -O openmrs3.html
x=`grep '/wiki/WikiStart?action=diff' openmrs3.html`
if [ -z "$x" ]
then
    y=$y"dev.openmrs.org "
    z=1
fi

wget forum.openmrs.org -O openmrs4.html
x=`grep 'templates/subSilver/images/openmrs-small-logo.gif' openmrs4.html`
if [ -z "$x" ]
then
    y=$y"forum.openmrs.org "
    z=1
fi

# If anything looks down, log it, but only send mail if the last alert
# was at least an hour ago.
if test $z -eq 1
then
    a=`cat olddate`              # epoch time of the last alert
    b=`perl -le 'print time'`    # current epoch time
    c=`expr $b - $a`             # seconds since the last alert
    echo "`date` $y$c" >> openmrslog
    if test $c -ge 3600
    then
        echo $b > olddate
        echo "$y" | mail me@me.com
    fi
fi
The script works when I run it over ssh, but under cron the mail part works while the wgets fail. That is, if html files from a previous run are already there (and contain the strings), it never emails me, regardless of whether the sites are actually down: it isn't re-fetching them. Along the same line, if the files aren't already there (e.g. openmrs1.html, openmrs2.html, ... are missing), it doesn't try to download them and just emails me. I don't understand this -- why does it work via ssh but not via cron, and, more importantly, how can I fix it?!
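To try to narrow it down, I was thinking of having cron run a tiny throwaway script like the sketch below, just to see what working directory and PATH the job actually gets compared with my ssh session (the log path is only an example location in my home directory):
Code:
#!/bin/sh
# Throwaway test: record the environment cron hands the script.
# /u/1/a/ata2114/cronenv.log is just an example path, not something the
# monitoring script uses.
echo "== `date` ==" >> /u/1/a/ata2114/cronenv.log
echo "cwd:  `pwd`"  >> /u/1/a/ata2114/cronenv.log
echo "PATH: $PATH"  >> /u/1/a/ata2114/cronenv.log
which wget          >> /u/1/a/ata2114/cronenv.log 2>&1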
My crontab -l output is:
Code:
0,10,20,30,40,50 * * * * /u/1/a/ata2114/openmrs.sh>/dev/null 2>&1
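I realize the >/dev/null 2>&1 means I never see whatever error wget prints under cron, so I may also switch the entry to log to a file instead, something along these lines (again, the log path is just an example):
Code:
0,10,20,30,40,50 * * * * /u/1/a/ata2114/openmrs.sh >> /u/1/a/ata2114/cron.log 2>&1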
Any help would be appreciated!
Adam