ericcarlson
10-14-2008 06:31 AM
Using wget to test if a website is up
I have a site which every now and again locks up. I know why - the server has too little memory, but the upgrade isn't until next year now. When it locks up I can log in and run "service httpd restart" and all's well for a few more hours/days, whatever. What I want is to automate this with a 10-minute cron job, so it would wget http://www.example.com/index.php and, if it took more than 30 secs to return, restart the httpd service.
I looked at wget and can't see how to tell from a bash script whether it failed because of the timeout. I know most of the time it won't fail, so I just want to throw away whatever comes back. Anyone got any tips, or a better way to do it? Thanks.
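For what it's worth, here's one way this could be sketched: wget exits with a non-zero status on any failure (a timeout shows up as a network-failure exit code), so the script doesn't need to distinguish the timeout case specially - any failed fetch can trigger the restart. The script name and path below are just placeholders; adjust the URL and the restart command to your setup.

```shell
#!/bin/sh
# Hypothetical watchdog script (e.g. /usr/local/bin/check-site.sh).
# Fetch the page with a 30-second timeout and a single attempt;
# discard the output (-O /dev/null) and stay quiet (-q).
# If wget exits non-zero (timeout, refused connection, etc.),
# restart Apache.
URL="http://www.example.com/index.php"

if ! wget --timeout=30 --tries=1 -q -O /dev/null "$URL"; then
    service httpd restart
fi
```

Then a crontab entry along the lines of `*/10 * * * * /usr/local/bin/check-site.sh` (in root's crontab, since restarting httpd needs root) would run the check every 10 minutes. Note `--tries=1` matters: without it, wget retries several times before giving up, so a single check could take far longer than 30 seconds.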