Old 10-14-2008, 06:31 AM   #1
ericcarlson
Member
 
Registered: Jan 2002
Posts: 162

Rep: Reputation: 30
Using wget to test if a website is up


I have a site which every now and again locks up. I know why - the server has too little memory, but the upgrade isn't until next year. When it locks up I can log in, run "service httpd restart", and all is well for a few more hours or days. What I want is to automate this with a 10-minute cron job: it would wget http://www.example.com/index.php and, if the request took more than 30 seconds to return, restart the httpd service.

I looked at wget and can't see how to tell from a bash script whether it failed because of the timeout. I know most of the time it won't fail, so I just want it to throw away whatever comes back. Anyone got any tips, or a better way to do it? Thanks.
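
Something like the following crontab entry is what I have in mind (the script name and path are just placeholders I'd fill in once I know how to do the check):
Code:
# run the check every 10 minutes (script path is only an example)
*/10 * * * * /usr/local/bin/check-site.sh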
 
Old 10-14-2008, 08:17 AM   #2
clvic
Member
 
Registered: Feb 2008
Location: Rome, Italy
Distribution: OpenSuSE 11.x, vectorlinux, slax, Sabayon
Posts: 206
Blog Entries: 2

Rep: Reputation: 45
Other than using wget, I'll go a bit off-topic and suggest you use software such as monit (http://tildeslash.com/monit/), which is easy to use but at the same time very flexible, and will likely accommodate future needs as well.
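
As a rough idea only, a monit rule for your case could look something like this (the pidfile path, URL and timing are guesses; check the monit docs for the exact syntax):
Code:
# sketch of a monit service check, assuming httpd writes /var/run/httpd.pid
check process httpd with pidfile /var/run/httpd.pid
    start program = "/sbin/service httpd start"
    stop program  = "/sbin/service httpd stop"
    # restart if the test page takes longer than 30 seconds to answer
    if failed host www.example.com port 80 protocol http
       request "/index.php" with timeout 30 seconds
       then restart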
 
Old 10-14-2008, 08:23 AM   #3
David1357
Senior Member
 
Registered: Aug 2007
Location: South Carolina, U.S.A.
Distribution: Ubuntu, Fedora Core, Red Hat, SUSE, Gentoo, DSL, coLinux, uClinux
Posts: 1,302
Blog Entries: 1

Rep: Reputation: 107
Quote:
Originally Posted by ericcarlson View Post
I looked at wget and can't see how to tell from a bash script whether it failed because of the timeout. I know most of the time it won't fail, so I just want it to throw away whatever comes back. Anyone got any tips, or a better way to do it? Thanks.
If you use syntax similar to the following
Code:
[machine:~]:wget --timeout=0.1 --tries=1 http://www.linuxquestions.org/questions/linux-software-2/using-wget-to-test-if-a-website-is-up-676231/
--09:20:57--  http://www.linuxquestions.org/questions/linux-software-2/using-wget-to-test-if-a-website-is-up-676231/
           => `index.html.1'
Resolving www.linuxquestions.org... failed: Connection timed out.
[machine:~]:echo "$?"
1
but change the timeout to something reasonable, you can check the exit status (non-zero on failure, as the echo "$?" above shows) or parse the output for the failure reason (e.g. "Connection timed out").
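
To tie that in with your cron idea, a wrapper script along these lines might do (the URL, timeout and restart command are assumptions to adjust for your box):
Code:
#!/bin/bash
# check-site.sh - sketch: restart httpd if the page does not answer in time
URL="http://www.example.com/index.php"

# -q keeps cron mail quiet, -O /dev/null throws the page away
if ! wget -q -O /dev/null --timeout=30 --tries=1 "$URL"; then
    # wget exits non-zero on a timeout or any other failure
    service httpd restart
fi
You would then call it from the 10-minute cron entry you mentioned.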
 
Old 10-14-2008, 08:36 AM   #4
Vit77
Member
 
Registered: Jun 2008
Location: Toronto, Canada
Distribution: SuSE, RHEL, Mageia
Posts: 132

Rep: Reputation: 17
If you use a stand-alone httpd server, maybe it would be more reasonable to use nmap to check whether the port is alive?
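
For example, something along these lines could work from a script (host name assumed; note it only tells you the port is open, not that httpd is actually serving pages):
Code:
# succeeds if nmap reports port 80 open on the host
nmap -p 80 www.example.com | grep -q "80/tcp open"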
 
  

