Programming: This forum is for all programming questions. The question does not have to be directly related to Linux, and any language is fair game.
08-15-2007, 05:19 PM | #1
Member | Registered: Jun 2001 | Distribution: FC4 | Posts: 136
using cron and wget to hit a php script
Hi,
I have a php script in a web page that needs to run every minute. I'd like to use cron and wget with no output: just hit it and die. In all my research, I've come up with the following and put it in /etc/crontab:
* * * * * wget -q --spider http://mysite.com/index.php?option=com_acajoom&act=cron >/dev/null 2>&1
However, even without the options, it doesn't seem to work. Am I missing something, or are there better options for hitting it quickly and dying with no output?
360
Last edited by 360; 08-15-2007 at 05:23 PM.
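A hedged sketch of a likely fix, assuming the line really went into the system-wide /etc/crontab: that file takes a sixth field naming the user to run as, and the unquoted & in the URL is read by the shell as a background operator, cutting the command off after option=com_acajoom. The root user field below is an assumption; the URL and paths are from the post above:

```shell
# /etc/crontab format: minute hour day-of-month month day-of-week USER command
# Quoting the URL keeps the shell from treating "&" as "run in background".
* * * * * root /usr/bin/wget -q --spider "http://mysite.com/index.php?option=com_acajoom&act=cron" >/dev/null 2>&1
```

A per-user crontab (edited with crontab -e) uses the same line without the user field.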
08-15-2007, 05:39 PM | #2
Senior Member | Registered: Dec 2005 | Location: Brisbane, Australia | Distribution: Slackware64 14.0 | Posts: 4,141
Have you tried using the full path to wget? Also, try sending the output to a file instead of /dev/null and see if it gives you some more info.
08-15-2007, 05:51 PM | #3
Senior Member | Registered: Aug 2003 | Location: Glasgow | Distribution: Fedora / Solaris | Posts: 3,109
Probably a good idea to wrap the URL in 'quotes' as well.
Dave
08-15-2007, 07:08 PM | #4
Member | Registered: Jun 2001 | Distribution: FC4 | Posts: 136 | Original Poster
Hi,
Thanks for the tips... I tried them, but no luck.
In doing more research, I found:
--spider
When invoked with this option, Wget will behave as a Web spider, which means that it will not download the pages, just check that they are there. For example, you can use Wget to check your bookmarks:
wget --spider --force-html -i bookmarks.html
This feature needs much more work for Wget to get close to the functionality of real web spiders.
So I took out all the options, including the /dev/null redirection.
I included the direct path and tried it with and without quotes around the url.
* * * * * /usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron"
Should I see any output at the command prompt?
How can I invoke the crontab entry manually to test it?
360pro
08-15-2007, 07:22 PM | #5
Senior Member | Registered: Dec 2005 | Location: Brisbane, Australia | Distribution: Slackware64 14.0 | Posts: 4,141
You can bypass cron and just run it from a console and see what output you get:
Code:
/usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron"
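To get even closer to cron's conditions, the same command can be run under a stripped-down environment, since cron supplies a much sparser PATH than an interactive shell. This is a sketch; the minimal PATH below is an illustrative assumption, and the URL is the one from the thread:

```shell
# env -i clears the environment entirely; hand back only HOME and a minimal
# PATH (roughly what cron provides), then run the command under /bin/sh,
# the shell cron itself uses by default.
env -i HOME="$HOME" PATH=/usr/bin:/bin /bin/sh -c \
    '/usr/bin/wget -q --spider "http://mysite.com/index.php?option=com_acajoom&act=cron"'
```

If the command works interactively but fails this way, the problem is almost certainly an environment variable the cron job is missing.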
08-15-2007, 07:29 PM | #6
Member | Registered: Jun 2001 | Distribution: FC4 | Posts: 136 | Original Poster
Quote:
Originally Posted by gilead
You can bypass cron and just run it from a console and see what output you get:
Code:
/usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron"
gilead,
It ran just fine... the php script was invoked and I received the output of the script via email.
Thanks for your time on this matter.
I'll keep looking for the bug in cron...
360
Last edited by 360; 08-16-2007 at 12:04 AM.
08-15-2007, 10:18 PM | #7
Senior Member | Registered: Dec 2005 | Location: Brisbane, Australia | Distribution: Slackware64 14.0 | Posts: 4,141
No problem - glad it helped...
08-16-2007, 01:19 AM | #8
LQ Guru | Registered: Aug 2004 | Location: Sydney | Distribution: Rocky 9.2 | Posts: 18,430
Generally, if cron has a problem it'll email the root account (or whoever owns the relevant crontab).
Try running
mailx
as the crontab owner, e.g. root.
Also, as mentioned, try redirecting stdout/stderr to a file until you get it working.
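Following that advice, a temporary per-user crontab entry along these lines captures everything for inspection; the log path is illustrative, and -q is dropped so wget actually prints something:

```shell
# wget writes its progress and errors to stderr, so redirect both streams
# into the log file to see why the job fails under cron:
* * * * * /usr/bin/wget --spider "http://mysite.com/index.php?option=com_acajoom&act=cron" >>/tmp/acajoom-cron.log 2>&1
```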
08-16-2007, 01:56 PM | #9
Senior Member | Registered: Dec 2005 | Location: Brisbane, Australia | Distribution: Slackware64 14.0 | Posts: 4,141
Quote:
Originally Posted by 360
I will continue to find the bug in cron...
That'll teach me to read more carefully...
As chrism01 posted, cron should mail you if it has a problem. If that isn't happening for whatever reason, you can try to force it with this (change the path to mail or mailx and the recipient name for your system; wget needs -O - to send the fetched page to stdout, and 2>&1 folds in wget's own status messages, which go to stderr):
Code:
/usr/bin/wget -O - "http://mysite.com/index.php?option=com_acajoom&act=cron" 2>&1 | /usr/bin/mail -s "Output of com_acajoom" root
06-20-2012, 04:03 PM | #10
LQ Newbie | Registered: Jun 2012 | Posts: 1
Solution
I was having the same problem as the person who started this thread. After trying about five million combinations, I found that
Code:
wget -q -O /dev/null "[your url here]"
produces no output at all, including the page itself. --spider wasn't working properly, or at least not the way I expected.
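Putting the thread together, a per-user crontab entry (installed with crontab -e, so no user field) along these lines should hit the page every minute with no output or leftover files; the URL is the one from the earlier posts:

```shell
# -q silences wget's messages; -O /dev/null discards the fetched page
# instead of saving index.php?option=... files into the home directory.
* * * * * /usr/bin/wget -q -O /dev/null "http://mysite.com/index.php?option=com_acajoom&act=cron"
```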