using cron and wget to hit a php script
Hi,
I have a PHP script on a web page that needs to run every minute. I'd like to use cron and wget with no output; just hit it and die. In all my research, I've come up with the following and put it in /etc/crontab:
Code:
* * * * * wget -q --spider http://mysite.com/index.php?option=com_acajoom&act=cron >/dev/null 2>&1
However, even without the options, it doesn't seem to work. Am I missing something, or are there better options for hitting it quickly and dying with no output? 360 |
Have you tried using the full path to wget? Also, try sending the output to a file instead of /dev/null and see if it gives you some more info.
|
Probably a good idea to wrap the URL in 'quotes' as well.
Dave |
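The quoting tip above is worth dwelling on: cron hands the command line to /bin/sh, and an unquoted & ends the command and backgrounds it, so everything from &act=cron onward never reaches wget at all. A minimal sketch of the effect, using echo in place of wget so no network or cron is needed (the URL is the one from this thread):

```shell
# Unquoted: the shell backgrounds the echo at the "&", and "act=cron"
# becomes a separate variable assignment -- wget would only ever see
# the URL up to "option=com_acajoom".
echo http://mysite.com/index.php?option=com_acajoom&act=cron

# Quoted: the "&" is just an ordinary character inside the argument,
# so the full query string survives.
echo "http://mysite.com/index.php?option=com_acajoom&act=cron"
```

Running both lines in a shell shows the truncation directly: the first prints only the URL up to com_acajoom, the second prints the whole thing.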
Hi,
Thanks for the tips... I tried them but no luck. In doing more research, I found this in the wget documentation:
Quote:
--spider
When invoked with this option, Wget will behave as a Web spider, which means that it will not download the pages, just check that they are there. For example, you can use Wget to check your bookmarks:
wget --spider --force-html -i bookmarks.html
This feature needs much more work for Wget to get close to the functionality of real web spiders.
So I took out all the options, including the /dev/null redirection, included the direct path, and tried it with and without quotes around the URL:
Code:
* * * * * /usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron"
Should I see any output at the command prompt? How can I invoke crontab manually to run a test? 360pro |
You can bypass cron and just run it from a console and see what output you get:
Code:
/usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron" |
Quote:
It ran just fine... the PHP script was invoked and I received the output of the script via email. Thanks for your time on this matter. I will continue looking for the bug in cron... 360 |
No problem - glad it helped...
|
Generally, if cron has a problem it'll email the root acct (or whoever owns the relevant crontab).
Check that mail with mailx as the crontab owner, e.g. root. Also, as mentioned, try redirecting stdout/stderr to a file until you get it working. |
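Putting that redirection advice into a concrete crontab entry might look like the sketch below. The log path /tmp/acajoom-cron.log is just an illustrative choice, not anything from the thread; once the job works, the redirection can point back to /dev/null.

```shell
# Hypothetical debugging entry: append both stdout and stderr to a log
# file so cron failures leave a trace (note the quoted URL so the "&"
# in the query string is not treated as a shell operator).
* * * * * /usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron" >> /tmp/acajoom-cron.log 2>&1
```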
Quote:
As chrism01 posted, cron should mail you if it has a problem. If that isn't happening for whatever reason, you can try and force it with this (change the path to mail or mailx and the recipient name for your system): Code:
/usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron" | /usr/bin/mail -s "Output of com_acajoom" root |
Solution
I was having the same problem as the person who started this thread. After trying about five million combinations, I found that
Code:
wget -q -O /dev/null "[your url here]"
produces no output at all, including the page itself. --spider wasn't working properly, or at least not according to what I would expect. |
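Assembled into a full crontab entry, the working combination from this thread would look something like the line below: -q silences wget's own messages, and -O /dev/null discards the downloaded page itself.

```shell
# Final form: hit the URL every minute, no output anywhere.
* * * * * /usr/bin/wget -q -O /dev/null "http://mysite.com/index.php?option=com_acajoom&act=cron"
```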