Programming
This forum is for all programming questions. The question does not have to be directly related to Linux, and any language is fair game.
I have a PHP script on a web page that needs to run every minute. I'd like to use cron and wget with no output; just hit it and die. From my research, I've come up with the following and put it in /etc/crontab:
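The original entry isn't reproduced here, but an every-minute entry of that shape might look like the following sketch. The wget path and URL are illustrative, taken from a command that appears later in this thread:

```shell
# Hypothetical /etc/crontab entry: hit the page every minute, discard all output.
# The user field ("root") is required in /etc/crontab (unlike a per-user crontab).
* * * * * root /usr/bin/wget -q -O /dev/null "http://mysite.com/index.php?option=com_acajoom&act=cron"
```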
When invoked with this option, Wget will behave as a Web spider, which means that it will not download the pages, just check that they are there. For example, you can use Wget to check your bookmarks:
wget --spider --force-html -i bookmarks.html
This feature needs much more work for Wget to get close to the functionality of real web spiders.
So I took out all the options, including the /dev/null redirect.
I included the direct path and tried it with and without quotes around the url.
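Since the response body doesn't matter for a hit-and-die request, the --spider option quoted above is one way to avoid downloading anything at all. A sketch (not the poster's exact command; the URL is from later in this thread):

```shell
# --spider : request the URL but don't save the page
# -q       : suppress wget's progress output
# Quoting the URL keeps the shell/cron from interpreting the & in it.
/usr/bin/wget -q --spider "http://mysite.com/index.php?option=com_acajoom&act=cron"
```

One caveat: --spider may issue a HEAD request rather than a GET. Most PHP setups execute the script either way, but it's worth verifying on your server before relying on it.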
Generally, if cron has a problem it'll email the root account (or whoever owns the relevant crontab).
Check that mail as the crontab owner, e.g. root.
Also, as mentioned try redirecting stdout/stderr to a file until you get it working.
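That debugging advice can be sketched as a crontab line that appends both streams to a log file (the log path is an assumption; any writable path works):

```shell
# Capture stdout and stderr to a log until the job is working reliably.
# 2>&1 matters here: wget writes its status messages to stderr.
* * * * * root /usr/bin/wget -O /dev/null "http://mysite.com/index.php?option=com_acajoom&act=cron" >> /tmp/acajoom-cron.log 2>&1
```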
As chrism01 posted, cron should mail you if it has a problem. If that isn't happening for whatever reason, you can try to force it with this (change the path to mail or mailx and the recipient name for your system). Note that wget normally saves the page to a file and writes its status messages to stderr, so -O - and 2>&1 are needed to get everything into the mail:
/usr/bin/wget -O - "http://mysite.com/index.php?option=com_acajoom&act=cron" 2>&1 | /usr/bin/mail -s "Output of com_acajoom" root