Old 08-15-2007, 05:19 PM   #1
360
Member
 
Registered: Jun 2001
Distribution: FC4
Posts: 136

Rep: Reputation: 15
using cron and wget to hit a php script


Hi,

I have a PHP script in a web page that needs to run every minute. I'd like to use cron and wget with no output; just hit it and die. In all my research, I've come up with the following and put it in /etc/crontab:

* * * * * wget -q --spider http://mysite.com/index.php?option=com_acajoom&act=cron >/dev/null 2>&1

However, even without the options, it doesn't seem to work. Am I missing something, or are there better options for hitting it quickly and exiting with no output?

360

Last edited by 360; 08-15-2007 at 05:23 PM.
 
Old 08-15-2007, 05:39 PM   #2
gilead
Senior Member
 
Registered: Dec 2005
Location: Brisbane, Australia
Distribution: Slackware64 14.0
Posts: 4,141

Rep: Reputation: 168
Have you tried using the full path to wget? Also, try sending the output to a file instead of /dev/null and see if it gives you some more info.
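For instance, an entry along these lines would show what cron is actually doing (the log path /tmp/wget-cron.log is only an illustration, the URL is quoted so the shell doesn't split it, and this assumes a per-user crontab; a line in /etc/crontab also needs a user field before the command):
Code:
* * * * * /usr/bin/wget --spider "http://mysite.com/index.php?option=com_acajoom&act=cron" >>/tmp/wget-cron.log 2>&1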
 
Old 08-15-2007, 05:51 PM   #3
ilikejam
Senior Member
 
Registered: Aug 2003
Location: Glasgow
Distribution: Fedora / Solaris
Posts: 3,109

Rep: Reputation: 97
Probably a good idea to wrap the URL in 'quotes' as well.
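cron hands the command to /bin/sh, so an unquoted & acts as a background operator: wget is launched with only the option=com_acajoom part of the URL, and act=cron >/dev/null 2>&1 becomes a separate, do-nothing assignment. A quoted version of the entry (same URL as in the first post) would look roughly like this:
Code:
* * * * * /usr/bin/wget -q --spider "http://mysite.com/index.php?option=com_acajoom&act=cron" >/dev/null 2>&1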

Dave
 
Old 08-15-2007, 07:08 PM   #4
360
Member
 
Registered: Jun 2001
Distribution: FC4
Posts: 136

Original Poster
Rep: Reputation: 15
Hi,

Thanks for the tips... I tried them, but no luck.

In doing more research, I found:

--spider
When invoked with this option, Wget will behave as a Web spider, which means that it will not download the pages, just check that they are there. For example, you can use Wget to check your bookmarks:

wget --spider --force-html -i bookmarks.html

This feature needs much more work for Wget to get close to the functionality of real web spiders.

So I took out all the options, including the /dev/null redirection.

I included the full path to wget and tried it with and without quotes around the URL.

* * * * * /usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron"

Should I see any output at the command prompt?

How can I invoke the crontab manually to run a test?

360pro
 
Old 08-15-2007, 07:22 PM   #5
gilead
Senior Member
 
Registered: Dec 2005
Location: Brisbane, Australia
Distribution: Slackware64 14.0
Posts: 4,141

Rep: Reputation: 168
You can bypass cron and just run it from a console and see what output you get:
Code:
/usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron"
 
Old 08-15-2007, 07:29 PM   #6
360
Member
 
Registered: Jun 2001
Distribution: FC4
Posts: 136

Original Poster
Rep: Reputation: 15
Quote:
Originally Posted by gilead
You can bypass cron and just run it from a console and see what output you get:
Code:
/usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron"
gilead,

It ran just fine... the PHP script was invoked and I received the output of the script via email.

Thanks for your time on this matter.

I will keep looking for the bug in cron...

360

Last edited by 360; 08-16-2007 at 12:04 AM.
 
Old 08-15-2007, 10:18 PM   #7
gilead
Senior Member
 
Registered: Dec 2005
Location: Brisbane, Australia
Distribution: Slackware64 14.0
Posts: 4,141

Rep: Reputation: 168
No problem - glad it helped...
 
Old 08-16-2007, 01:19 AM   #8
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Rocky 9.2
Posts: 18,430

Rep: Reputation: 2788
Generally, if cron has a problem it'll email the root account (or whoever owns the relevant crontab).
Try running
mailx
as the crontab owner (e.g. root) to read that mail.
Also, as mentioned, try redirecting stdout/stderr to a file until you get it working.
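For example, with a Vixie-style cron (the Fedora default), either of these approaches should surface the errors; the MAILTO line and the log path are just illustrations:
Code:
# let cron mail the job's output to root ...
MAILTO=root
* * * * * /usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron"
# ... or capture it in a file instead
# * * * * * /usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron" >>/tmp/acajoom-cron.log 2>&1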
 
Old 08-16-2007, 01:56 PM   #9
gilead
Senior Member
 
Registered: Dec 2005
Location: Brisbane, Australia
Distribution: Slackware64 14.0
Posts: 4,141

Rep: Reputation: 168
Quote:
Originally Posted by 360
I will keep looking for the bug in cron...
That'll teach me to read more carefully...

As chrism01 posted, cron should mail you if it has a problem. If that isn't happening for whatever reason, you can try to force it with this (change the path to mail or mailx and the recipient name for your system):

Code:
/usr/bin/wget "http://mysite.com/index.php?option=com_acajoom&act=cron" | /usr/bin/mail -s "Output of com_acajoom" root
 
Old 06-20-2012, 04:03 PM   #10
EricFleet
LQ Newbie
 
Registered: Jun 2012
Posts: 1

Rep: Reputation: Disabled
Solution

I was having the same problem as the person who started this thread. After trying about five million combinations, I found that

wget -q -O /dev/null "[your url here]" produces no output at all, including the page itself. --spider wasn't working properly, or at least not the way I would expect.
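For the record, combining that with the full path and the quoting discussed earlier, the once-a-minute entry would look something like this (URL and paths as used above; again assuming a per-user crontab):
Code:
* * * * * /usr/bin/wget -q -O /dev/null "http://mysite.com/index.php?option=com_acajoom&act=cron"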
 
  


Tags: cron, wget


