Old 06-30-2011, 12:55 PM   #1
ScorchPipe
Member
 
Registered: Sep 2010
Posts: 38

Rep: Reputation: 0
Call website with bash


Hi

My friends and I have a website about games and stuff.
The web host we use clears the app pool after it has been idle for 15 minutes. We don't have many users yet, so this is a problem: the site takes ages to load once it has gone idle.

I came up with the brilliant idea of calling the website with a script from my private server at home every 5 minutes or so. That way it would never go down.

I used some lines from another of my scripts:

Code:
wget -q -O - http://homepage.se/ > /path/to/a/file.txt
cat /dev/null > /path/to/a/file.txt

and put it in crontab.
The script runs fine, and I think the site is faster now, but we can't know for sure that it's the script's doing. Sometimes it's just as slow as ever, and I start to think the script doesn't do anything.

I've tried adding more wget lines to fetch different pages from the site, and I now run it every 3 minutes. No difference...

Does anyone know another way to call a website that would solve my problem? Or am I maybe using the wrong options with wget?

I've tried lynx (piping its output to a file), but it didn't work.
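
For what it's worth, the crontab entry looks something like this (the script path is just a placeholder for wherever the script actually lives):

Code:
# call the site every 5 minutes (script path is a placeholder)
*/5 * * * * /path/to/keepalive.sh >/dev/null 2>&1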
 
Old 06-30-2011, 01:07 PM   #2
ButterflyMelissa
Senior Member
 
Registered: Nov 2007
Location: Somewhere on my hard drive...
Distribution: Manjaro
Posts: 2,766
Blog Entries: 23

Rep: Reputation: 411
Hey there!

That approach should do nicely! Downloading a tiny file and thus keeping the site "alive"...in fact:

Quote:
I came up with the brilliant idea of calling the website with a script
is the best way to describe this.
I'd do it like that myself! So...just what help (if any) did you need?

Happy gaming!

Thor
 
Old 06-30-2011, 01:15 PM   #3
ScorchPipe
Member
 
Registered: Sep 2010
Posts: 38

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by Thor_2.0 View Post
Hey there!

That approach should do nicely! Downloading a tiny file and thus keeping the site "alive"...in fact:

Quote:
I came up with the brilliant idea of calling the website with a script
is the best way to describe this.
I'd do it like that myself! So...just what help (if any) did you need?

Happy gaming!

Thor
Well, the site is still slow to load sometimes, so we're not sure if the script works.

So I wondered if anyone has a better solution.
 
Old 06-30-2011, 01:43 PM   #4
aysheaia
LQ Newbie
 
Registered: Jun 2011
Distribution: Ubuntu
Posts: 26

Rep: Reputation: Disabled
Quote:
Originally Posted by ScorchPipe View Post
Does anyone know another way to call a website in a way that would solve my problems? Am I using the wrong options with wget maybe?
It depends on how the website defines "inactivity".

If it relies simply on a page being called, your cron job should be OK; you could temporarily keep the result of each wget call in timestamped files, for debugging purposes.

But if it relies on more complex things like cookies, that is not enough. Do you know what counts as inactivity for your website?
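
A minimal sketch of that debugging idea (the URL and directory are placeholders):

Code:
#!/bin/bash
# save each response in a timestamped file so you can verify
# later that every call actually went through
# (URL and directory are placeholders)
stamp=$(date +%Y%m%d-%H%M%S)
mkdir -p /tmp/keepalive
wget -q -O "/tmp/keepalive/$stamp.html" http://homepage.se/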
 
Old 06-30-2011, 03:34 PM   #5
MTK358
LQ 5k Club
 
Registered: Sep 2009
Posts: 6,443
Blog Entries: 3

Rep: Reputation: 723
How about manually visiting the site with a web browser every few minutes, just to see if it will fix the issue?

Quote:
Code:
wget -q -O - http://homepage.se/ > /path/to/a/file.txt
cat /dev/null > /path/to/a/file.txt
Why not just:

Code:
wget -q -O - http://homepage.se/ > /dev/null
?
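
If you have curl installed, an equivalent one-liner that also prints the HTTP status code (handy for logging whether each call succeeded) would be something like:

Code:
# -s silences progress output, -o discards the body,
# -w prints the HTTP status code after the transfer
curl -s -o /dev/null -w '%{http_code}\n' http://homepage.se/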
 
Old 06-30-2011, 04:11 PM   #6
ScorchPipe
Member
 
Registered: Sep 2010
Posts: 38

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by aysheaia View Post
It depends on how the website defines "inactivity".

If it relies simply on a page being called, your cron job should be OK; you could temporarily keep the result of each wget call in timestamped files, for debugging purposes.

But if it relies on more complex things like cookies, that is not enough. Do you know what counts as inactivity for your website?
Hmm, no I don't. I'll look into it.
 
Old 06-30-2011, 04:13 PM   #7
ScorchPipe
Member
 
Registered: Sep 2010
Posts: 38

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by MTK358 View Post
How about manually visiting the site with a web browser every few minutes, just to see if it will fix the issue?

Why not just:

Code:
wget -q -O - http://homepage.se/ > /dev/null
?
Visiting manually does the job. That's why I wanted to automate it.

Bash is not one of my strong suits =)
 
Old 07-01-2011, 01:03 AM   #8
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Rocky 9.2
Posts: 18,359

Rep: Reputation: 2751
Post #4 is key; you need to know what constitutes activity (or not ...)
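
If it turns out the host counts session activity rather than plain page hits, a cookie-aware wget call might look something like this (the cookie-jar path and URL are placeholders):

Code:
# reuse the same cookie jar across calls so each request looks
# like the same returning visitor (paths and URL are placeholders)
touch /tmp/ka-cookies.txt
wget -q -O /dev/null \
     --load-cookies /tmp/ka-cookies.txt \
     --save-cookies /tmp/ka-cookies.txt \
     --keep-session-cookies \
     http://homepage.se/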
 
  

