My friends and I have a website about games and stuff.
The web host we use recycles the app pool after it has been inactive for 15 minutes. We don't have that many users yet, so this is a problem: the site takes ages to load after it has been idle.
I came up with the idea of hitting the website with a script from my private server at home every 5 minutes or so. That way the app pool would never spin down.
I used some lines from another of my scripts:
wget -q -O - http://homepage.se/
cat /dev/null > /path/to/a/file.txt
and put it in crontab.
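Put together, the crontab entry looks roughly like this (the 5-minute schedule matches what I described above; discarding the output to /dev/null instead of keeping a file is just my choice here, and the URL is the same example as above):

```shell
# Hypothetical crontab entry: fetch the front page every 5 minutes
# and throw the response away, purely to keep the app pool warm.
*/5 * * * * wget -q -O /dev/null http://homepage.se/
```

With `-O /dev/null` there is no need for the separate `cat /dev/null > file` cleanup line, since nothing is written to disk.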
The script runs fine, and I think the site is better now, but we can't tell for sure whether it's the script's doing. Sometimes it's just as slow as ever, and I'm starting to think the script doesn't do anything.
I've tried adding more wget lines to fetch different pages from the site, and I now run it every 3 minutes. No difference.
Does anyone know another way to call a website that would solve my problem? Am I using the wrong options with wget, maybe?
I've also tried lynx (piping its output to a file), but it didn't help either.