Script to download file using wget
Hi
I need a shell script that will download a text file every second from an HTTP server using wget. Can anyone provide me any pointers or sample scripts that will help me go about this task? regards techie |
Every second? You're sure? That seems too frequent to me.
|
Will it be downloading the same file name (as in refreshing it) every second, or will it be a different file name every time?
|
That is way too frequent; within one second the first wget hasn't finished, and you will end up with dozens of wgets running.
And all trying to write to that same file ;) |
As the other posts have said, you probably do not really want this, but if you do, below is a sample script that might help. Every programmer is entitled to enough rope to hang oneself. Rather than asking for the file every second, wait a second after each retrieval of the file before asking for it again. The bigger the file retrieved, the longer the actual time between each start of the download.
Code:
#!/bin/bash
counter=0
limit=100
while [ "$counter" -lt "$limit" ]
do
    wget --output-document=retrieved_file http://www.google.com
    sleep 1    # wait 1 second
    counter=`expr $counter + 1`
done
exit 0
|
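A minimal sketch of the same loop in a more modern idiom, using bash's built-in `$(( ))` arithmetic instead of spawning `expr` each iteration. Here `echo` stands in for the `wget` call so the sketch runs anywhere; the URL and output file name in the comment are placeholders:

```shell
#!/bin/bash
# Same polling loop, with bash arithmetic replacing the external expr call.
# 'echo' stands in for the download so this sketch needs no network access.
counter=0
limit=3          # kept short for the demo; the original post used 100
while [ "$counter" -lt "$limit" ]; do
    echo "fetch number $counter"   # replace with: wget -O retrieved_file <url>
    sleep 1                        # wait 1 second between retrievals
    counter=$((counter + 1))       # bash built-in arithmetic, no subprocess
done
echo "done after $counter fetches"
```

Because the `sleep` runs after each retrieval completes, the loop can never stack up overlapping wget processes the way a once-per-second cron-style trigger would.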
If you run this script every second, watch the system load. I suspect it's going to grow ;-)
|
Hello!
`rsync' might be a good choice if you want to update more than one file and do not want to re-download everything each time (that is, you can fetch only the differences in the directory contents). This command will synchronize the contents of <to-url> with <from-url> through an encrypted ssh channel: Code:
rsync -ave ssh <from-url> <to-url>
You can also use `cron' for scheduling purposes. |
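For the cron idea, a sketch of a crontab entry is below. The source and destination paths are hypothetical placeholders, and note that cron's finest granularity is one minute, so it cannot trigger a download every second:

```shell
# crontab -e entry: run the sync at the top of every minute.
# user@remote:/path/to/dir and /local/dir are placeholder paths.
* * * * * rsync -ave ssh user@remote:/path/to/dir /local/dir
```

If the transfer might ever take longer than a minute, a lock (for example via `flock`) would be needed to keep runs from overlapping, which is the same pile-up problem the earlier posts warned about.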