Programming: This forum is for all programming questions. The question does not have to be directly related to Linux, and any language is fair game.


Old 02-13-2007, 08:53 AM   #1
Registered: Jul 2006
Posts: 36

Rep: Reputation: 15
Script to download file using wget


I need a shell script that will download a text file every second from an HTTP server using wget.

Can anyone provide pointers or a sample script to help me with this task?


Old 02-13-2007, 09:09 AM   #2
Senior Member
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 65
Every second? Are you sure? That seems too frequent to me.
Old 02-13-2007, 09:10 AM   #3
LQ Newbie
Registered: Feb 2007
Distribution: Vector Linux
Posts: 8

Rep: Reputation: 0
Will it be downloading the same file name (as in refreshing it) every second, or will it be a different file name every time?
Old 02-13-2007, 09:30 AM   #4
Registered: Nov 2005
Location: Land of Linux :: Finland
Distribution: 2 x Arch x86_64 | OpenBSD 6.7 bridge | Fedora 31 | Virtualbox Guest windows 10 pro
Posts: 470

Rep: Reputation: 206
That is way too frequent: within one second the first wget probably hasn't finished, and you will end up with dozens of wget processes running,

all trying to write to the same file.
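One way to avoid the pile-up described above is a lock, so that a run starting while the previous download is still in flight simply skips. This is a minimal sketch using util-linux `flock`; the URL, lock path, and output file are placeholders:

```shell
#!/bin/sh
# Placeholder URL and paths; substitute your own.
URL="http://example.com/file.txt"
LOCK="/tmp/fetch.lock"

# flock -n acquires the lock or fails immediately, so if the previous
# download is still running, this invocation skips instead of stacking up.
flock -n "$LOCK" -c "wget -q --output-document=/tmp/retrieved_file '$URL'" \
    || echo "previous download still in progress; skipping this run"
```

`flock -n` fails immediately instead of blocking when the lock is held, so invocations never accumulate; drop `-n` if you would rather have them queue.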
Old 02-13-2007, 08:57 PM   #5
Valdis Grinbergs
Registered: Dec 2005
Distribution: Debian
Posts: 30

Rep: Reputation: 25
As the other posts have said, you probably do not really want this, but if you do, below is a sample script that might help. Every programmer is entitled to enough rope to hang himself. Rather than asking for the file every second, wait a second after each retrieval before asking again. The bigger the retrieved file, the longer the actual time between the start of each download.


counter=0
limit=60                          # number of downloads before stopping
while [ "$counter" -lt "$limit" ]
do
    # The URL is a placeholder; substitute the real one
    wget --output-document=retrieved_file http://example.com/file.txt
    sleep 1                       # wait 1 second
    counter=`expr $counter + 1`
done
exit 0
Old 02-13-2007, 11:44 PM   #6
Senior Member
Registered: Dec 2001
Location: 35.7480 N, 95.3690 W
Distribution: Debian, Gentoo, Red Hat, Solaris
Posts: 2,070

Rep: Reputation: 47
If you run this script every second, watch the system load. I suspect it's going to grow ;-)
Old 02-14-2007, 04:48 AM   #7
Registered: Mar 2006
Location: Ekaterinburg, Russia
Distribution: Debian, Ubuntu
Posts: 709

Rep: Reputation: 428

`rsync' might be a good choice if you want to update more than one file and do not want to re-download everything every time (that is, it can transfer only the differences in directory contents).

This command will synchronize contents of <to-url> with <from-url> through encrypted ssh channel:
rsync -ave ssh <from-url> <to-url>
Take a look at `man rsync'.

Also you can use `cron' for scheduling purposes.
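Note that plain cron cannot fire more often than once per minute, so it suits the rsync approach only if a once-a-minute update is acceptable. A hypothetical crontab entry (host and paths are placeholders) would look like:

```
# min hour dom mon dow  command
*     *    *   *   *    rsync -ae ssh user@host:/remote/file.txt /local/dir/
```

Edit it in place with `crontab -e`; cron runs the command every minute under your user account.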


