LinuxQuestions.org > Forums > Non-*NIX Forums > Programming
Programming: This forum is for all programming questions. The question does not have to be directly related to Linux and any language is fair game.
Old 02-13-2007, 08:53 AM   #1
linuxnewbie82
Member
 
Registered: Jul 2006
Posts: 36

Rep: Reputation: 15
Script to download file using wget


Hi

I need a shell script that will download a text file every second from an HTTP server using wget.

Can anyone provide me any pointers or sample scripts that will help me go about this task?

regards

techie
 
Old 02-13-2007, 09:09 AM   #2
matthewg42
Senior Member
 
Registered: Oct 2003
Location: UK
Distribution: Kubuntu 12.10 (using awesome wm though)
Posts: 3,530

Rep: Reputation: 65
Every second? You're sure? That seems too frequent to me.
 
Old 02-13-2007, 09:10 AM   #3
M0E-lnx
LQ Newbie
 
Registered: Feb 2007
Distribution: Vector Linux
Posts: 8

Rep: Reputation: 0
Will it be downloading the same file name (as in refreshing it) every second, or will it be a different file name every time?
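A minimal sketch of the second option, saving each fetch under a unique timestamped name instead of overwriting one file. The URL is a placeholder and the wget call is left as a comment so the sketch runs without a network:

```shell
url="http://example.com/data.txt"        # placeholder URL, not from the thread
names=""
for i in 1 2 3; do
    stamp=$(date +%Y%m%d-%H%M%S)         # one file per second, e.g. data-20070213-091005.txt
    names="$names data-${stamp}.txt"
    # real fetch would be: wget -q -O "data-${stamp}.txt" "$url"
    sleep 1
done
echo "$names"
```

With the one-second sleep, each generated filename differs in its seconds field, so nothing gets clobbered.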
 
Old 02-13-2007, 09:30 AM   #4
//////
Member
 
Registered: Nov 2005
Location: Land of Linux :: Finland
Distribution: Arch Linux && OpenBSD 7.4 && Pop!_OS && Kali && Qubes-Os
Posts: 824

Rep: Reputation: 350Reputation: 350Reputation: 350Reputation: 350
That is way too frequent: within one second the first wget won't have finished, and you will end up with dozens of wget processes running.

And all trying to write to that same file.
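One way to avoid that pile-up (a sketch, not from the thread) is to skip a cycle whenever the previous download is still running. Here a background `sleep 5` stands in for a slow wget so the sketch runs offline:

```shell
pid=""
skipped=0
for i in 1 2 3; do
    if [ -n "$pid" ] && kill -0 "$pid" 2>/dev/null; then
        skipped=$((skipped + 1))    # previous fetch still in flight: skip this cycle
    else
        sleep 5 &                   # stand-in for: wget -q -O file "$url" &
        pid=$!
    fi
    sleep 1
done
kill "$pid" 2>/dev/null             # tidy up the stand-in job
echo "skipped $skipped cycles"
```

`kill -0` sends no signal; it only tests whether the process is still alive, which makes it a cheap guard against launching overlapping downloads.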
 
Old 02-13-2007, 08:57 PM   #5
Valdis Grinbergs
Member
 
Registered: Dec 2005
Distribution: Debian
Posts: 30

Rep: Reputation: 25
As the other posts have said, you probably do not really want this, but if you do, below is a sample script that might help. Every programmer is entitled to enough rope to hang themselves. Rather than asking for the file every second, it waits a second after each retrieval before asking again, so the bigger the retrieved file, the longer the actual time between the start of one download and the next.

#!/bin/bash

counter=0
limit=100
while [ "$counter" -lt "$limit" ]
do
    # fetch the page, overwriting the same local file each time
    wget --output-document=retrieved_file http://www.google.com
    sleep 1                      # wait 1 second between retrievals
    counter=$((counter + 1))
done
exit 0
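Since the sleep comes after the download, the period is download time plus one second. If a roughly fixed one-second cadence matters, a variant (an assumption, not from the thread) can measure the elapsed time and sleep only the remainder. The wget call is a stand-in so the sketch runs offline:

```shell
period=1                            # target seconds between download starts
for i in 1 2 3; do
    start=$(date +%s)
    :   # stand-in for: wget -q -O retrieved_file "$url"
    elapsed=$(( $(date +%s) - start ))
    left=$(( period - elapsed ))
    if [ "$left" -gt 0 ]; then
        sleep "$left"               # sleep only the remainder of the period
    fi
done
```

When a download takes longer than the period, `left` goes non-positive and the loop simply starts the next fetch immediately rather than sleeping.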
 
Old 02-13-2007, 11:44 PM   #6
darthtux
Senior Member
 
Registered: Dec 2001
Location: 35.7480° N, 95.3690° W
Distribution: Debian, Gentoo, Red Hat, Solaris
Posts: 2,070

Rep: Reputation: 47
If you run this script every second, watch the system load. I suspect it's going to grow ;-)
 
Old 02-14-2007, 04:48 AM   #7
firstfire
Member
 
Registered: Mar 2006
Location: Ekaterinburg, Russia
Distribution: Debian, Ubuntu
Posts: 709

Rep: Reputation: 428Reputation: 428Reputation: 428Reputation: 428Reputation: 428
Hello!

`rsync' might be a good choice if you want to update more than one file and do not want to re-download everything each time: it transfers only the differences in the directory contents.

This command will synchronize the contents of <to-url> with <from-url> over an encrypted ssh channel:
Code:
rsync -ave ssh <from-url> <to-url>
Take a look at `man rsync'.

Also you can use `cron' for scheduling purposes.
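One caveat with cron: its finest granularity is one minute, so a once-a-second fetch needs a wrapper script that cron starts each minute and that loops internally. The crontab line, script path, and loop count below are assumptions for illustration, and the wget call is a stand-in so the sketch runs offline:

```shell
# Hypothetical crontab entry (cron cannot fire more often than once a minute):
#   * * * * * /home/user/fetch-loop.sh
# fetch-loop.sh would loop ~60 times inside its minute; 3 iterations here
# so the sketch finishes quickly.
runs=0
while [ "$runs" -lt 3 ]; do
    :   # stand-in for: wget -q -O retrieved_file "$url"
    runs=$((runs + 1))
    sleep 1
done
echo "ran $runs fetches"
```

In the real wrapper you would loop to 60 (or slightly fewer, to leave headroom before the next cron invocation starts).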
 