09-06-2014, 03:55 AM | #1
kzo81 (Member; Registered: Aug 2014; Location: Hungary; Distribution: Debian, Linux Mint, CentOS; Posts: 207)
wget parameters
Hi,
Can you help me a bit?
I have a Raspberry Pi, and I run a script on it every minute from cron.
The problem is that 90% of the day the FTP server is not available.
In the script I put an if that pings the server, and only when the server is available does it jump to this line:
wget -mnd --read-timeout=5 --tries=3 --user=$USER --password=$PASS ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD
But when I disconnect in the middle of a download, wget just waits and waits, even though I set a read timeout of 5 seconds and 3 tries.
I would like wget to behave this way: if the server disappears during the download, then wait for, say, 5 seconds, drop the file that was partially downloaded, and let the script go forward.
(I download 10-50 MB mkv files from the server, so I don't care if a file has to be redownloaded the next time.)
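Roughly, the script looks like this (a sketch; the values here are placeholders, not my real credentials):
Code:
#!/bin/sh
# Sketch of the cron job: ping first, call wget only if the server answers.
SERVER_IP=192.168.1.50    # placeholder
SERVER_PORT=21            # placeholder
SERVER_SD=sdcard          # placeholder directory on the server
USER=ftpuser              # placeholder FTP user (shadows the shell's $USER)
PASS=secret               # placeholder FTP password

# -c 1: send one probe; -W 2: give up after 2 seconds (iputils ping)
if ping -c 1 -W 2 "$SERVER_IP" > /dev/null 2>&1; then
    wget -mnd --read-timeout=5 --tries=3 \
        --user="$USER" --password="$PASS" \
        "ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD"
fi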
Last edited by kzo81; 09-06-2014 at 07:51 AM.
Reason: misspelling
09-06-2014, 12:02 PM | #2
pwalden (Member; Registered: Jun 2003; Location: Washington; Distribution: Raspbian, Ubuntu, Chrome/Crouton; Posts: 374)
Try adding a --connect-timeout as well; you only have a timeout on the read. Your problem description is somewhat ambiguous, but it sounds like wget hangs after the "disconnect" and then "just waits and waits" (on the next connect attempt).
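Something like this (just a sketch based on the command in your first post):
Code:
# Same command, with a connect timeout added. --connect-timeout limits
# how long wget waits for the TCP connection itself to be established;
# --read-timeout only covers stalled reads on an open connection.
wget -mnd --connect-timeout=5 --read-timeout=5 --tries=3 \
    --user="$USER" --password="$PASS" \
    "ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD"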
09-06-2014, 07:48 PM | #3
ondoho (LQ Addict; Registered: Dec 2013; Posts: 19,872)
This rings a bell. I think someone else was reporting a similar problem in connection with Conky weather-forecast scripts, and the ultimate solution was to use curl instead.
09-07-2014, 03:23 AM | #4
kzo81 (Member, Original Poster; Registered: Aug 2014; Location: Hungary; Distribution: Debian, Linux Mint, CentOS; Posts: 207)
--timeout
Quote:
Originally Posted by pwalden
Try adding a --connect-timeout
Thanks, I had read about this in the manual in the meantime.
Today I tried --timeout=3, which covers every kind of timeout (DNS, connect, and read), and this way it does what I wanted: wget drops the connection and the script can proceed.
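Roughly like this, with the same variables as in my first post:
Code:
# --timeout=3 is shorthand for --dns-timeout=3, --connect-timeout=3
# and --read-timeout=3 (see the wget manual).
wget -mnd --timeout=3 --tries=3 \
    --user="$USER" --password="$PASS" \
    "ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD"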
Last edited by kzo81; 09-08-2014 at 10:24 AM.
Reason: misspelling, ongoing testing
09-07-2014, 03:25 AM | #5
kzo81 (Member, Original Poster; Registered: Aug 2014; Location: Hungary; Distribution: Debian, Linux Mint, CentOS; Posts: 207)
curl
Quote:
Originally Posted by ondoho
...the ultimate solution was to use curl instead.
OK, if I don't succeed with the --timeout parameter, I'll try curl. I already tried lftp, but I didn't like it.
09-07-2014, 04:01 AM | #6
ReaperX7 (LQ Guru; Registered: Jul 2011; Location: California; Distribution: Slackware64-15.0 Multilib; Posts: 6,565)
Use the -c flag as well so wget can resume a partial download where it left off.
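For a single file that would look roughly like this (the file name is just an example):
Code:
# -c (--continue) resumes a partial file instead of starting over.
# Note: -m implies timestamping (-N), which may not combine cleanly
# with -c, so test it against your mirroring setup first.
wget -c --timeout=3 --tries=3 --user="$USER" --password="$PASS" \
    "ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD/example.mkv"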
09-07-2014, 04:42 AM | #7
kzo81 (Member, Original Poster; Registered: Aug 2014; Location: Hungary; Distribution: Debian, Linux Mint, CentOS; Posts: 207)
Quote:
Originally Posted by ReaperX7
Use the -c flag
I know of this flag, thanks. In reality the video server creates 10-50 MB files, and I wouldn't care if wget didn't continue from where it left off. Say 20 MB of a file has already been downloaded; I don't know what wget does with a partially downloaded file without the -c argument. In my case it's OK if wget deletes that partially downloaded file and downloads it again the next time the server is available. But I'll be experimenting with the -c flag as well.
Here's my little script if you're interested: http://paste.debian.net/119687/
I just want wget not to wait forever if the server disappears. I run this script from cron every 5 minutes.
Last edited by kzo81; 09-07-2014 at 04:44 AM.
Reason: additional thought
09-07-2014, 08:58 AM | #8
ReaperX7 (LQ Guru; Registered: Jul 2011; Location: California; Distribution: Slackware64-15.0 Multilib; Posts: 6,565)
Usually it adds a .1, .2, etc. suffix if you rerun it.
09-07-2014, 09:07 AM | #9
kzo81 (Member, Original Poster; Registered: Aug 2014; Location: Hungary; Distribution: Debian, Linux Mint, CentOS; Posts: 207)
Yeah, but I don't want that numbering. :-)
09-08-2014, 01:01 AM | #10
ondoho (LQ Addict; Registered: Dec 2013; Posts: 19,872)
kzo81, you'll probably have to wrap that wget command in a shell script or something that does some cleanup before wget starts downloading.
From what I remember, someone had problems with exactly this usage scenario in wget: wget just hangs when the internet connection drops in the middle of a download. The ultimate solution was to use something else, e.g. curl (very common on GNU/Linux systems).
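Just to illustrate the idea, a rough, untested sketch of such a wrapper (the download directory and the cleanup rule are placeholders, only one possible approach):
Code:
#!/bin/sh
# Hypothetical wrapper: mirror with hard timeouts, and on failure drop
# whatever this run touched so the next cron run fetches it in full.
DEST=/home/pi/videos      # hypothetical download directory
cd "$DEST" || exit 1

touch .wget-started       # marker: start of this run

if wget -mnd --timeout=3 --tries=3 \
        --user="$USER" --password="$PASS" \
        "ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD"; then
    rm -f .wget-started
else
    # A failed run may leave partial files; delete anything newer
    # than the start marker and let the next run redownload it.
    find . -maxdepth 1 -type f -name '*.mkv' -newer .wget-started -delete
    rm -f .wget-started
fi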
09-08-2014, 02:54 AM | #11
ReaperX7 (LQ Guru; Registered: Jul 2011; Location: California; Distribution: Slackware64-15.0 Multilib; Posts: 6,565)
If you use -c it simply resumes the original download. If you've already downloaded a file it'll state the file is complete and exit.
09-16-2014, 12:18 AM | #12
kzo81 (Member, Original Poster; Registered: Aug 2014; Location: Hungary; Distribution: Debian, Linux Mint, CentOS; Posts: 207)
Thank you, guys; I managed to eliminate the problems with wget. I only used the --timeout parameter, and now it behaves the way I like. I also simplified my script.