Old 09-06-2014, 03:55 AM   #1
kzo81
Member
 
Registered: Aug 2014
Location: Hungary
Distribution: Debian, Linux Mint, CentOS
Posts: 207

Rep: Reputation: Disabled
wget parameters


Hi,

Can you help me a bit?

I have a Raspberry Pi, and I run a script on it every minute from cron.
The problem is that 90% of the day the FTP server is not available.
In the script I put an if statement that pings the server, and only when it is available does the script reach this line:

wget -mnd --read-timeout=5 --tries=3 --user=$USER --password=$PASS ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD

But when I disconnect in the middle of a download, wget just waits and waits, even though I set the read timeout to 5 seconds and the tries to 3.
I would like wget to behave this way: if the server disappears during the download, wait for instance 5 seconds, drop the file that was partially downloaded, and let the script go forward.
(I download 10-50 MB mkv files from the server, so I don't care if a file has to be re-downloaded next time.)
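
For context, a stripped-down sketch of how the script calls wget; the address, credentials, and ping options here are placeholders, not the real values:

Code:
#!/bin/bash
# placeholder values; the real script sets these elsewhere
SERVER_IP=192.168.1.50
SERVER_PORT=21
SERVER_SD=sdcard
USER=myuser
PASS=mypass

# only attempt the mirror when the server answers a single ping
if ping -c 1 -W 2 "$SERVER_IP" > /dev/null 2>&1; then
    wget -mnd --read-timeout=5 --tries=3 \
         --user="$USER" --password="$PASS" \
         "ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD"
fi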

 
Old 09-06-2014, 12:02 PM   #2
pwalden
Member
 
Registered: Jun 2003
Location: Washington
Distribution: Raspbian, Ubuntu, Chrome/Crouton
Posts: 374

Rep: Reputation: 50
Try adding a --connect-timeout as well; you only have a timeout on the read. Your problem description is somewhat ambiguous: it sounds like wget hangs after the "disconnect" and then "just waits and waits" on the next connection attempt.
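
For example, something like this, using the variables from your command:

Code:
wget -mnd --connect-timeout=5 --read-timeout=5 --tries=3 \
     --user=$USER --password=$PASS \
     ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD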
 
Old 09-06-2014, 07:48 PM   #3
ondoho
LQ Addict
 
Registered: Dec 2013
Posts: 19,872
Blog Entries: 12

Rep: Reputation: 6053
this rings a bell somewhere...
i think someone else was reporting a similar problem in connection with conky weather forecast scripts...
and the ultimate solution was to use curl instead.
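
curl doesn't mirror recursively like wget -m does, but for a single file it would look roughly like this (the file name is just an example):

Code:
# give up quickly instead of hanging forever
curl --connect-timeout 5 --max-time 300 --fail \
     --user "$USER:$PASS" \
     -o video.mkv "ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD/video.mkv"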
 
Old 09-07-2014, 03:23 AM   #4
kzo81
Member
 
Registered: Aug 2014
Location: Hungary
Distribution: Debian, Linux Mint, CentOS
Posts: 207

Original Poster
Rep: Reputation: Disabled
--timeout

Quote:
Originally Posted by pwalden
Try adding a --connect-timeout
Thanks, I had already read about this in the manual in the meantime.

Today I tried --timeout=3, which covers all the timeout kinds (DNS, connect, read), and this way it does what I wanted: wget drops the connection and the script can proceed.
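
So the line now looks like this (same variables as before):

Code:
wget -mnd --timeout=3 --tries=3 --user=$USER --password=$PASS \
     ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD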

 
Old 09-07-2014, 03:25 AM   #5
kzo81
Member
 
Registered: Aug 2014
Location: Hungary
Distribution: Debian, Linux Mint, CentOS
Posts: 207

Original Poster
Rep: Reputation: Disabled
curl

Quote:
Originally Posted by ondoho
...the ultimate solution was to use curl instead.
OK, if I don't succeed with the --timeout parameter, I'll try curl. I already tried lftp, but I didn't like it.
 
Old 09-07-2014, 04:01 AM   #6
ReaperX7
LQ Guru
 
Registered: Jul 2011
Location: California
Distribution: Slackware64-15.0 Multilib
Posts: 6,565
Blog Entries: 15

Rep: Reputation: 2121
Use the -c flag as well so wget can restart a partial download where it left off.
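
I'm not sure -c combines cleanly with -m's timestamping, so test that; for a single file it would be something like this (the file name is just an example):

Code:
wget -c --timeout=3 --tries=3 --user=$USER --password=$PASS \
     ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD/video.mkv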
 
Old 09-07-2014, 04:42 AM   #7
kzo81
Member
 
Registered: Aug 2014
Location: Hungary
Distribution: Debian, Linux Mint, CentOS
Posts: 207

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by ReaperX7
Use the -c flag
I know about this flag, thanks. In reality the video server creates 10-50 MB files, and I wouldn't care if wget didn't continue from where it left off. Say 20 MB of a file has already been downloaded; I don't know what wget does with a partially downloaded file without the -c argument. In my case it's fine if wget deletes that partial file and, the next time the server is available, downloads it again. But I'll be experimenting with the -c flag as well.
Here's my little script if you are interested: http://paste.debian.net/119687/
I just want wget not to wait forever if the server disappears. I run this script from cron every 5 minutes.

 
Old 09-07-2014, 08:58 AM   #8
ReaperX7
LQ Guru
 
Registered: Jul 2011
Location: California
Distribution: Slackware64-15.0 Multilib
Posts: 6,565
Blog Entries: 15

Rep: Reputation: 2121
Usually wget adds a .1, .2, etc. suffix to the file name if you rerun it.
 
Old 09-07-2014, 09:07 AM   #9
kzo81
Member
 
Registered: Aug 2014
Location: Hungary
Distribution: Debian, Linux Mint, CentOS
Posts: 207

Original Poster
Rep: Reputation: Disabled
Yeah, but I don't want that numbering :-)
 
Old 09-08-2014, 01:01 AM   #10
ondoho
LQ Addict
 
Registered: Dec 2013
Posts: 19,872
Blog Entries: 12

Rep: Reputation: 6053
kzo81, you'll probably have to wrap that wget command in a shell script or something that does some cleanup before wget starts downloading.

what i remember: someone had problems with exactly that usage scenario in wget. wget just hangs when the internet connection drops in the middle of a download, and the ultimate solution was to use sth else, e.g. curl (very common on gnu/linux systems).
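
a minimal sketch of such a cleanup wrapper; the directory and the *.mkv pattern are assumptions, adapt them to your setup:

Code:
#!/bin/bash
# assumed download directory
cd /home/pi/videos || exit 1

# drop leftovers from interrupted runs before mirroring again:
# wget's numbered duplicates (*.mkv.1, ...) and zero-byte files
rm -f ./*.mkv.[0-9]*
find . -maxdepth 1 -name '*.mkv' -size 0 -delete

wget -mnd --timeout=3 --tries=3 \
     --user="$USER" --password="$PASS" \
     "ftp://$SERVER_IP:$SERVER_PORT/$SERVER_SD"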
 
Old 09-08-2014, 02:54 AM   #11
ReaperX7
LQ Guru
 
Registered: Jul 2011
Location: California
Distribution: Slackware64-15.0 Multilib
Posts: 6,565
Blog Entries: 15

Rep: Reputation: 2121
If you use -c, it simply resumes the original download. If you have already downloaded the whole file, it will state that the file is complete and exit.
 
Old 09-16-2014, 12:18 AM   #12
kzo81
Member
 
Registered: Aug 2014
Location: Hungary
Distribution: Debian, Linux Mint, CentOS
Posts: 207

Original Poster
Rep: Reputation: Disabled

Thank you, guys. I managed to eliminate the problems with wget. I only used the --timeout parameter, and now it behaves the way I like. I also simplified my script.
 
  

