LinuxQuestions.org


Devcon 12-23-2008 09:54 AM

Script for Wget, that attempts reconnect even on critical errors....
 
The situation is as follows:

I am running wget to download a number of files, but the internet connection is not consistently available. I would like wget to keep periodically trying to reconnect to the server even while there is no internet connection, so that when the internet comes back it starts downloading again.

My computer is connected through a wireless network to a computer running XP set up with ICS. Users of the other computer will sometimes restart or shut down that machine, severing my connection. This obviously causes wget to stop downloading, but the process does not quit, forcing me to "killall wget" and then start over when the connection returns.

I would like to automate this. I have never really written a script, so if anyone could help me out with creating a loop that might solve this, it would be much appreciated.

Not to make it complicated, but sometimes the wireless router will also be unplugged, so I need to rejoin the wireless network before I have an internet connection at all. I use a fixed IP address (192.168.0.2) on an unsecured wireless network configured with iwconfig. Can this be handled in the same script, or maybe with another script?

Thanks guys.

jschiwal 12-24-2008 03:24 AM

Read through the manpage for wget. There are options you may want, such as -c (continue a partially downloaded file) and --tries (keep trying to make a connection). Using these and other options, the wget command by itself may solve most of the problems.
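
For example, combining the two (the URL here is just a placeholder):
Code:

wget -c --tries=0 http://example.com/somefile.iso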

I'm not certain of wget's return values. If a return value indicates no network connection, then you could act on it with `ifup' or something similar inside a while loop. Wget can also produce a log, which you could parse to detect more problems than the return value alone indicates.

In a script you might use the return value of a command to control whether a while loop continues or an if command branches.

You can also test $? immediately after the command or assign $? to a status flag immediately after a command runs.

if grep exbit /etc/services; then echo "found it"; fi

if grep exbim /etc/services; then echo "found"; else echo "not found"; fi

You can also use the case command.
Code:

grep exbim /etc/services
status=$?
case $status in 0) echo "found" ;;
                1) echo "not found" ;;
                2) echo "file doesn't exist" ;;
esac
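
And to let the return value control a while loop directly, as mentioned above, something like this would work (somecommand is just a stand-in for whatever you are running):
Code:

# keep looping until somecommand exits with status 0 (success)
until somecommand; do
    sleep 60    # wait a bit before trying again
done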

If you have a single if-then-else structure, have a look at || and &&

command1 && command2 || command3

If command1 is successful, execute command2; otherwise execute command3. (Strictly speaking, command3 also runs if command2 itself fails, so this is only a rough substitute for a full if/else.)
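
Putting that together for your situation, a rough sketch might look something like this. I'm guessing at the interface name (wlan0) and the ESSID ("MYNET"), and the URL is a placeholder, so adjust those for your setup:
Code:

#!/bin/bash
# Sketch: re-associate with the wireless network and re-run wget until it finishes.
# Assumptions: interface wlan0, ESSID "MYNET", fixed IP 192.168.0.2 (change as needed).

URL="http://example.com/somefile.iso"    # placeholder

wget -c "$URL"
status=$?
while [ $status -ne 0 ]; do
    # try to bring the unsecured wireless connection back up
    iwconfig wlan0 essid "MYNET"
    ifconfig wlan0 192.168.0.2 netmask 255.255.255.0 up
    # you may also need to restore the default route to the ICS machine here
    sleep 30
    wget -c "$URL"
    status=$?
done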

Devcon 12-24-2008 08:46 AM

Yeah, I am currently using both of those options. I just set up an alias so it will always download the current queue in a specific directory. Unfortunately, the --tries option does not apply to critical errors (such as getting disconnected from the net), so the process essentially freezes until wget is manually restarted.

From the manpage:
Code:

        -t number
        --tries=number
            Set number of retries to number.  Specify 0 or inf for infinite retrying.  The default is to retry 20 times,
            with the exception of fatal errors like "connection refused" or "not found" (404), which are not retried.

Aria2c, by contrast, does appear to have an option to retry on 404 errors, so perhaps I will give that a try. I used to use aria2 occasionally, but I think I switched away because of the log output. I'll try it for a couple of days and see if it maintains its connection.
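
If I do switch back, I'll probably start from something like this (guessing at the basic options here; -c should resume a partial download and -m 0 should mean unlimited retries, but I haven't checked which option covers the 404/fatal-error case):
Code:

aria2c -c -m 0 --retry-wait=30 http://example.com/somefile.iso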

I remember why I switched to wget. It was simply that I couldn't check download progress when running aria2 as a daemon. There is a script that supposedly solves this, but we'll see if it actually works.

