Script for wget that attempts to reconnect even on critical errors
The situation is as follows:
I am running wget to download a number of files, but the internet connection is not consistently up. I would like wget to periodically reconnect to the server even if there is no internet connection, so that when the internet comes back, it will start downloading again.

My computer is connected through a wireless network to a computer running Windows XP set up with ICS (Internet Connection Sharing). Users of the other computer will sometimes restart or shut down the computer, severing my connection. This obviously causes wget to stop downloading, but the process does not quit, forcing me to "killall wget" and then start over when the connection returns. I would like to automate this.

I have never really written a script, so if anyone could help me out with creating a loop that may be able to solve this, it would be much appreciated. Not to make it complicated, but sometimes the wireless router will also be unplugged, so I need to reconnect to the wireless network before having an internet connection. I use a fixed IP address (192.168.0.2) on an unsecured wireless connection with iwconfig. Can this be handled in the script, or maybe done with another script? Thanks guys. |
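One possible shape for the loop being asked about is sketched below. This is untested and full of assumptions: the interface name (wlan0), the ESSID (mynet), the gateway address of the ICS host (192.168.0.1), and the URL list file (queue.txt) are all placeholders for your actual setup.

```shell
#!/bin/sh
# Watchdog sketch, not a tested solution. IFACE, ESSID, GATEWAY and
# URLFILE are assumptions -- substitute your own values.

IFACE=wlan0
ESSID=mynet
ADDR=192.168.0.2          # the fixed IP mentioned in the post
GATEWAY=192.168.0.1       # assumed address of the XP/ICS host
URLFILE=queue.txt         # one URL per line

have_net() {
    # Treat the network as up if the ICS host answers one ping.
    ping -c 1 -W 2 "$GATEWAY" >/dev/null 2>&1
}

bring_up() {
    # Re-associate with the unsecured wireless network, then reset the IP.
    iwconfig "$IFACE" essid "$ESSID"
    ifconfig "$IFACE" "$ADDR" up
}

run_queue() {
    while :; do
        if have_net; then
            # -c resumes partial files; --tries=0 retries soft errors forever.
            if wget -c --tries=0 -i "$URLFILE"; then
                break      # the whole queue finished cleanly
            fi
        else
            bring_up
        fi
        sleep 30           # pause before probing or retrying again
    done
}

# Start it with: run_queue
```

Wrapping the loop in a function keeps the script sourceable for testing; running `run_queue` directly starts the watchdog.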
Read through the manpage for wget. There are options you may want, such as -c (continue a partially downloaded file) and --tries (keep trying to make a connection). Using these and other options, the wget command by itself may solve most of the problems.
I'm not certain of the return values for wget. If a return value indicates no network connection, you could act on it with `ifup` or something similar inside a while loop. Wget can also produce a log, which you could parse to detect more problems than a return value indicates. In a script you might use the return value of a command to control whether a while loop continues or an if command branches. You can also test $? immediately after the command, or assign $? to a status flag right after the command runs. You can also use the case command. Code:
if grep exbit /etc/services; then echo "found it"; fi
if grep exbim /etc/services; then echo "found"; else echo "not found"; fi
command1 && command2 || command3
If command1 is successful, execute command2; otherwise execute command3. |
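The exit-status idea above can be seen in a self-contained, runnable form. The sample text and the "exbit"/"exbim" patterns are just illustrations carried over from the post, not anything wget-specific:

```shell
# A pattern search whose exit status drives if / && / ||.
# The sample line and patterns are made up for illustration.
check() {
    if printf 'exbit 4321/tcp\n' | grep -q "$1"; then
        echo "found it"
    else
        echo "not found"
    fi
}

check exbit    # grep succeeds (status 0) -> prints "found it"
check exbim    # grep fails (nonzero)     -> prints "not found"

# The && / || shorthand: the second command runs only if the first
# succeeds, the third only if it fails.
printf 'exbit\n' | grep -q exbim && echo "found" || echo "not found"
```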
Yeah, I am currently using both of those options. I just set up an alias so it will always download the current queue in a specific directory. Unfortunately, the --tries option does not apply to critical errors (such as disconnection from the net), so the process essentially freezes until wget is manually restarted.
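One thing that may help with the "freezes" described above: GNU wget's --read-timeout option makes a stalled transfer count as a failed, retriable attempt instead of hanging forever. A sketch of the options (the file name queue.txt and the timing values are assumptions, not from the thread):

```shell
# Options from GNU wget's manpage:
#   -c                 resume partially downloaded files
#   --tries=0          retry recoverable errors indefinitely
#   --waitretry=10     back off up to 10 s between retries
#   --read-timeout=30  abort a transfer that stalls for 30 s,
#                      turning a hang into a retriable failure
WGET_OPTS="-c --tries=0 --waitretry=10 --read-timeout=30"

# Usage (queue.txt holds one URL per line):
#   wget $WGET_OPTS -i queue.txt
```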
From the manpage: Code:
-t number
I remember why I switched to wget: I simply couldn't check download progress when running aria2 as a daemon. There is a script that supposedly solves this, but we'll see whether it actually works. |