Multiple fallback URLs for a single file with wget/curl/other?
I created a small Python script to selectively download files from a list. Each list entry has multiple URLs where the file is mirrored. In the script I tossed together quickly, I performed the downloading by just calling wget within a system call. I would like to improve the script by having it automatically select a second mirror location if the file can't be downloaded from the first location.
Any suggestions on what would be the easiest way to accomplish this? I looked into wget, and it does not seem to have a facility to retrieve from a second or third location if and only if the first location fails. I was thinking about using the CLI version of curl, but I did not see a retry-URL option in that either. I suppose I could use libcurl (or urllib) from Python in a try/except loop until it succeeds or runs out of URLs. But to be honest, I really don't want to do that much work for a script that does what I want 99.9% of the time.
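For reference, the try/except loop I'd like to avoid would look roughly like this: a minimal sketch using only the standard library's urllib (the function name and timeout are my own choices, not from my actual script):

```python
import urllib.request

def download_first_available(urls, dest):
    """Try each mirror URL in order; save the first one that works to dest.

    Returns the URL that succeeded, or None if every mirror failed.
    """
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                data = resp.read()
            with open(dest, "wb") as out:
                out.write(data)
            return url
        except OSError:
            # URLError and HTTPError are both OSError subclasses, so this
            # catches connection failures and 4xx/5xx responses alike;
            # fall through and try the next mirror.
            continue
    return None
```

It works, but it's exactly the kind of extra plumbing I was hoping a CLI flag could replace.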
Any suggestions for a quick and dirty solution for a quick and dirty script?