Old 01-30-2014, 07:46 PM   #1
TheLexx
Member
 
Registered: Apr 2013
Distribution: Gentoo
Posts: 79

Rep: Reputation: Disabled
Multiple fallback URLs for single file wget/curl/other?


I created a small Python script to selectively download files from a list. Each list entry has multiple URLs where the file is mirrored. In the script that I tossed together quickly, I performed the downloading by just calling wget within a system call. I would like to improve the script by having it automatically select a second mirror location if the file can't be downloaded from the first location.

Any suggestions on what would be the easiest way to accomplish this goal? I looked into wget, and it does not seem to have a facility to retrieve from a second or third location if and only if the first location fails. I was thinking about using the CLI version of curl, but I did not see a retry-URL option there either. I suppose I could import libcurl into Python and use a try/except loop until the download succeeds or the script runs out of URLs. But to be honest, I really don't want to do that much work for a script that does what I want 99.9% of the time.

Any suggestions for a quick and dirty solution for a quick and dirty script?
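For reference, here is roughly the quick-and-dirty fallback loop I have in mind, calling wget through subprocess and checking its exit status instead of pulling in libcurl. The mirror URLs and the function name fetch_first are just placeholders, not what my real script uses:

Code:
import subprocess

def fetch_first(urls, dest):
    """Try each mirror in order; return True on the first successful download."""
    for url in urls:
        # wget exits with status 0 only if the download succeeded
        if subprocess.call(["wget", "-O", dest, url]) == 0:
            return True
    return False

mirrors = [
    "http://mirror1.example.com/somefile.tar.gz",
    "http://mirror2.example.com/somefile.tar.gz",
]
if not fetch_first(mirrors, "somefile.tar.gz"):
    print("all mirrors failed")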
 
Old 02-01-2014, 06:13 AM   #2
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Code:
aria2c http://someUrl http://anotherUrl http://someOtherUrl
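When aria2c is given several URIs that point to the same file, it treats them as mirrors: it can fetch pieces from more than one source at once and automatically falls back to the remaining URIs when one fails, so you get the fallback logic for free. If you want to keep it inside your Python script, something like this (reusing the illustrative mirrors list and output name from your sketch) should do:

Code:
subprocess.call(["aria2c", "-o", "somefile.tar.gz"] + mirrors)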
 
  

