LinuxQuestions.org
LinuxQuestions.org > Forums > Linux Forums > Linux - Networking
Old 09-02-2013, 01:02 AM   #1
waddles
Member
 
Registered: Sep 2012
Posts: 372

Rep: Reputation: 1
extracting files in sections using wget


I am aware of axel, but let's not go there. I have seen code using wget that extracts a couple of megabytes of a file from one site, then another 2 MB of the identical file from another site, and so on.
So I am wondering why one could not do likewise with a single file at a single site: use http as one carrier, ftp as a second carrier, and maybe https or some other protocol as a third, each with its own defined segment of the file to download. I am not familiar with how these programs carry out their downloads, so I do not know whether one would interfere with another. Could this be viable in any way? Maybe using a proxy system, or somehow piggybacking on BitTorrent?
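A minimal sketch of the segmented-download idea, using curl's -r/--range option. The URL and file size are made-up placeholders; a real script would read the size from the Content-Length header, and the server must honor byte-range requests. The fetch lines are left commented out so the sketch runs without a network:

```shell
#!/bin/sh
# Split a download into N byte-range segments fetched in parallel.
URL="http://example.com/big.iso"   # hypothetical URL
SIZE=10485760                      # assume a 10 MiB file
PARTS=4
CHUNK=$(( (SIZE + PARTS - 1) / PARTS ))   # ceiling division

i=0
while [ "$i" -lt "$PARTS" ]; do
    START=$(( i * CHUNK ))
    END=$(( START + CHUNK - 1 ))
    [ "$END" -ge "$SIZE" ] && END=$(( SIZE - 1 ))
    echo "segment $i: bytes=$START-$END"
    # Uncomment to actually fetch each segment in the background:
    # curl -s -r "$START-$END" -o "part.$i" "$URL" &
    i=$(( i + 1 ))
done
wait
# Then reassemble: cat part.0 part.1 part.2 part.3 > big.iso
```

Note this only helps when the bottleneck is per-connection (e.g. per-connection throttling), not when the pipe itself is saturated.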
 
Old 09-03-2013, 05:22 PM   #2
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3608
Quote:
Originally Posted by waddles View Post
I have seen code using wget that extracts a couple of megabytes of a file from one site, then another 2 MB of the identical file from another site, and so on.
Have you? What's the name? Just curious.


Quote:
Originally Posted by waddles View Post
So I am wondering why one could not do likewise with a single file at a single site: use http as one carrier, ftp as a second carrier, and maybe https or some other protocol as a third, etc. (..)
...because in the end it does not matter which proto:// you use: you depend on what services the site offers (or not), but more importantly, as long as the download comes from the same site, all traffic goes through the same pipe, so there is nothing to gain from that.
 
Old 09-04-2013, 09:50 PM   #3
waddles
Member
 
Registered: Sep 2012
Posts: 372

Original Poster
Rep: Reputation: 1
Try this for a source:
http://linuxgazette.net/issue70/chung.html
Adrian Chung did a good review of techniques.
Else try a search with terms like: wget curl download file "multiple sources"
I don't think your conclusion is totally correct. With curl it is possible to split a file numerically and fetch one part using http:// in the background and another using ftp:// in the foreground. Yes, there is a small delay at the server end, but the big delay is at the receiving end: the parts may reach your ISP quickly, but between your ISP and you the transfer can slow to a crawl, though it is still much faster than attempting a direct download of the whole file.
This is a technique used by those not on broadband.
Some improvement in downloading can be gained by examining:
http://www.ibiblio.org/pub/linux/doc...ing-HOWTO.html
Hope that helps.
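The split described above can be sketched with curl (hypothetical mirror URL and size; assumes the same file is exposed over both HTTP and FTP and the server honors range requests). The commands are echoed so the sketch runs without a network; drop the echo to actually fetch:

```shell
#!/bin/sh
# First half over HTTP in the background, second half over FTP in the
# foreground, then join the parts. SIZE and the URLs are illustrative.
SIZE=2000000
HALF=$(( SIZE / 2 ))

echo curl -r "0-$(( HALF - 1 ))" -o part.http "http://mirror.example.com/file.tar.gz" "&"
echo curl -r "$HALF-$(( SIZE - 1 ))" -o part.ftp "ftp://mirror.example.com/file.tar.gz"
echo cat part.http part.ftp ">" file.tar.gz
```

Whether the two streams actually add up to more throughput depends on where the bottleneck is, which is the point being debated in this thread.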
 
Old 09-05-2013, 01:42 AM   #4
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3608
Quote:
Originally Posted by waddles View Post
Try this for a source:
http://linuxgazette.net/issue70/chung.html
Adrian Chung did a good review of techniques.
In 2001.


Quote:
Originally Posted by waddles View Post
I don't think your conclusion is totally correct. With curl it is possible to split a file numerically and fetch one part using http:// in the background and another using ftp:// in the foreground. Yes, there is a small delay at the server end, but the big delay is at the receiving end: the parts may reach your ISP quickly, but between your ISP and you the transfer can slow to a crawl, though it is still much faster than attempting a direct download of the whole file. This is a technique used by those not on broadband.
I'd like to think it is. Your examples may be valid for a multiple-locations / one-downloader scenario, but not for a single-location / single-connection scenario. Stuffing one cow in a grinder may work, but that does not necessarily mean stuffing Tucows in a grinder at the same time will work.


Quote:
Originally Posted by waddles View Post
Some improvement in downloading can be gained by examining:
http://www.ibiblio.org/pub/linux/doc...ing-HOWTO.html
Exactly which part of the Adv-Routing-HOWTO are you referring to?
 
  


Reply


Thread Tools Search this Thread
Search this Thread:

Advanced Search

Posting Rules
You may not post new threads
You may not post replies
You may not post attachments
You may not edit your posts

BB code is On
Smilies are On
[IMG] code is Off
HTML code is Off


