there's a myth of increasing download speeds through "multiple download slots" or some such technobabble.
i think it comes from certain commercial download sites like megaupload etc. that would limit bandwidth per user.
some addons/software claimed to be able to circumvent that restriction (and maybe did, until the providers circumvented THAT in turn), but i just don't see how wget has anything to do with it. the trick can only pay off if the cap is applied per connection: if each connection is capped at, say, 500 KB/s but your line can do 5 MB/s, four parallel connections could pull ~2 MB/s.
Even with multiple connections, it would still be using a single source and thus be limited. Since torrents use multiple sources, the throughput is much, much higher and limited only by the destination bandwidth, at least if there are enough seeds available.
Quote:
Originally Posted by ondoho
there's a myth of increasing download speeds through "multiple download slots" or some such technobabble.
Strangely enough, in my limited testing I found that axel seems to be able to download quicker than wget from sites where the download is slower than my internet connection. I have been sceptical about the issue, so even with some limited evidence I'm not yet prepared to state that multiple connections are definitely faster, but it seems there may be some truth in it under certain conditions at least.
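A quick, unscientific way to reproduce that comparison is to time both tools against the same URL. Here's a minimal sketch in Python; the URL and output file names are placeholders, while axel's -n (connection count) and -o (output file) are real flags:
Code:
import subprocess
import time

URL = "http://example.com/big.iso"  # placeholder: pick a file on a slow-ish mirror

def timed(cmd):
    # Run the download and report elapsed wall-clock time.
    t0 = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - t0

wget_time = timed(["wget", "-q", "-O", "out.wget", URL])
axel_time = timed(["axel", "-q", "-n", "4", "-o", "out.axel", URL])  # 4 connections

print(f"wget: {wget_time:.1f}s   axel: {axel_time:.1f}s")
Run it a few times and at different times of day; a single measurement against a busy mirror proves little.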
i'm looking at search results for "download multi" from my package manager, and there are about a dozen AUR packages (all of them unsupported, as AUR packages always are).
i'm reading things like
"Xtreme Download Manager is a powerful tool to increase download speed up-to 500%, save videos from video sharing sites and integration with ANY browser."
and
"A full featured, advanced, multi-threaded, multisegment download manager and accelerator."
it all sounds a lot like snake oil to me, but so many coders can't be completely wrong i guess.
Quote:
Originally Posted by ondoho
it all sounds a lot like snake oil to me,
Oh, to me too, and I was surprised when axel seemed faster than wget. I think it may depend very much on the reasons why wget isn't maxing out the link.
i now found axel, from the man page:
"Axel is a program that downloads a file from an FTP or HTTP server through multiple connections. Each connection downloads its own part of the file."
just like i thought, this only makes sense when the provider throttles bandwidth per connection (and is stupid enough to not recognize this trick).
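For the curious, here is a minimal sketch of what that range-splitting amounts to, using only the Python standard library. It assumes the server reports Content-Length and honors Range requests with a 206 Partial Content response; the URL and part count are placeholders:
Code:
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://example.com/big.iso"  # placeholder URL
PARTS = 4                           # number of parallel connections

# Probe the total size first (assumes the server reports Content-Length).
head = urllib.request.Request(URL, method="HEAD")
with urllib.request.urlopen(head) as resp:
    size = int(resp.headers["Content-Length"])

# Split [0, size) into PARTS contiguous byte ranges.
step = size // PARTS
ranges = [(i * step, size - 1 if i == PARTS - 1 else (i + 1) * step - 1)
          for i in range(PARTS)]

def fetch(rng):
    start, end = rng
    req = urllib.request.Request(URL, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        # A server that honors Range answers 206 Partial Content;
        # a 200 here means it ignored the header and sent the whole file.
        assert resp.status == 206, "server ignored the Range header"
        return start, resp.read()  # whole part held in memory: fine for a sketch

# One connection per range, then stitch the parts together in place.
with ThreadPoolExecutor(max_workers=PARTS) as pool, open("big.iso", "wb") as out:
    for start, data in pool.map(fetch, ranges):
        out.seek(start)
        out.write(data)
That's the whole trick: if the server caps bandwidth per connection, the ranges download concurrently and the caps add up; if the cap is per user or per IP, it buys you nothing.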
I think that's roughly how it works, yes, but I think it could also combat unintentional throttling due to server processes not being given enough resources, or possibly processes scanning the outbound data. Sadly I don't know enough about this kind of thing, so the previous may be nonsense, but something like that would make sense to me. A bit like when, sometimes, it seems quicker to copy files from a file server simultaneously rather than one after the other.
https://www.tecmint.com/download-managers-for-linux/
in general, search for "linux download manager" if interested.
wget itself is unable to use multiple connections. If you really want to use wget, go ahead, but it will only ever open a single connection; the closest you can get is to fake the feature yourself by running several wget processes on separate byte ranges, as in the sketch below.
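To illustrate, a rough sketch of that workaround, again with placeholder URL and file names: several wget processes, each restricted to one slice of the file via a hand-written Range header (wget's --header option is real), then the slices concatenated in order. aria2c or axel do this properly; this only shows the principle:
Code:
import subprocess
import urllib.request

URL = "http://example.com/big.iso"  # placeholder URL
PARTS = 4

# Probe the total size (assumes the server reports Content-Length).
head = urllib.request.Request(URL, method="HEAD")
with urllib.request.urlopen(head) as resp:
    size = int(resp.headers["Content-Length"])

step = size // PARTS
procs = []
for i in range(PARTS):
    start = i * step
    end = size - 1 if i == PARTS - 1 else (i + 1) * step - 1
    # wget can't split a download itself, but it can send arbitrary
    # headers, so each process requests one slice of the file.
    procs.append(subprocess.Popen(
        ["wget", "-q", f"--header=Range: bytes={start}-{end}",
         "-O", f"part{i}", URL]))

for p in procs:
    p.wait()

# Reassemble the slices in order.
with open("big.iso", "wb") as out:
    for i in range(PARTS):
        with open(f"part{i}", "rb") as part:
            out.write(part.read())
Caveat: if the server ignores the Range header, each part will contain the whole file; a real tool checks for the 206 response, which is one more reason to just use axel or aria2c.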