Old 09-23-2012, 07:45 PM   #1
273
Senior Member
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64&i386, Raspbian Wheezy, various VMs
Posts: 3,230

Rep: Reputation: 759
Download Accelerators. Any facts?


After a few threads on this and other forums regarding "download accelerators", and hearing them mentioned over the years, I have finally snapped. I am calling shenanigans.
What tipped me over the edge was a thread to which I responded:
Quote:
Originally Posted by 273 View Post
OK, I have a question:
If I'm maxing out my bandwidth downloading something, how does having multi-thread technology with 32 connections help me download it faster? Similarly, if I'm maxing out the connection of the server I'm downloading from, how does having multi-thread technology with 32 connections help me download it faster?

I've not seen an answer to that and until I do I'll treat download accelerators as snake oil.

The only way I can see to speed up a download is to get the other end to zip it if it isn't already zipped or to download the same file from multiple sources if the sources have poor upload speeds. Both only work in certain specific circumstances.
I've added this in General as it is a bit of a rant at the moment, but if anyone has any genuine insight into the situation I'd be happy to see it in Networking.
Apologies if this has been gone over before, but I see questions about "download accelerators" asked again and again, and I see programs recommended with no explanation as to how they are supposed to work.
 
Old 09-23-2012, 07:58 PM   #2
exvor
Senior Member
 
Registered: Jul 2004
Location: Phoenix, Arizona
Distribution: LFS-Version SVN-20091202, Arch 2009.08
Posts: 1,484

Rep: Reputation: 66
You are correct, they are pretty much snake oil. The only way I can think of them helping is a downloader that makes multiple connections to a server and downloads different smaller parts to gain an increase in speed, but only if you are not already maxing out the connection. This would of course have to be supported by the server, and considering that you're getting less than your max, the server admin has probably limited upload speeds and won't allow this sort of shenanigans anyway. If the above sounds oddly familiar, it's because it pretty much describes what a torrent is.

Like you said, however, if you're maxing out your connection a download accelerator is going to do nothing to improve it, unless it's using some kind of black witchcraft magic or something.
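
For illustration, here is a minimal sketch in Python (standard library only) of the segmented download described above. The URL and connection count are placeholders, it assumes the server honours HTTP Range requests and sends a Content-Length, and it only helps if a single connection isn't already saturating the link.

Code:
# Minimal sketch of a segmented (multi-connection) HTTP download.
# Assumes the server supports byte ranges; URL and CONNECTIONS are placeholders.
import concurrent.futures
import urllib.request

URL = "http://example.com/big.file"   # placeholder
CONNECTIONS = 4                        # placeholder

def fetch_range(start, end):
    """Fetch bytes start..end (inclusive) over its own connection."""
    req = urllib.request.Request(URL, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return start, resp.read()

# Ask for the total size, then split it into roughly equal segments.
with urllib.request.urlopen(urllib.request.Request(URL, method="HEAD")) as head:
    size = int(head.headers["Content-Length"])

step = size // CONNECTIONS
ranges = [(i * step, size - 1 if i == CONNECTIONS - 1 else (i + 1) * step - 1)
          for i in range(CONNECTIONS)]

with concurrent.futures.ThreadPoolExecutor(CONNECTIONS) as pool, \
        open("big.file", "wb") as out:
    for start, data in pool.map(lambda r: fetch_range(*r), ranges):
        out.seek(start)
        out.write(data)
Each segment travels over its own TCP connection; whether that gains anything depends entirely on where the bottleneck is, as discussed above.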
 
Old 09-23-2012, 08:13 PM   #3
273
Senior Member
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64&i386, Raspbian Wheezy, various VMs
Posts: 3,230

Original Poster
Rep: Reputation: 759
Thanks. I'd like to hear from others too. Not that I don't believe you (I do, obviously, as it's my current summation of the situation), but I really am interested in getting to the bottom of this. Particularly, I suppose, there may be edge cases I've not considered and, as you allude to, exvor, possible abuse of servers to get around download speed restrictions.
I also do realise my knowledge of information transfer and networking is fairly basic, so if I'm missing something lower level I'd love some information. I'd also like a thread to point people to when they ask about downloading software, since the ones I have seen seem to be filled with conjecture (I'm including my own comments in this).
 
Old 09-23-2012, 08:28 PM   #4
ntubski
Senior Member
 
Registered: Nov 2005
Distribution: Debian
Posts: 2,396

Rep: Reputation: 814
Hypothetically, if the server is using load balancing, the multiple connections could each connect to a different physical box.

In practice, I expect download accelerators make downloads go faster like racing stripes make cars go faster. There was a thread on LQ several years back where a guy thought he could "hack" a download to go faster by exploiting the fact that the reported speed at the beginning of a download was higher. People tried to explain that it's simply an inaccurate estimate, but I don't think he clued in. If I remember correctly, the thread was closed for hacking.

Last edited by ntubski; 09-23-2012 at 08:29 PM. Reason: grammar
 
Old 09-23-2012, 08:33 PM   #5
273
Senior Member
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64&i386, Raspbian Wheezy, various VMs
Posts: 3,230

Original Poster
Rep: Reputation: 759
Quote:
Originally Posted by ntubski View Post
Hypothetically, if the server is using load balancing, the multiple connections could each connect to a different physical box.
I'd not thought of that one. I can see that working under slightly odd circumstances where it's the link to the individual servers that's limited and not the link from the load balancer to the internet.
 
Old 09-24-2012, 12:45 AM   #6
dugan
Senior Member
 
Registered: Nov 2003
Location: Canada
Distribution: distro hopper
Posts: 4,571

Rep: Reputation: 1394
I never use them. Seriously, if the download site is so bad that a DL accelerator has any measurable impact on the speed, then you need to find an alternative DL site anyway.

Last edited by dugan; 09-24-2012 at 01:24 AM.
 
Old 09-24-2012, 01:42 AM   #7
H_TeXMeX_H
Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Rep: Reputation: 1269
I don't know exactly how they work (in fact they may work in different ways), but if they are downloading the same file from the same server, download speed will not increase significantly by using multiple connections.
 
Old 09-24-2012, 10:34 AM   #8
rizzy
Member
 
Registered: Mar 2004
Distribution: Debian
Posts: 283

Rep: Reputation: 65
Not a download accelerator per se, but a download manager can be helpful, although they are often touted as accelerators.
The features I used with download managers were the resume and schedule options, and even resume wasn't supported by many servers. Back in the days of 56K modems and early unstable ADSLs, resume and night downloading were useful. Some would shut down the PC after the download completed too, which was also useful. I can still see some use for such tools: living in a crowded household and not wanting to kill the connection during the day, you could schedule night downloading, or you could set rules for a travelling laptop to download only when, say, on home WiFi.
As for actual speed increase, snake oil is well put.
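
For what it's worth, here is a minimal sketch in Python (standard library only) of how resume works under the hood: the client asks the server to start from the size of the partial file. The URL and filename are placeholders, and it relies on the server honouring byte ranges, which is exactly the support noted above as often missing.

Code:
# Minimal sketch of resuming a partial download via an HTTP Range request.
# URL and PART are placeholders; only works where the server honours ranges.
import os
import urllib.request

URL = "http://example.com/big.file"   # placeholder
PART = "big.file.part"                # placeholder

offset = os.path.getsize(PART) if os.path.exists(PART) else 0
req = urllib.request.Request(URL, headers={"Range": f"bytes={offset}-"})

with urllib.request.urlopen(req) as resp, open(PART, "ab") as out:
    if resp.status == 200:   # server ignored the range: start over from scratch
        out.seek(0)
        out.truncate()
    while chunk := resp.read(64 * 1024):
        out.write(chunk)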
 
Old 09-24-2012, 10:23 PM   #9
konsolebox
Senior Member
 
Registered: Oct 2005
Distribution: Gentoo, Slackware, LFS
Posts: 2,245
Blog Entries: 15

Rep: Reputation: 233
You'll notice the significance of it if you find yourself competing with other users, which is something I always experience. And sometimes it's not only about your local network or your local ISP; it's also about the routers in between. Such matters don't always have something to do with the server.

Last edited by konsolebox; 09-24-2012 at 10:24 PM.
 
Old 09-24-2012, 10:32 PM   #10
273
Senior Member
 
Registered: Dec 2011
Location: UK
Distribution: Debian Sid AMD64&i386, Raspbian Wheezy, various VMs
Posts: 3,230

Original Poster
Rep: Reputation: 759
Quote:
Originally Posted by konsolebox View Post
You'll notice the significance of it if you find yourself competing with other users - that which I always experience. And sometimes it's not only about your local network or your local ISP, it's also about the routers between it. Such matters don't always have something to do with the server.
How does having more connections help you compete? Does the server allot a certain bandwidth per connection? What logic is employed at the server end to allot bandwidth? Do the routers in between allot bandwidth per connection?
I started this thread because I've seen a lot of talk about download accelerators but no actual mathematics or computer science to back it up. I don't like snake oil and I want some real answers from people who know about low-level network protocols.
Apologies for being so blunt, but I've not managed to find anywhere that gets to the bottom of this, and I would love to have a thread to point to with facts from developers and those who understand the details of the protocols.
 
Old 09-24-2012, 11:55 PM   #11
konsolebox
Senior Member
 
Registered: Oct 2005
Distribution: Gentoo, Slackware, LFS
Posts: 2,245
Blog Entries: 15

Rep: Reputation: 233
I actually gave a simple explanation for that before, but it seems you needed a more detailed schematic. I could prove it further if I really wanted to, but I find the effort unnecessary, as it seems that common sense and imagination would be enough to convince myself. And please don't think I don't consider the technical matters of networking. I'm not an expert, but I did study the concepts before, especially how one could implement an asynchronous transfer mode so as not to affect the stability of the connections from overheads during overloads. And like I said before, it depends on the situation. I also mentioned that it depends on the implementation of every router. But yet again, as I said, it -does- help in some situations.

You're just being skeptical and conservative, with an unbalanced, narrow approach to one side. Anyway, if you really want to know, why don't you run a test yourself? First set up two computers on one network with a fixed bandwidth and no other systems sharing it. Let one computer download large files from multiple servers, say 10, 20 or 30. Then on the other computer, just download one file and observe its average connection speed for a minute or two. Afterwards, increase the segments to increase the number of connections, and from that point take the average speed again. If the connection speed -didn't- increase, I was wrong and you were right. If it did, perhaps you need to reconsider your perception of this matter. Why would you need more details than that? It doesn't make sense to me, but if you do, you can study them yourself; those ideas are not difficult to find and understand if they really interest you. One could give an explanation or two here, but it really differs from one router and connection system to another.
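
As a hypothetical illustration of the "competing with other users" point, here is a toy Python calculation. It assumes, as a simplification, that a congested link ends up shared roughly equally per TCP connection; real routers and congestion control don't guarantee this, and all the numbers are made up.

Code:
# Toy model: IF a shared bottleneck splits bandwidth roughly per connection,
# more connections mean a bigger share. All figures are hypothetical.
LINK_KBPS = 10_000        # hypothetical bottleneck capacity
OTHER_FLOWS = 9           # hypothetical connections from other users

for mine in (1, 4, 8):
    share = LINK_KBPS * mine / (mine + OTHER_FLOWS)
    print(f"{mine} connection(s): ~{share:.0f} KB/s of {LINK_KBPS} KB/s")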
 
Old 09-25-2012, 02:51 AM   #12
H_TeXMeX_H
Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Rep: Reputation: 1269
I think it is time for benchmarks. Any download accelerators that people use on Linux? I want to benchmark them. I will also search for them and do it myself.
 
Old 09-25-2012, 03:07 AM   #13
konsolebox
Senior Member
 
Registered: Oct 2005
Distribution: Gentoo, Slackware, LFS
Posts: 2,245
Blog Entries: 15

Rep: Reputation: 233
H_TeXMeX_H: DownThemAll is fine. There's Axel as well.
 
Old 09-25-2012, 03:13 AM   #14
salasi
Senior Member
 
Registered: Jul 2007
Location: Directly above centre of the earth, UK
Distribution: SuSE, plus some hopping
Posts: 3,888

Rep: Reputation: 774
While it is certainly correct that there is a fair amount of snake oil in this field (probably more in the field of Windows download accelerators, where, from what I remember, 'download accelerator' was often a synonym for 'ad- or mal-ware supported program that promises waaay more than it can actually deliver'), there are a few constructive things that can happen:
  • Data can be compressed. Compression doesn't always work (data that is already very effectively compressed, for example), the server(s) at the far end have to understand what is going on, and there is also the compression and decompression time to take into account, but this is something that can help in favourable circumstances (a sketch of this appears after the list).
  • The BitTorrent-type approach can help when server load at the far end would otherwise be an issue. Of course, if the bottleneck is elsewhere, such as your connection to the internet, then obviating a problem that you don't have is less helpful.
  • Having suspend 'n' resume available is probably always constructive if the download is in imminent danger of faltering.
  • At one point, I was experiencing slow data rates with Opera and a hotspot. Previously (a long time previously) I had tested Opera and it had seemed OK, but what was happening this time was a lot of swapping, and the data rate was only about a third of what it otherwise might have been, because it was only getting data for about a third of the time. In this case, a quick change to wget cured the problem, but it does show that there are circumstances in which a lighter download app can make a difference.
  • As part of the same issue, I tried the Opera Turbo feature (it doesn't say whether this works for downloads, but it seemed worth trying, as it seems to use the compression approach). It didn't do anything for me, but then it was suffering from the swapping problem described above, so it might have worked otherwise. Or it might not; I can't really be sure.
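
Since the first bullet mentions compression, here is a minimal sketch in Python (standard library only) of what that looks like at the HTTP level: the client asks for a gzip-encoded transfer and decompresses locally. The URL is a placeholder, and this only helps when the server supports it and the data isn't already compressed (a .tar.gz gains nothing).

Code:
# Minimal sketch of requesting a compressed transfer over HTTP.
# URL is a placeholder; the server may or may not honour Accept-Encoding.
import gzip
import urllib.request

URL = "http://example.com/large-text-file"   # placeholder

req = urllib.request.Request(URL, headers={"Accept-Encoding": "gzip"})
with urllib.request.urlopen(req) as resp:
    body = resp.read()
    if resp.headers.get("Content-Encoding") == "gzip":
        body = gzip.decompress(body)

with open("large-text-file", "wb") as out:
    out.write(body)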

While being wildly optimistic about download accelerators is probably wrong, there are some specific circumstances in which there might be some mileage in one that takes an approach that deals with the problem you are having. Whether that is the same problem anyone else has, and so whether it will work for them because it works for you, is an open question.
 
Old 09-25-2012, 04:01 AM   #15
H_TeXMeX_H
Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Rep: Reputation: 1269
Alright, the benchmarks are here. These were done alternating wget with axel, just in case server load varies over time. It was the same file from the same server every time.

Code:
wget:

2012-09-25 11:51:38 (774 KB/s) - "edgar-1.04-1.tar.gz" saved [26894724/26894724]

real	0m34.377s
user	0m0.058s
sys	0m0.292s

2012-09-25 11:54:09 (691 KB/s) - "edgar-1.04-1.tar.gz" saved [26894724/26894724]


real	0m38.343s
user	0m0.057s
sys	0m0.289s

2012-09-25 11:55:40 (746 KB/s) - "edgar-1.04-1.tar.gz" saved [26894724/26894724]


real	0m35.544s
user	0m0.057s
sys	0m0.302s

axel:

Downloaded 25.6 megabytes in 21 seconds. (1207.53 KB/s)

real	0m21.999s
user	0m0.069s
sys	0m0.390s

Downloaded 25.6 megabytes in 21 seconds. (1224.89 KB/s)

real	0m21.690s
user	0m0.067s
sys	0m0.393s

Downloaded 25.6 megabytes in 20 seconds. (1306.36 KB/s)

real	0m20.352s
user	0m0.065s
sys	0m0.412s

          wget (s)   axel (s)   wget (KB/s)   axel (KB/s)
run 1     34.377     21.999     774           1207
run 2     38.343     21.690     691           1224
run 3     35.544     20.352     746           1306
average   36         21         737           1246
stdev     2          1          42            53

t-test on seconds:                    t-test on KB/s:
  one-tail  0.00358247966428            one-tail  0.00285869349649
  two-tail  0.00716495932856            two-tail  0.00571738699297
I guess it is NOT snake oil in this case. In fact the t-tests show a significant difference between the two.
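
For anyone who wants to re-run the significance test on these timings, here is a sketch in Python (it assumes SciPy is installed; the numbers are copied from the table above). Which t-test variant the original spreadsheet used isn't stated, so the p-values may differ slightly from the one-tail/two-tail figures quoted.

Code:
# Sketch of a significance test on the wget vs. axel timings above.
from scipy import stats

wget_secs = [34.377, 38.343, 35.544]
axel_secs = [21.999, 21.690, 20.352]

# Welch's t-test: does not assume the two samples have equal variance.
t, p_two_tail = stats.ttest_ind(wget_secs, axel_secs, equal_var=False)
print(f"t = {t:.2f}  two-tailed p = {p_two_tail:.5f}  one-tailed p = {p_two_tail / 2:.5f}")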

Last edited by H_TeXMeX_H; 09-25-2012 at 04:05 AM.
 
  

