Internet Download Manager alternative in Linux
I used Windows XP, which had a useful program called Internet Download Manager (IDM) that segments a file while downloading, so multiple parts of the same file are downloaded simultaneously, reducing the time taken to download.
Now I'm on Ubuntu 8.04 and I use wget, which is fine (because I love the command line), but I think it downloads files serially. Is there any software that acts like IDM?
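For reference, segmented downloading boils down to issuing parallel HTTP range requests and reassembling the pieces. A minimal shell sketch of the idea, assuming the server honours Range headers (the URL and file names here are placeholders):
Code:
# Size of the remote file, from the Content-Length response header
# (-sI fetches only the headers; tr strips the trailing CR).
URL="http://example.com/file.iso"
SIZE=$(curl -sI "$URL" | awk '/[Cc]ontent-[Ll]ength/ {print $2; exit}' | tr -d '\r')
HALF=$((SIZE / 2))

# Fetch the two halves in parallel using byte ranges...
curl --range "0-$((HALF - 1))" -o part1 "$URL" &
curl --range "$HALF-" -o part2 "$URL" &
wait

# ...then stitch the pieces back together.
cat part1 part2 > file.iso
|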
Here is a good download manager for Linux:
http://linux.softpedia.com/get/Inter...ader-834.shtml |
I downloaded LinuxInstaller.bin, then made it executable using
Code:
# chmod +x LinuxInstaller.bin
But running it gives:
Code:
# ~lxuser/Desktop/LinuxInstaller.bin
Preparing to install...
Extracting the installation resources from the installer archive...
The size of the extracted files to be installed are corrupted. Please try to download the installer again and make sure that you download using 'binary' mode. Please do not attempt to install this currently downloaded copy.
The file size is 1474560 bytes with md5sum b2b0d9bf8d5e7e152edc5b07e60a281c for /home/lxuser/Desktop/LinuxInstaller.bin. I have also noticed that its 11MB (JRE inclusive) Windows executable says the license has expired.
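A basic sanity check for a corrupted download is to compare the checksum and size of two independently fetched copies: if they differ, the file is being corrupted in transit; if they match, the copy on the server itself is probably bad. Roughly:
Code:
# Checksum and size of the copy already on disk.
md5sum /home/lxuser/Desktop/LinuxInstaller.bin
ls -l /home/lxuser/Desktop/LinuxInstaller.bin

# Fetch a second copy and compare; identical hashes point at the server.
wget -O /tmp/LinuxInstaller-2.bin "http://prdownloads.sourceforge.net/qdown/LinuxInstaller.bin"
md5sum /tmp/LinuxInstaller-2.bin
|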
I ran
Code:
$ wget "http://prdownloads.sourceforge.net/qdown/LinuxInstaller.bin"
then I did
Code:
~/Desktop/OSIndependent $ chmod +x RunMe.sh; export JAVA_HOME=/usr/; ./RunMe.sh
|
At this point I would suggest trying something else if you cannot verify that the file you are downloading is actually intact (on the server end as well). If there is nowhere to ask the software's creator, move on; if there is, ask there directly why the installer fails even though the download itself appears fine.
More generally, I recommend against using these "download managers" in the sense you're referring to (accelerating the download). In short, it is not how things were meant to work, and if you benefit from it, it may hinder others. The wget FAQ makes the same point, citing RFC 2616, which says a single-user client should not maintain more than two simultaneous connections to any server.
|
If you like wget, take a look at axel.
Axel tries to accelerate the downloading process by using multiple connections for one file. It can also use multiple mirrors for one download. Axel tries to be as light as possible (25-30k in binary form), so it might be useful as a wget clone on byte-critical systems. Homepage: http://axel.alioth.debian.org/
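A typical invocation looks something like this (the connection count and URL are just illustrative):
Code:
# Download with 4 parallel connections; -a selects the simpler
# progress indicator and -o names the output file.
axel -n 4 -a -o file.iso "http://example.com/file.iso"
|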
To b0uncer
Your quote about multiple connections going against RFC 2616 was noteworthy.

To craigevil: I have installed axel and it works fine. The only thing is, when I downloaded a file I got the following messages:
Code:
[ 5%] .......... .......... .......... .......... .......... [ 91.2KB/s]
I also think downloading with the Firefox browser isn't reliable; for example, I once got a 41MB file when it should have been 119MB. With wget, you at least get the byte count at the beginning.
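One way to catch a truncated download like that is to compare the size the server advertises in its headers with what actually arrived on disk. A rough check (the URL and file name are placeholders):
Code:
# --spider fetches headers only; the Content-Length line is the size
# the server claims to be serving.
wget --spider --server-response "http://example.com/file.iso"

# Compare against the size of the file you actually received.
ls -l file.iso
|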
Hey guys, have you tried DownThemAll! (DTA)?
It's a Firefox extension, but a dead-useful download manager, and it's free. I don't know much about multiple connections, but as far as I remember it supports them. It's also very speedy and quite easy to use with Firefox, I mean, for big downloads. https://addons.mozilla.org/en-US/firefox/addon/201/ |
DownThemAll! I have used that extension; it's especially useful for grabbing all the files when browsing on a server.
I like wget because it shows the real communication. Also, the command line is mostly superior to a graphical interface, I believe. After all, commands form the foundation of a Linux OS and are available in any flavour of distro. |
Internet Download Manager for Linux
I was googling around for the best download manager out there for Linux, but most of what I've tried has failed me.
Then I found flareGet, and it's awesome (still in beta, though). It's truly an IDM-alternative app for Linux, and it uses multi-threading with up to 32 segments/connections. http://flareget.net16.net |
Since you revived the thread anyway, how about uGet?
And it seems DownThemAll has been doing great since. |
If I'm maxing out my bandwidth downloading something, how does having multi-thread technology with 32 connections help me download it faster? Similarly, if I'm maxing out the connection of the server I'm downloading from, how does it help there? I've not seen an answer to that, and until I do I'll treat download accelerators as snake oil. The only ways I can see to speed up a download are to get the other end to compress the file if it isn't already compressed, or to download the same file from multiple sources if each source has a poor upload speed. Both only work in certain specific circumstances. |
32 connections is already extreme and doesn't help much. It also adds a lot of overhead: for example, when a segment finishes, the server can keep sending data the client will no longer accept until the connection simply times out. That stresses the server, and some servers ban clients that abuse them. Sometimes 4 connections are enough; even 8 is already too many. Download accelerators help sometimes and sometimes they don't, especially with small files, but it depends on the connection set-up and the files being fetched.
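Also worth noting: segmenting only works at all when the server supports HTTP range requests. A quick way to check (placeholder URL):
Code:
# If the response includes "Accept-Ranges: bytes", the server accepts
# byte-range requests, so a segmented download is at least possible.
curl -sI "http://example.com/file.iso" | grep -i accept-ranges
|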
I don't see how any number of separate connections between a client and a server can ever speed up the exchange of information between them if either side has reached its maximum rate.
I've downloaded files at ~100Mbps before (not from home, sadly), so demonstrably the network protocols and hardware can handle at least ten times the speed most people can expect in a home setting. So how does adding extra download threads make things faster, when the network stack is already capable of scaling to ten times my speed? The limiting factor in any network path is the slowest link; you cannot work around that by trying to pull more data through it at the same time. |