LinuxQuestions.org


kennethf 03-09-2012 04:13 AM

Large http downloads hang
 
Large HTTP and HTTPS downloads stall after a minute or so. It happens in any browser and in the bash shell, and it doesn't matter which user is logged in.

The weird part is that when I do the download in Firefox, the download window hangs as usual but when I press [pause] then [resume], downloading continues where it left off.

FTP downloads don't have this problem, so huge files can be downloaded without user intervention.

It behaves as though http(s) downloads require periodic user intervention. Is this configurable somewhere? TIA

Mandriva 2011.0
KDE 4.6.5

njlinuxmike 03-09-2012 11:55 AM

Do you observe that the FTP and HTTP speeds are similar or dissimilar? If the FTP connection is using less bandwidth, I would suspect it may be your router that is hanging and not actually Firefox. You can also test and rule out Firefox as the culprit by attempting the same download (the one that fails in the browser) on the command line with wget. If wget works fine, then perhaps it's a bug in FF.
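For example, something along these lines (the URL is just a placeholder; point it at whichever file actually stalls in the browser):

# Placeholder URL -- substitute the download that hangs in the browser.
# -c resumes a partial file; --progress=dot:mega keeps the output readable for big files.
wget -c --progress=dot:mega http://example.com/largefile.iso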

Cheers

Mike

kennethf 03-10-2012 12:15 AM

It doesn't hang in Windows, nor while downloading large binaries from newsgroups, so the hardware can be ruled out.

It does hang from the Linux command line and in browsers other than Firefox. I'm wondering whether Linux allows rules on certain ports, for example something that pauses port 80 (HTTP) but not port 20 (FTP).

njlinuxmike 03-10-2012 12:25 AM

Quote:

Originally Posted by kennethf (Post 4623186)
It doesn't hang in Windows, nor while downloading large binaries from newsgroups, so the hardware can be ruled out.

It does hang from the Linux command line and in browsers other than Firefox. I'm wondering whether Linux allows rules on certain ports, for example something that pauses port 80 (HTTP) but not port 20 (FTP).

There are ways to throttle traffic in Linux, but I've never heard of anything that deliberately causes pauses. Either way, I'm positive no distro would implement that out of the box, so unless you set it up yourself I think it's safe to say that's not it.
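If you want to double-check, any traffic shaping or firewall rules are easy to list (eth0 here is an assumption; substitute your actual interface):

# Show queueing disciplines (traffic shaping) on the interface; a stock install
# normally shows only the default pfifo_fast/mq qdisc.
tc -s qdisc show dev eth0

# List firewall rules with packet counters; look for anything matching dpt:80 or dpt:443.
iptables -L -n -v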

Are you wireless or wired? Could be a driver bug either way.

kennethf 03-10-2012 12:39 AM

Wired. I'm sure no distro would do it out of the box either. I'm just guessing.

Does http use a cache or buffer somewhere that's maybe not big enough?

njlinuxmike 03-10-2012 12:42 AM

Quote:

Originally Posted by kennethf (Post 4623200)
Wired. I'm sure no distro would do it out of the box either. I'm just guessing.

Does http use a cache or buffer somewhere that's maybe not big enough?

Perhaps your browser's cache or memory settings? Do some googling and check the "about:config" settings in Firefox to see if there is a culprit in there. Also, you can watch what top tells you while attempting a download and see if you spot something suspicious. I do remember there being a few memory/cache type settings in the browser.
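If you suspect a buffer outside the browser, the kernel's TCP socket buffer limits are easy to inspect too (just a diagnostic sketch; these sysctl names are standard, and the values will be whatever your kernel defaults to):

# Per-socket receive/send buffer limits: min, default, max (bytes).
sysctl net.ipv4.tcp_rmem net.ipv4.tcp_wmem

# Global ceilings used by socket buffer autotuning.
sysctl net.core.rmem_max net.core.wmem_max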

Good luck.

mgichoga 03-10-2012 03:23 PM

What about using wget to download the file and log the resulting activity? This will tell you whether it's a network problem or an application-specific issue.

wget https://someurl/somefile.zip --output-file=logfile

kennethf 03-11-2012 06:51 PM

I was finally able to complete the download inside the bash shell by periodically pressing ctrl-s (pause) and ctrl-q (resume) until it finished. Just like pressing the pause and resume keys while downloading with the GUI.

Somehow it limits the amount of http(s) data it will permit until more is explicitly requested.
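To dig further, I may capture one of the stalled transfers and check in Wireshark whether the TCP receive window drops to zero (eth0 is just my guess at the interface name; adjust as needed):

# Capture the HTTP transfer into a file Wireshark can read; Ctrl-C to stop once the download stalls.
tcpdump -i eth0 -s 0 -w http-stall.pcap 'tcp port 80'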

Thanks. I'm still looking for a real solution.

