LinuxQuestions.org
Linux - Software forum
Old 09-25-2014, 12:16 PM   #1
TruongAn
Member
 
Registered: Dec 2004
Location: Vietnam (Việt Nam)
Distribution: Gentoo (desktop), Arch linux (laptop)
Posts: 728

Rep: Reputation: 33
wget quits with "memory exhausted" while downloading large files recursively


I want to download a whole directory listing served by Apache on my server. I use "wget -r" for that. The directory contains some very large files, so I specified the -c switch to resume in case the download process is interrupted midway. The exact options I used were:
Code:
wget  -c -r -np -nH -R 'index*' --cut-dirs=1
And then comes the problem: when I resume downloading, if wget encounters a fully retrieved file, it hangs and takes a lot of memory before proceeding to the next file. The larger the file, the more memory it takes. And if the file is too large (8 GiB in my case), wget stops with a "memory exhausted" message. The last output it showed me (in --debug mode) was:
Code:
---response end---
416 Requested Range Not Satisfiable

    The file is already fully retrieved; nothing to do.

Disabling further reuse of socket 3.
Closed fd 3
The server is just an Apache web server with default settings. I don't suspect the server is the cause, and I really have no more ideas where to look. Please help.
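One way to sidestep the resume probe entirely is to skip files that already exist locally before invoking wget, so no Range request (and no 416 handling) ever happens for completed files. A minimal sketch; the URL and filenames are placeholders, not from this thread, and the fetch command is stubbed with echo so the logic can be dry-run safely (set FETCH="wget -c" for real use):

```shell
#!/bin/sh
# Sketch: skip files already on disk so wget never sends a Range request
# for completed downloads. FETCH defaults to a dry-run echo.
FETCH="${FETCH:-echo wget -c}"

fetch_missing() {
    base_url=$1; shift
    for f in "$@"; do
        if [ -s "$f" ]; then
            echo "skip $f"            # already present locally; leave it alone
        else
            $FETCH "$base_url/$f"     # resume or start the download
        fi
    done
}

fetch_missing "http://example.com/pub" a.iso b.iso
```

The same effect can often be had with wget's own -nc (--no-clobber) flag, which skips existing files outright instead of probing the server about them.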

Last edited by TruongAn; 09-29-2014 at 11:46 AM.
 
Old 09-27-2014, 04:11 AM   #2
business_kid
LQ Guru
 
Registered: Jan 2006
Location: Ireland
Distribution: Slackware & Android
Posts: 10,693

Rep: Reputation: 1187
If you ran out of memory, it is the kernel that would have shut your connection down, not Apache.

That said, wget behaving like that is strange. The code dates, AFAIK, from the days when memory wasn't as plentiful as it is today.

Edit: The compare function could be passed out to another utility (e.g. diff), which may be memory-hungry.

Last edited by business_kid; 09-27-2014 at 04:14 AM.
 
Old 09-29-2014, 11:48 AM   #3
TruongAn
Member
 
Registered: Dec 2004
Location: Vietnam (Việt Nam)
Distribution: Gentoo (desktop), Arch linux (laptop)
Posts: 728

Original Poster
Rep: Reputation: 33
Quote:
Originally Posted by business_kid View Post
If you ran out of memory, it is the kernel that would have shut your connection down, not Apache.

That said, wget behaving like that is strange. The code dates, AFAIK, from the days when memory wasn't as plentiful as it is today.

Edit: The compare function could be passed out to another utility (e.g. diff), which may be memory-hungry.
I too thought it was not an Apache problem. Is there anything I can do on the wget side?
 
Old 09-30-2014, 09:04 AM   #4
business_kid
LQ Guru
 
Registered: Jan 2006
Location: Ireland
Distribution: Slackware & Android
Posts: 10,693

Rep: Reputation: 1187
Yes - figure out the meaning of this message:
Code:
   The file is already fully retrieved; nothing to do.
To me it seems to be saying that your wget was successful. If wget is hitting the memory limit while using the -c option, stop using it.
 
Old 10-01-2014, 10:42 AM   #5
TruongAn
Member
 
Registered: Dec 2004
Location: Vietnam (Việt Nam)
Distribution: Gentoo (desktop), Arch linux (laptop)
Posts: 728

Original Poster
Rep: Reputation: 33
Quote:
Originally Posted by business_kid View Post
Yes - figure out the meaning of this message:
Code:
   The file is already fully retrieved; nothing to do.
To me it seems to be saying that your wget was successful. If wget is hitting the memory limit while using the -c option, stop using it.
How can I resume an interrupted download without the -c option, then?
 
Old 10-01-2014, 02:50 PM   #6
business_kid
LQ Guru
 
Registered: Jan 2006
Location: Ireland
Distribution: Slackware & Android
Posts: 10,693

Rep: Reputation: 1187
You should have the directory tree there already. I would fetch each file with 'wget -c url/file' and no other options, then process the lot once you have it all down.
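The per-file approach above is easy to script. A sketch that turns a plain-text file listing into individual resumable wget invocations; the listing format and URL are assumptions for illustration, and printing the commands first gives a dry run you can review before piping to sh:

```shell
#!/bin/sh
# Sketch: emit one "wget -c" command per file named on stdin.
# Review the output, then actually download with:
#   build_cmds < listing.txt | sh
BASE="http://example.com/dir"

build_cmds() {
    while IFS= read -r f; do
        [ -n "$f" ] || continue                    # skip blank lines
        printf 'wget -c "%s/%s"\n' "$BASE" "$f"
    done
}

printf 'a.iso\nb.iso\n' | build_cmds
```

Fetching one file per wget invocation also keeps each process short-lived, so whatever per-file state is accumulating in the recursive run never gets the chance to grow.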
 
  

