09-25-2014, 12:16 PM  #1
Member
Registered: Dec 2004
Location: Vietnam (Việt Nam)
Distribution: Gentoo (desktop), Arch linux (laptop)
Posts: 728
wget quits with "memory exhausted" while recursively downloading large files
I want to download a whole directory listing served by Apache on my server, so I use "wget -r". The directory contains some very large files, so I added the -c switch to resume in case the download is interrupted midway. The exact options I used were:
Code:
wget -c -r -np -nH -R 'index*' --cut-dirs=1
And then comes the problem: when I resume downloading, if wget encounters a fully retrieved file, it hangs and eats a lot of memory before proceeding to the next file. The larger the file, the more memory it takes. If the file is too large (8 GiB in my case), wget stops with a "memory exhausted" message. The last output it showed me (in --debug mode) was
Code:
---response end---
416 Requested Range Not Satisfiable
The file is already fully retrieved; nothing to do.
Disabling further reuse of socket 3.
Closed fd 3
The server is just an Apache web server with default settings. I don't suspect the server is the cause, and I really have no more ideas where to look. Please help.
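For what it's worth, the 416 response can be checked without wget by asking the server for a byte range starting past the end of a file that is already complete. The URL and offset below are placeholders, not my real server:
Code:
# Request a range starting beyond the end of an already-complete file.
# A default Apache should answer "416 Requested Range Not Satisfiable".
curl -s -o /dev/null -w '%{http_code}\n' \
     -H 'Range: bytes=8589934592-' \
     http://example.com/dir/bigfile.iso
If that prints 416 on its own, the server side is behaving normally and the memory use is purely on the wget side.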
Last edited by TruongAn; 09-29-2014 at 11:46 AM.
09-27-2014, 04:11 AM  #2
LQ Guru
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,200
If you ran out of memory, it is the kernel that would have shut your connection down, not Apache.
That said, wget producing an error like that is strange. The code dates, afaik, from the days when memory wasn't as plentiful as it is today.
Edit: The compare step could be handed off to another utility (e.g. diff), which may be memory hungry.
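To confirm which side is actually running out of memory, a couple of generic checks (nothing specific to your setup is assumed here):
Code:
# Any sign of the kernel's OOM killer after the failure?
dmesg | grep -iE 'out of memory|oom'

# Peak memory of wget itself on the next attempt (GNU time, not the shell builtin);
# look at the "Maximum resident set size" line in its report.
/usr/bin/time -v wget -c -r -np -nH -R 'index*' --cut-dirs=1 <url>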
Last edited by business_kid; 09-27-2014 at 04:14 AM.
09-29-2014, 11:48 AM  #3
Member
Registered: Dec 2004
Location: Vietnam (Việt Nam)
Distribution: Gentoo (desktop), Arch linux (laptop)
Posts: 728
Original Poster
Quote:
Originally Posted by business_kid
If you ran out of memory, it is the kernel that would have shut your connection down, not Apache.
That said, wget producing an error like that is strange. The code dates, afaik, from the days when memory wasn't as plentiful as it is today.
Edit: The compare step could be handed off to another utility (e.g. diff), which may be memory hungry.
I too thought it was not an Apache problem. Is there anything I can do on the wget side?
09-30-2014, 09:04 AM  #4
LQ Guru
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,200
Yes - figure out the meaning of this message
Code:
The file is already fully retrieved; nothing to do.
To me it seems to be saying that your wget was successful. If wget is hitting max memory while using the -c option, stop using it.
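Not something raised in this thread, but if the goal is just to skip files that are already complete on a re-run, wget's timestamping option is one commonly used alternative to -c for recursive fetches. A rough sketch with a placeholder URL:
Code:
# Re-run the recursive fetch with -N (timestamping) instead of -c:
# files that already exist locally and are up to date are skipped entirely,
# so no byte-range resume is attempted on them.
wget -N -r -np -nH -R 'index*' --cut-dirs=1 http://example.com/dir/
Note that unlike -c, a half-finished file is fetched again from the start rather than resumed.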
10-01-2014, 10:42 AM  #5
Member
Registered: Dec 2004
Location: Vietnam (Việt Nam)
Distribution: Gentoo (desktop), Arch linux (laptop)
Posts: 728
Original Poster
Quote:
Originally Posted by business_kid
Yes - figure out the meaning of this message
Code:
The file is already fully retrieved; nothing to do.
To me it seems to be saying that your wget was successful. If wget is hitting max memory while using the -c option, stop using it.
How can I resume an interrupted download without the -c option, then?
10-01-2014, 02:50 PM  #6
LQ Guru
Registered: Jan 2006
Location: Ireland
Distribution: Slackware, Slarm64 & Android
Posts: 17,200
You should already have the directory tree there. I would run 'wget -c url/file' on each file with no other options, and process things once you have it all down.
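A rough sketch of that per-file approach, with placeholder paths (files.txt would hold one relative path per line, taken from the directory tree you already have):
Code:
# Resume each large file individually with plain 'wget -c',
# avoiding the recursive walk that was exhausting memory.
base=http://example.com/dir        # placeholder for the real base URL
while read -r f; do
    wget -c "$base/$f"
done < files.txt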