Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!


Old 06-11-2005, 02:21 AM   #1
Registered: Apr 2004
Posts: 682

Rep: Reputation: Disabled

I've been trying to download a file, but the download gets disconnected midway. I want wget to resume from where it left off.

The current switches I am using are:

wget -np -nc -r ftp://file.tar.gz.

From the manpage I read that I should use the -c switch, but it is not working. wget just skips the half-downloaded file instead of comparing the local copy with the server copy.

wget -np -nc -r -c ftp://file.tar.gz.

Any help would be appreciated.
Old 06-11-2005, 05:37 AM   #2
Registered: Jan 2005
Location: India
Distribution: RHEL,CentOS,SUSE,Solaris10
Posts: 183

Rep: Reputation: 31
wget -c http://siteurl

Check whether the site supports resumable downloading; not all servers do, and that may be the reason.
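A side note on the original command line (URL below is a placeholder, not from this thread): `-nc` (`--no-clobber`) tells wget to skip any file that already exists locally, which defeats resuming, so drop it when using `-c`. A minimal sketch:

```shell
# Sketch: resume with -c alone; -nc (--no-clobber) would make wget
# skip the existing partial file instead of continuing it.
build_cmd() {
    printf 'wget -c %s' "$1"
}
build_cmd "http://example.com/file.tar.gz"
```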
Old 06-11-2005, 06:05 PM   #3
Registered: Mar 2005
Distribution: slackware 11, arch 2007.08
Posts: 154

Rep: Reputation: 30
You might also want to combine the r and the c into one switch:
wget -np -nc -rc ftp://file.tar.gz.
I don't know if it will matter, but it saves you some typing.
Old 06-11-2005, 08:40 PM   #4
LQ Addict
Registered: Jul 2002
Location: East Centra Illinois, USA
Distribution: Debian stable
Posts: 5,908

Rep: Reputation: 355
Did you cd to the directory where the partial file resides? The man pages specifically state that the -c option will resume download if the partial file is in the current directory (meaning the present working directory). If you have wget set up to write the file to a specific directory, and issue the command from another directory, it won't recognize that the partial file exists.
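A sketch of that point (all paths here are illustrative, not from the thread): put yourself in the directory that holds the partial file before running `wget -c`, since that is where it looks for the partial copy.

```shell
# Simulate a partial download in a scratch directory.
dir=$(mktemp -d)
printf 'partial data' > "$dir/file.tar.gz"

# Run wget from the directory that holds the partial file, e.g.:
#   cd "$dir" && wget -c ftp://example.com/file.tar.gz
# wget -c only sees ./file.tar.gz if it is in the working directory.
cd "$dir"
ls file.tar.gz
```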
Old 06-14-2005, 04:52 AM   #5
Registered: Apr 2004
Posts: 682

Original Poster
Rep: Reputation: Disabled
Nothing works. I have also tried "wget -np -nc -r -c"; it gives the same old messages: "file already there, not retrieving"..."permission denied".
Old 06-24-2005, 06:11 AM   #6
Registered: Apr 2004
Posts: 682

Original Poster
Rep: Reputation: Disabled
Old 06-24-2005, 06:31 AM   #7
Registered: Oct 2004
Location: Belgium
Distribution: Slackware 13.37
Posts: 512

Rep: Reputation: 31
If you post your solution here, people will be able to use it if they have the same problem.
Old 06-24-2005, 02:06 PM   #8
LQ Newbie
Registered: Jun 2005
Posts: 26

Rep: Reputation: 15
Originally posted by noir911
How did you solve it?
Old 07-30-2005, 09:57 AM   #9
Registered: Apr 2004
Posts: 682

Original Poster
Rep: Reputation: Disabled
First off, extremely sorry for not posting the solution. My apologies.

Here's the solution:

If you want wget to check the local folder, compare the files there against the server, and finish any partially downloaded files, use:

wget --passive-ftp -c -r -np ftp://

The specifics of all the switches can be found in the wget manpage.
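For readers landing here later, a sketch of that command with the switches spelled out (the host and path are placeholders; the original post truncated the URL):

```shell
flags="--passive-ftp -c -r -np"
# --passive-ftp : passive-mode FTP, friendlier to firewalls and NAT
# -c            : continue partially downloaded files
# -r            : retrieve recursively
# -np           : never ascend to the parent directory
url="ftp://example.com/pub/"   # placeholder host/path
echo wget $flags "$url"
```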


