LinuxQuestions.org
Linux - General forum
Old 12-24-2005, 07:30 AM   #1
gintaras46
LQ Newbie
 
Registered: Dec 2005
Posts: 7

Rep: Reputation: 0
wget re-downloads all files greater than 10M


Greetings to all,

I've got wget working in mirroring mode. The only thing I can't find in the manuals is why wget (used with the -m option) re-downloads every file greater than 10M, even when the remote and local timestamps are the same. Any suggestions?
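For reference, -m turns on timestamping (-N), whose documented rule is: re-fetch a file only if the remote copy is newer than the local one or the sizes differ. A minimal no-network sketch of that rule, with made-up file names standing in for the local and remote copies:

```shell
# Sketch of wget -N's documented decision rule, simulated locally.
# local.dat / remote.dat are hypothetical stand-ins for the two copies.
printf 'hello' > local.dat
printf 'hello' > remote.dat
touch -r local.dat remote.dat            # give both the same mtime

local_size=$(wc -c < local.dat)
remote_size=$(wc -c < remote.dat)
if [ remote.dat -nt local.dat ] || [ "$local_size" -ne "$remote_size" ]; then
  echo retrieve
else
  echo skip                              # same size, same mtime: should skip
fi
```

If wget retrieves anyway despite matching timestamps, the size it parsed from the server's listing is the likely suspect, since the mtime check alone would have skipped the file.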
 
Old 12-24-2005, 07:42 AM   #2
linmix
Senior Member
 
Registered: Jun 2004
Location: Spain
Distribution: FC5
Posts: 1,993
Blog Entries: 1

Rep: Reputation: 46
Not sure, but maybe you should use the -c option as well.
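For what it's worth, -c (--continue) only affects partially downloaded files: it appends the missing tail instead of restarting from byte zero. A no-network sketch of that behaviour (file names invented for illustration):

```shell
# Conceptual simulation of wget -c: resume by appending only the bytes
# the interrupted copy is missing. No network involved.
printf 'ABCDEFGHIJ' > remote.bin          # full "remote" file, 10 bytes
printf 'ABCDE'      > partial.bin         # interrupted local copy, 5 bytes

have=$(wc -c < partial.bin)
tail -c +$((have + 1)) remote.bin >> partial.bin   # resume from byte 6
cmp -s remote.bin partial.bin && echo "resumed OK"
```

So -c would not by itself explain already-complete files being re-fetched.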
 
Old 12-24-2005, 07:53 AM   #3
trickykid
LQ Guru
 
Registered: Jan 2001
Posts: 24,149

Rep: Reputation: 269
Quote:
Originally Posted by linmix
Not sure, but maybe you should use the -c option as well.
And/or give us the exact command you're using, for more clues....
 
Old 12-24-2005, 08:05 AM   #4
gintaras46
LQ Newbie
 
Registered: Dec 2005
Posts: 7

Original Poster
Rep: Reputation: 0
If I understand correctly, the -c option is used when one needs to continue downloading a partially downloaded file. My problem is that these files are already downloaded (mirrored), but when a new session begins (once a week) they are downloaded a second, third ... nth time, again and again, even though they have not changed (size and timestamp are equal locally and remotely). Here is some output from wget:

bash$ wget -m ftp://userwd@myserver.mydomain.com --directory-prefix=/somedirectory

(... connection and all the other stuff)

Now, files that are smaller than 10M just pass through if they were not changed:

Remote file no newer than local file '/path/to/file' --not retrieving.

The sizes do not match (local 22797962) -- retrieving.

--15:55:25-- ftp://myserver.mydomain.com/myfile.any => /path/to/file/myfile.any

==> CWD /path/to/directory ... done
==> PASV ... done
==> RETR myfile.any ... done

Length: 22,797,962 (22M) (unauthoritative) <I have root privileges to this dir and all the files under it. And so it goes with every file larger than 10M>

(Please pay attention to the file sizes -- they are the same. The timestamp is also the same.) Maybe the progress bar is somehow involved in this (I know that's a dumb question, but I'm confused already)? As far as I know, each "=" there represents some amount of bytes. But I can't believe that's the cause.
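One way to narrow this down: since -m keeps the server's .listing file in each mirrored directory, you can compare the size column wget parsed out of that listing against the local file yourself. The 128-byte file and the ls-style listing line below are fabricated stand-ins for illustration:

```shell
# Fabricated fixture: a small local file plus an ls-style .listing line
# like the ones wget saves when mirroring over FTP.
head -c 128 /dev/zero > myfile.any
cat > .listing <<'EOF'
-rw-r--r--   1 ftp  ftp       128 Dec 24 2005 myfile.any
EOF

local_size=$(wc -c < myfile.any)
remote_size=$(awk '$NF == "myfile.any" { print $5 }' .listing)
echo "local=$local_size remote=$remote_size"
```

If those two numbers disagree for the large files in the real mirror, the listing (or wget's parse of it) is what triggers "The sizes do not match", not the timestamps.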

Last edited by gintaras46; 12-24-2005 at 08:25 AM.
 
  

