Downloading whole directories
Can I use wget to download whole http directories? What about d4x?
|
Sure, you can even use wget to mirror entire websites. See man wget for all the options and the full documentation. |
You can use "wget -m" to recursively download a site using page links.
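For example, a minimal mirroring invocation might look like this (the URL is a placeholder, not a real site from the thread):

```shell
# -m (--mirror) turns on recursion, infinite depth, and timestamping,
# so re-running the command only fetches files that changed.
# http://example.com/ is a hypothetical URL.
wget -m http://example.com/
```

Note that -m is shorthand for -r -N -l inf --no-remove-listing.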
|
So, if I write:
$ wget -m http://domain.com/~user/dir1/
will it get all files within dir1? If there are subdirectories, will wget get them too? I can't imagine it is that easy... |
It won't get all the files, since there is no way to get a plain file list back from an HTTP server; that is what FTP is for. Using wget with -m will follow the links in each page and download the documents they point to. If there are no index pages and Apache generates directory listings (the Indexes option), then it should copy down all the directories.
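One caveat with -m on a subdirectory is that wget may follow links upward and recreate the full path on disk. A sketch of flags that keep the grab limited to dir1 (using the example URL from the question above):

```shell
# -r           recurse through links
# -np          (--no-parent) never ascend above dir1
# -nH          don't create a domain.com/ host directory
# --cut-dirs=2 drop the leading "~user/dir1" path components
#              so files land in the current directory
wget -r -np -nH --cut-dirs=2 http://domain.com/~user/dir1/
```

This still depends on the server exposing directory listings or index pages that link to every file; anything unlinked will not be found.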
|