wget - retrieving one folder of a website
I'm currently trying hard to do the following:
I need to copy the contents of one folder of a website, e.g. http://website/folder1. The problem I'm getting is that wget tries to fetch every file on the server http://website, and if there is an href to an external site like amazon.com, wget tries to retrieve amazon as well. What parameters should I run wget with to stop this behaviour? I'd just like to do something like the imaginary "cp -fr http://website/folder1 ." and just that: no parents retrieved, no external links followed. So far, all I've found is wget -t0 -m http://website/folder, but that doesn't work for my purposes. Thanks a lot |
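For reference, the kind of invocation being asked for might look like this (a sketch, reusing the placeholder URL from the post):
Code:
# Fetch only folder1, recursively, without climbing to the parent
# directory and without recreating a "website/" directory locally:
wget -r -np -nH http://website/folder1/
# -np (--no-parent) stops the whole-server pull; recursive wget also
# normally stays on the starting host unless -H/--span-hosts is given,
# so external links such as amazon.com are not followed.
|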
wget http://address/folder/*.* should work well
|
Wildcards are not allowed in HTTP...
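A close substitute over HTTP is wget's accept list, which filters what a recursive crawl actually keeps (a sketch; the patterns and URL are placeholders):
Code:
# HTTP has no server-side wildcards, but -A/--accept filters the
# files a recursive fetch saves:
wget -r -np -nH -A "*.jpg,*.png" http://website/folder1/
|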
|
Anyway, I use --cut-dirs so the full remote path is not recreated locally, and -X to leave some directories out. If you want to exclude one or more directories from the download, you can use the -X option.
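For example, combining the two (a sketch; the URL and the excluded directory name are placeholders, not from the thread):
Code:
# Mirror folder1 without the host/path prefix locally, and skip one
# unwanted subdirectory:
wget -r -np -nH --cut-dirs=1 -X /folder1/unwanted http://website/folder1/
# --cut-dirs=1 strips the leading "folder1/" from saved paths;
# -X takes a comma-separated list of remote directories to exclude.
|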
Thanks a lot
|