using wget
Hi,
I want to use wget; I've read the manual pages but they're not very clear to me.

Let's suppose I want to download all text files located on http://www.linuxmanuals.org/. The site is subdivided into:

http://www.linuxmanuals.org/software
http://www.linuxmanuals.org/hardware
http://www.linuxmanuals.org/installation
http://www.linuxmanuals.org/security
...

On the first page (homepage) there are no text files I'm interested in, but /software, /hardware, /installation and /security have direct links to the text files. So if I've figured it out right, I need to download all txt files, 1 link deep from http://www.linuxmanuals.org/.

By the way, is there any way to save the files it downloads into a specified directory?

Now my question is: what should my command look like? Something like

wget -A txt -l 1 http://www.linuxmanuals.org/

or am I completely wrong?

Many thx
Wannes
your example seems to work for me, but I'd suggest using -r and -l 2
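and to answer your directory question: wget's -P (--directory-prefix) option sets where the downloaded files land. A sketch, untested against that site, with "manuals" as a made-up directory name:

wget -r -l 2 -A txt -P manuals http://www.linuxmanuals.org/

wget will create the directory for you if it doesn't exist yet.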
-r option
what's the difference between the -A and -r options? -r seems to work for me, but -A doesn't
from the manpage for wget:
-A acclist, --accept acclist
-R rejlist, --reject rejlist
    Specify comma-separated lists of file name suffixes or patterns
    to accept or reject (see the Types of Files section for more details).

-r is recursive!!!
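in other words, -A on its own only filters which file names get kept; it doesn't follow any links. The recursion has to come from -r. Putting the two together (again just a sketch, not tested on that site; -np/--no-parent is optional and only keeps wget from climbing above the starting directory):

wget -r -l 2 -np -A txt http://www.linuxmanuals.org/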