Hi,
I want to use wget. I've read the manual pages, but they're not very clear to me.
Let's suppose I want to download all the text files located on:
http://www.linuxmanuals.org/
The site is subdivided into:
http://www.linuxmanuals.org/software
http://www.linuxmanuals.org/hardware
http://www.linuxmanuals.org/installation
http://www.linuxmanuals.org/security
...
The first page (the homepage) doesn't really contain any text files I'm interested in, but /software, /hardware, /installation, and /security have direct links to the text files.
So if I've figured it out correctly, I need to download all .txt files, one link deep, from
http://www.linuxmanuals.org/
By the way, is there any way to save the files it downloads into a specified directory?
Now my question is: what should my command look like? Something like
wget -A txt -l 1
http://www.linuxmanuals.org/
or am I completely wrong?
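From what I can make of the man page, my best guess at the full command would be something like the sketch below. The -r, -l, -A, -nd, and -P options all appear in the wget manual; the "manuals/" directory name is just an example I made up:

```shell
# -r           recurse into linked pages
# -l 1         but only follow links one level deep from the start page
# -A txt       accept (keep) only files ending in .txt
# -nd          don't recreate the site's directory tree locally
# -P manuals/  save everything into a "manuals/" directory (example name)
wget -r -l 1 -A txt -nd -P manuals/ http://www.linuxmanuals.org/
```

If I understand -P right, that would also answer my own question about saving into a specified directory, but I'm not sure I've combined the options correctly.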
Many thx
Wannes