I know there are posts about copying entire websites with Wget (Linux), but how do I copy only a particular directory? For example, suppose I'm at a website that offers free eBooks and the URL looked like this:
(I made this url up)
How can I tell Wget to:
1) Copy only what is in the 'romance' directory?
2) Copy the HTML files first?
3) Specify which filetypes to include in the download?
I have a few other questions:
1) Is it possible to increase the timeout?
2) Is it possible to set an option for resume?
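For reference, here is roughly the command I have in mind after skimming the man page. The URL and filetype list are just placeholders, and I'm not sure these flags combine correctly for what I'm describing:

```shell
# Placeholder URL and filetypes -- substitute the real ones.
# -r          recurse into linked pages
# -np         (--no-parent) don't climb above the 'romance' directory
# -A          accept only files with these extensions
# -c          resume partially downloaded files
# --timeout   raise the network timeout, in seconds
wget -r -np -c --timeout=60 \
     -A html,pdf,epub \
     http://example.com/ebooks/romance/
```

My (possibly wrong) understanding is that in recursive mode Wget has to fetch the HTML pages first anyway, in order to discover the links to follow, so point 2 may come for free.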
Thanks, and sorry if this is a repeat. If it is, perhaps you could point me to the URL covering the topic. I did search the newbie section before posting this.
Please don't suggest Webhttrack...thanks.