Download/save a site with wget
I want to save a couple of documentation sites on my server. I tried using wget, but it doesn't work as expected: the structure of the site changes after the download, and external links (to other sites) are not saved.
I use, for example, "wget -Nkr --tries=8 --no-parent --level=3 --dot-style=binary --cut-dirs=3". Has anybody done this with wget? Perhaps I'm not using the right parameters. I've seen some Windows programs that do this. Any ideas?
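The changed local structure usually happens because links inside the downloaded pages are not rewritten to point at the local copies. A sketch of an invocation that handles this, assuming GNU Wget (the URL is a placeholder, not a real site from the thread):

```shell
# Mirror a documentation site for offline browsing (sketch; assumes GNU Wget).
# -m   : mirror mode (recursive retrieval with timestamping)
# -k   : convert links in downloaded pages to point at the local copies
# -E   : append .html where needed so pages open correctly offline
# -p   : also fetch page requisites (CSS, images, scripts)
# -np  : do not ascend into the parent directory
# --wait=1 adds a polite delay between requests.
# docs.example.org/manual/ is a placeholder for the actual documentation site.
wget -mkEp -np --wait=1 --tries=8 https://docs.example.org/manual/
```

With -k applied after the download, the saved pages link to each other locally instead of back to the live site, which should keep the on-disk structure browsable.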
I'm not sure about the external links. Do you want to save all the external content that's linked from the site? If so, I don't know whether or how you could achieve that.
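For what it's worth, wget can follow links to other hosts if told to span hosts, and the crawl can be fenced in with a domain whitelist so it doesn't try to download the whole web. A sketch, assuming GNU Wget (domain names are placeholders):

```shell
# Sketch: also retrieve externally linked content (assumes GNU Wget).
# -H                : span hosts, i.e. follow links that leave the start site
# --domains=...     : restrict spanning to a comma-separated whitelist
# -l 2              : cap recursion depth, important when spanning hosts
# -rkEp             : recursive, convert links, fix extensions, get requisites
# Both domain names below are placeholders for illustration.
wget -rkEp -H --domains=docs.example.org,other.example.com -l 2 \
     https://docs.example.org/manual/
```

Without a --domains (or --exclude-domains) restriction, -H will happily recurse into every linked site, so keeping the depth low and the whitelist tight matters here.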