is there a linux program that downloads an entire website?
|
If you are using Firefox (and you should be) there is an extension called ScrapBook that lets you save a website for offline viewing. Keep in mind the importance of respecting people's intellectual property rights when investigating this sort of thing.
:-) |
wget -r
|
cool!
(although something that downloads an entire website in one click would be way cooler). I'll go check that out. Thanks! |
wget is great for this sort of thing. You can also use it to do things like grab all files with a certain extension in a directory on a website, or just mirror an entire site...
man wget :) --Shade |
um...
what does it mean to mirror an entire site? |
To grab an entire site, directory structure and all, and repost it for viewing elsewhere, to take some of the load off the original server.
|
oh ok thanks for the info
I already tried wget using options -r -k -p on a website and found out there are still other directories it didn't download. What's wrong? |
wget -m is for mirroring
Cool |
I also tried wget -m. How come there are other directories within the website that weren't downloaded?
|
Could be a robots.txt file? Sometimes people set one up deliberately to stop others from recursively downloading every directory on their website... or an .htaccess file restricting access to those directories...
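To illustrate, here is a minimal robots.txt of the kind that would cause this (the /private/ path is a made-up example). wget honours robots.txt by default, so any directory it disallows is silently skipped during -r/-m runs:

```shell
# Write a sample robots.txt like a site admin might serve:
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /private/
EOF

# Any path listed under Disallow will be skipped by wget's recursion:
grep '^Disallow' robots.txt

# If you have the site owner's permission to fetch everything anyway,
# robots.txt checking can be turned off:
#   wget -m -e robots=off https://example.com/
```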
Cool |