Websnake for Linux?
Hi... does anyone know if there is a way to "snake" websites from within Linux?
I have a public website that I want to move to my Linux box at home. Any opinions on what the easiest way to grab an entire website in Linux might be? -KevinJ
If you mean snake as in leech, rip or DL then wget would be my CLI tool of choice. Search Freshmeat for wget or Google around and you get some alternatives like suck as well.
Nice! I'd never bothered to read the man page for wget:
wget -xr website.com will recursively download all the files on that site, making the directories as it goes, so the end product should be identical to the one on the web. Sweet! I think I have found my new favourite toy!
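For a local copy you can actually browse offline, wget has a few more flags worth combining with the recursive download. A hedged sketch (example.com is a placeholder, not the poster's site):

```shell
# Mirror a site for offline browsing:
#   --mirror           shorthand for -r -N -l inf --no-remove-listing
#   --convert-links    rewrite links so the local copy works offline
#   --page-requisites  also fetch the images/CSS each page needs
#   --no-parent        don't climb above the starting directory
wget --mirror --convert-links --page-requisites --no-parent https://example.com/
```

Without --convert-links the mirrored pages will still point back at the live site, so plain -xr is fine for backup but less handy for offline reading.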