Linux - Software: This forum is for Software issues.
Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
I am looking for either a script or an application that lets me download all files from Google's cache, as I'd like to archive a site which is currently offline.
Downloading by clicking each "Cached" link by hand is very tedious.
The script must handle search results that span multiple pages, where you would otherwise click "Next" manually.
I only need the "Cached" content returned from Google's cache and nothing else.
It should create local directories when necessary and store the content there. I mention this because I found a Firefox plugin which claimed to be able to scoop up content from Google's cache, but it didn't create the local directories, nor did it produce the correct local filenames.
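The directory-mirroring part is straightforward to sketch. Here is a minimal Python 3 example of the behaviour I mean; the helper names are my own invention, not from any existing tool:

```python
import os
from urllib.parse import urlparse

def url_to_local_path(url, root="archive"):
    """Map a page's original URL to a local file path under `root`,
    using index.html for directory-style URLs (an assumption about
    how the archive should be laid out)."""
    parsed = urlparse(url)
    path = parsed.path.lstrip("/")
    if not path or path.endswith("/"):
        path += "index.html"
    return os.path.join(root, parsed.netloc, path)

def save_page(url, content, root="archive"):
    """Write fetched page content into a directory tree that mirrors
    the URL, creating intermediate directories as needed."""
    local = url_to_local_path(url, root)
    os.makedirs(os.path.dirname(local), exist_ok=True)
    with open(local, "w", encoding="utf-8") as f:
        f.write(content)
    return local
```

So `http://example.com/docs/page.html` would land in `archive/example.com/docs/page.html`, which is exactly what the Firefox plugin failed to do.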
wget won't handle the multi-page search results. I don't think Google has an option to return one very long page instead of paginated results; that would be very handy for scripting :-)
100 results per page is the maximum setting.
I don't think a wget one-liner will do what I want (please correct me if you know of a trick).
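To make the pagination part concrete, here is a rough Python sketch of building the per-page result URLs and the cache URLs. The `start`/`num` query parameters and the `webcache.googleusercontent.com` endpoint are assumptions based on how Google's URLs have historically looked, and Google may well block automated queries, so treat this as an illustration only:

```python
from urllib.parse import quote_plus

def search_page_urls(query, pages, per_page=100):
    """Build result-page URLs for `pages` pages of search hits,
    stepping the (assumed) `start` parameter by `per_page`;
    100 per page is the maximum setting mentioned above."""
    base = "https://www.google.com/search"
    return [
        f"{base}?q={quote_plus(query)}&num={per_page}&start={i * per_page}"
        for i in range(pages)
    ]

def cache_url(page_url):
    """Build the (assumed) webcache URL for one page found in the results."""
    return "https://webcache.googleusercontent.com/search?q=cache:" + page_url
```

A real script would fetch each page URL from `search_page_urls("site:example.com", n)`, extract the hit URLs from the HTML, then fetch `cache_url(hit)` for each and save it with the directory-mirroring logic above.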