How to get a website offline
I want to view a whole website offline: not just the home page, but the entire site, so that I can follow the links too and see it as it was when I ripped it (the offline copy doesn't have to stay up to date). How can I do this? I knew how to do it with Internet Explorer, but Firefox doesn't have this option in Bookmarks.
|
Well, the fastest way is to drop down to a terminal. Look at the man page for a utility called wget. In short, this is how I'd mirror a web site:
wget -m -k http://www.slackware.com
Instead of -m you can use -r -l N, where N is the recursion depth. |
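Sketching that command a bit more fully; the -p and -np flags here are my additions beyond the suggestion above, and the URL is just the same example site:

```shell
# -m  : mirror mode (recursion with timestamping, unlimited depth)
# -k  : convert links so the local copy is browsable offline
# -p  : also fetch the images/CSS each page needs to render
# -np : never ascend above the starting directory
URL="http://www.slackware.com"
CMD="wget -m -k -p -np $URL"
# Run the command in a terminal; wget saves the pages under a
# directory named after the host (here ./www.slackware.com/),
# inside whatever directory you run it from.
echo "$CMD"
```
|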
Can I view the saved pages in Firefox like a website, as if I were online?
|
If it's a dynamic website, make sure you use that option to set the depth.
I once pulled a dynamic site that had a calendar on it... if you can tell me when a calendar ends, I'll tell you how deep that thing was trying to drill... ugh |
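A minimal sketch of capping the depth, assuming two levels is enough; the URL and depth value are placeholders:

```shell
# -r -l 2 : recurse, but stop two links deep, so a dynamic page
#           (like an endless calendar) can't trap the crawl forever
# -k      : convert links for offline viewing
DEPTH=2
URL="http://www.example.com"
CMD="wget -r -l $DEPTH -k $URL"
echo "$CMD"
```
|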
I'm trying to download a DVD store website so I can look at their products offline. How would I go about doing this?
|
I've been using HTTrack to rip this site.
Where does wget save the pages to? |
If I only want to save a specific part of a website, like everything under www.blahblah.com/onlythis but not the rest of the site, how can I do that? I think ripping the whole site might take up too much space and time.
|
|