LinuxQuestions.org
Old 03-25-2009, 07:41 PM   #1
JosephS
Member
 
Registered: Jun 2007
Distribution: Debian Jessie, Bunsenlabs
Posts: 586

Rep: Reputation: 38
Want to download web pages so I can open offline


Use Firefox 3.0.7
Use GNU Wget 1.11.4
I have a question about downloading web pages. If I save a page with "Web Page, complete", will I be able to open it without being online, or will there still be some pages that I need to log in for?

If a web browser is not sufficient, is there some command I can use with Wget to accomplish this?

Thanks.
 
Old 03-25-2009, 07:54 PM   #2
lurid
LQ Newbie
 
Registered: May 2007
Location: Pennsylvania, USA
Distribution: Ubuntu Studio 9.10
Posts: 29

Rep: Reputation: 15
If all you want is a page or two, then saving the page as "Web Page, complete" will work fine. You'll have all the HTML and images and such, making the page look (almost) exactly as it did online. The links on the page will still point to their original targets, though, meaning the online pages. So if you download two pages that are supposed to link together, you'll have to manually change the HTML of the links.

If you're looking to download a whole lot of pages from one site or something like that, you'll need something called an "offline reader". I'm not really familiar with them, so I can't offer much advice.
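If you do reach for wget for that single-page case, it can rewrite the links for offline use by itself. A sketch (the URL is a placeholder, and the command is echoed rather than run here, since actually running it needs a network connection):

```shell
# -p (--page-requisites) fetches the images/CSS the page needs,
# -k (--convert-links) rewrites links so the saved copy works offline,
# --html-extension adds .html to saved files where needed.
URL="https://www.example.com/article.html"   # placeholder
CMD="wget --page-requisites --convert-links --html-extension $URL"
echo "$CMD"
```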

Hope that helps!
 
Old 03-25-2009, 08:22 PM   #3
frieza
Senior Member
 
Registered: Feb 2002
Location: harvard, il
Distribution: Ubuntu 11.4,DD-WRT micro plus ssh,lfs-6.6,Fedora 15,Fedora 16
Posts: 3,233

Rep: Reputation: 406
If you don't mind command-line tools, then wget and cURL are options.
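A minimal sketch of each (placeholder URLs; the commands are echoed rather than run, since they need a connection): curl fetches a single URL to a file, while wget can follow links recursively:

```shell
# curl saves exactly one URL; wget -r -l 1 -k grabs the page,
# follows links one level deep, and rewrites them for offline use.
CURL_CMD="curl -o page.html https://www.example.com/"
WGET_CMD="wget -r -l 1 -k https://www.example.com/"
echo "$CURL_CMD"
echo "$WGET_CMD"
```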
 
Old 03-25-2009, 08:54 PM   #4
pixellany
LQ Veteran
 
Registered: Nov 2005
Location: Annapolis, MD
Distribution: Mint
Posts: 17,809

Rep: Reputation: 743
It depends on the web page... if it goes out to different sites to get things, then it will still need a connection for that. I am sure there are some sites where it would be essentially impossible to download everything so that you would not need a connection.

Read the manual on wget -- it has many options, most of which I have not even begun to tackle.
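For reference, a few of the options that matter most for making an offline copy (a sketch with a placeholder URL; echoed rather than run, since it needs a connection):

```shell
# -r recurse, -l 2 go at most two levels deep, -k rewrite links for
# offline use, -p fetch the images/CSS each page needs, -np never
# ascend to the parent directory, --wait=1 pause between requests.
CMD="wget -r -l 2 -k -p -np --wait=1 https://www.example.com/docs/"
echo "$CMD"
```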
 
Old 03-25-2009, 10:25 PM   #5
grizly
Member
 
Registered: Nov 2006
Location: Melbourne Australia
Distribution: Centos, RHEL, Debian, Ubuntu, Mint
Posts: 128

Rep: Reputation: 16
oh, wget is the tool, and a fun one! (many a gig of bandwidth wasted while learning that gem!)


Code:
wget --mirror --convert-links --html-extension http://www.gnu.org/
Where http://www.gnu.org/ is the URL of the website. It will download the site (the whole site, so watch it) into the current directory, named: www.gnu.org/

Read that manual though, because there are literally thousands of possible configurations with all the options available!
http://www.gnu.org/software/wget/manual/wget.html
 
Old 03-26-2009, 11:18 AM   #6
H_TeXMeX_H
LQ Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Rep: Reputation: 1301
One of the coolest scripts I've used is curlmirror, which mirrors a website using curl and is written in Perl:

http://curl.haxx.se/programs/curlmirror.txt
 
  

