Linux - Software: This forum is for Software issues.
Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
09-16-2005, 08:16 AM | #1
Senior Member
Registered: Jan 2003
Location: Aachen
Distribution: Opensuse 11.2 (nice and steady)
Posts: 2,203
Download web site... (offline browsing)
Hi, I want a program to download a web site for offline browsing. I have tried wget, but it downloads the .php files without altering them, which makes offline browsing impossible: every time I follow a link, Konqueror prompts me to ask how the file should be opened.
Do you have anything in mind to suggest?
09-16-2005, 08:40 AM | #2
Member
Registered: Mar 2005
Location: UK
Distribution: opensuse 12.2 x86_64
Posts: 563
You can use the MAF extension in Firefox to save entire pages in a zipped MAF (Mozilla Archive Format) file or in MHT format. Is that the sort of thing you need?
09-16-2005, 09:00 AM | #3
Senior Member
Registered: Jul 2004
Location: France
Distribution: Arch Linux
Posts: 1,897
I don't really understand. Once you are offline, you can't follow links unless you have downloaded the targets too.
So either you want to download a single page, in which case I find rjwilmsi's advice very good...
Or you really want to browse a whole single web site offline (you obviously can't download the whole net), in which case wget is perfect! Just give it the right options and it will rewrite all the links so that they are local.
Yves.
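For reference, the kind of invocation Yves is hinting at would look something like this (an illustrative sketch with a placeholder URL, not a command from this thread):

wget --mirror --page-requisites --convert-links --html-extension --no-parent http://www.example.com/

Here --convert-links rewrites the downloaded pages so their links point at the local copies, and --html-extension (-E) saves server-generated pages such as .php with a .html suffix so the browser knows how to open them.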
09-16-2005, 10:59 AM | #4
Senior Member
Registered: Jan 2003
Location: Aachen
Distribution: Opensuse 11.2 (nice and steady)
Posts: 2,203
Original Poster
Quote:
Or you really want to browse a whole single web site offline (you obviously can't download the whole net), in which case wget is perfect! Just give it the right options and it will rewrite all the links so that they are local.

Yeah, I need offline browsing.
I have tried
wget -r http://mysite.gr -X http://mysite.gr/forum
but it only downloaded some of the files (a lot of the pics and banners were skipped), and the result can't be browsed: when I click on a link, the browser doesn't know how to handle a file with a .php extension.
09-16-2005, 02:53 PM | #5
Senior Member
Registered: Jul 2004
Location: France
Distribution: Arch Linux
Posts: 1,897
You'll probably be interested in those options:
wget -nc -w 1 -x -E -r -l inf -k -p -np
Yves.
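For anyone wondering what those short flags do, here is a gloss based on the wget man page (the URL is just a placeholder):

wget -nc -w 1 -x -E -r -l inf -k -p -np http://www.example.com/
# -nc     --no-clobber: skip files that have already been downloaded
# -w 1    wait one second between requests, to go easy on the server
# -x      --force-directories: recreate the site's directory structure on disk
# -E      --html-extension: save server-generated pages (e.g. .php) with a .html suffix
# -r      --recursive: follow links and download recursively
# -l inf  place no limit on the recursion depth
# -k      --convert-links: rewrite links so they point to the local copies
# -p      --page-requisites: also fetch the images, CSS, etc. needed to render each page
# -np     --no-parent: never ascend above the starting directory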
09-16-2005, 03:02 PM | #6
Senior Member
Registered: Jan 2003
Location: Aachen
Distribution: Opensuse 11.2 (nice and steady)
Posts: 2,203
Original Poster
Thanks... but how can I use all of these?
09-18-2005, 09:48 AM | #7
Senior Member
Registered: Jul 2004
Location: France
Distribution: Arch Linux
Posts: 1,897
I got all of this from the man page. Usage would probably be something like this:
wget -nc -w 1 -x -E -r -l inf -k -p -np -X http://mysite.gr/forum http://mysite.gr
Please report if it works... or not
Yves.
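One caveat from the wget man page: -X (--exclude-directories) takes a comma-separated list of directory paths rather than full URLs, so excluding the forum would look more like the following (an untested variation on the command above):

wget -nc -w 1 -x -E -r -l inf -k -p -np -X /forum http://mysite.gr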
09-18-2005, 10:56 AM | #8
LQ Newbie
Registered: Sep 2005
Location: Norway
Distribution: PCLinuxOS/Puppy
Posts: 18
I think you might be looking for something like HTTrack. You'll find it at http://www.httrack.com and it's also freely available in the Debian repositories.
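For comparison, HTTrack's documented command-line form looks roughly like this (a sketch reusing the host from earlier in the thread; the output directory and filters are only examples):

httrack "http://mysite.gr/" -O "./mysite-mirror" "+*mysite.gr/*" -v

-O sets the local output directory, the "+..." pattern restricts the mirror to that host, and a "-*mysite.gr/forum/*" filter could be added to skip the forum, similar to wget's -X option.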
10-05-2005, 02:47 PM | #9
LQ Newbie
Registered: Dec 2003
Location: Poland/Lodz
Distribution: Kubuntu
Posts: 2
Hello!
I've tried both methods described here, wget and HTTrack, and unfortunately wget downloaded only the text files (htm, txt, css) without the images. HTTrack, on the other hand, works perfectly!
Kris
10-11-2005, 05:39 AM | #10
Senior Member
Registered: Jan 2003
Location: Aachen
Distribution: Opensuse 11.2 (nice and steady)
Posts: 2,203
Original Poster
I agree with you... I don't think there is any better program for downloading web sites...