Linux - Software This forum is for Software issues.
Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
11-29-2004, 07:35 PM  #1
LQ Guru
Registered: Oct 2004
Distribution: Arch
Posts: 5,415
Capturing a complete web site
What is the name of a Linux application that will download an entire web site complete: the images, links, video/audio streams, text, etc.? It would be nice if it could also specify the depth of pages to download, starting from the page you're on.
Thanks.
11-29-2004, 08:21 PM  #2
Senior Member
Registered: May 2004
Location: Belgium
Distribution: Debian, Slackware, Fedora
Posts: 1,465
Try an illegal warez forum.
11-29-2004, 08:58 PM  #3
LQ Guru
Registered: Oct 2004
Distribution: Arch
Posts: 5,415
Original Poster
No, not for pirating other people's work or ripping MP3s from streams, or anything like that.
There are lots of Windows apps that download web pages 3-4 levels deep from a site's home page for offline viewing. It is handy for a person with limited internet access: you download the pages, or the whole site, and then look at them offline. With the pages stored on your hard drive you can browse the site and follow its links offline. It is much quicker than opening every page and then pulling a copy out of your cache, and you can also see or hear the streams on the pages.
11-29-2004, 09:02 PM  #4
Member
Registered: Jun 2004
Location: Florida
Distribution: Gentoo
Posts: 148
Try using wget, or variations of it. Don't forget to read the man page for wget.
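The exact command from the post did not survive in this archive, so here is a sketch of a typical wget invocation for the kind of mirroring the thread describes; `example.com` and the depth of 3 are placeholder assumptions, not from the original post. The command is built as a string and printed rather than executed, since the target URL is only illustrative.

```shell
# A sketch of a wget mirror command (example.com is a placeholder):
#   --recursive        follow links into the site
#   --level=3          stop three pages deep from the starting page
#   --page-requisites  also fetch images, CSS, and other embedded files
#   --convert-links    rewrite links so they work when browsed offline
#   --no-parent        never ascend above the starting directory
cmd='wget --recursive --level=3 --page-requisites --convert-links --no-parent https://example.com/'
echo "$cmd"
```

Run against a real URL, this leaves a browsable copy of the site under a directory named after the host, with links rewritten to point at the local files.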
11-29-2004, 09:10 PM  #5
LQ Guru
Registered: Jan 2003
Location: Seymour, Indiana
Distribution: RHEL 5 with pieces of this and that. Kernel 2.6.23.1, KDE 3.5.8 and KDE 4.0 beta
Posts: 5,700
You can also try WebHTTrack.
Brian1
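WebHTTrack is a browser front-end to the HTTrack engine, which can also be driven from the command line. A sketch of an equivalent invocation, with `example.com`, the `./mirror` output directory, and the depth of 3 all placeholder assumptions; as above, the command is printed rather than run against a live site.

```shell
# A sketch of an HTTrack mirror command (example.com is a placeholder):
#   -O ./mirror          write the copy into the ./mirror directory
#   -r3                  recursion depth of 3 links from the start page
#   "+*.example.com/*"   filter pattern: stay on the target site
cmd='httrack https://example.com/ -O ./mirror -r3 "+*.example.com/*"'
echo "$cmd"
```

Unlike wget, HTTrack keeps a project cache in the output directory, so the same command can later be re-run to update the mirror.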
11-29-2004, 09:36 PM  #6
LQ Guru
Registered: Oct 2004
Distribution: Arch
Posts: 5,415
Original Poster
Thanks a lot for the answers.