wget command doesn't download rpms, only index.html
@unSpawn: I don't have access to the internet, so I am making a local repository.
@cbtshare: That's correct, but if you click the link they provide in their docs it goes nowhere, so I am confused.
I don't have access to the internet, so I am making a local repository.
If your goal is to get those packages one way or another, and Yum access is the only interface they provide, then I can't see how that would stop you from installing their repo release package and downloading the packages that way.
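Since the goal is a local repository on a machine without internet access, a minimal sketch of that route (run on a machine that does have access) might use `reposync` from yum-utils plus `createrepo`. The package and repo names below are hypothetical placeholders, not taken from the vendor's docs:

```shell
# Sketch only -- release package name and repo id are placeholders.
# 1. Install the vendor's repo release package (defines the yum repo):
#      rpm -ivh vendor-release.rpm
# 2. Mirror every package in that repo into a local directory:
#      reposync --repoid=HDP -p /var/local/repo
# 3. Generate yum metadata so the directory works as a repository:
#      createrepo /var/local/repo/HDP
SYNC_CMD="reposync --repoid=HDP -p /var/local/repo"
echo "$SYNC_CMD"
```

The resulting directory can then be copied to the offline box and referenced with a `file://` baseurl in a `.repo` file.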
Hi Zulkifal, the link you provided is the repository homepage, which lists the downloads but contains no direct download links. That is why you are getting only an index.html page. For an actual download, for example the "HDP-184.108.40.206-centos6.tar.gz" file, you need to append the exact file name to the end of the URL and pass that full link to wget.
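For instance (the host below is a placeholder, since the real repository URL isn't preserved in the thread; only the file name comes from the post above):

```shell
# Fetching the bare page URL is what yields index.html instead of the
# package; append the exact file name to get the file itself.
BASE_URL="http://repo.example.com/HDP/centos6"   # placeholder host
FILE="HDP-184.108.40.206-centos6.tar.gz"         # file name from the listing
# Actual download (needs network access):
#   wget "${BASE_URL}/${FILE}"
echo "${BASE_URL}/${FILE}"
```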
Well, if you consider downloading the files manually with a browser, you would have to do it one by one anyway; that is how the site offers them. If the whole repository were offered as a single .tar.gz (or any other) archive, you could of course grab it with one click. This has nothing to do with wget or the site in question. The point is that wget needs a file name as an argument, and that is what it will try to download. It cannot download all the files on a page in one shot, just as you cannot do that with a single click in a browser. I hope that makes things clear.
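One way to ease the one-by-one downloading is wget's `-i` option, which reads URLs from a file, one per line. A small sketch, with placeholder hosts and file names:

```shell
# Build a list of full file URLs, one per line:
cat > urls.txt <<'EOF'
http://repo.example.com/HDP/centos6/package-1.rpm
http://repo.example.com/HDP/centos6/package-2.rpm
EOF
# A single wget run then fetches every listed file (needs network access):
#   wget -i urls.txt
wc -l < urls.txt
```

You still have to list each file once, but the downloads themselves run unattended.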
The following is the command to execute when you want to download a full website and make it available for local viewing.
$ wget --mirror -p --convert-links -P ./LOCAL-DIR WEBSITE-URL
--mirror : turn on options suitable for mirroring.
-p : download all files that are necessary to properly display a given HTML page.
--convert-links : after the download, convert the links in the documents for local viewing.
-P ./LOCAL-DIR : save all the files and directories to the specified directory.
Download Only Certain File Types Using wget -r -A
You can use this in the following situations:
Download all images from a website
Download all videos from a website
Download all PDF files from a website
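Applied to this thread's case (.rpm files), a sketch of the `-r -A` form might look like the following; the URL is a placeholder:

```shell
# -r          recurse through the links on the page
# -np         do not ascend to the parent directory
# -A '*.rpm'  accept only files ending in .rpm (other files are discarded
#             after the crawl)
CMD="wget -r -np -A '*.rpm' http://repo.example.com/HDP/centos6/"
# Run it for real (needs network access):
#   eval "$CMD"
echo "$CMD"
```

Swap `'*.rpm'` for `'*.jpg,*.png'`, `'*.mp4'`, or `'*.pdf'` to cover the image, video, and PDF cases above.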