Old 10-24-2012, 09:48 PM   #1
zulkifal
LQ Newbie
 
wget command doesn't download RPMs, only index.html


I am trying to download RPM packages using the following command, but I'm not getting anything except index.html:
wget -r http://s3.amazonaws.com/public-repo-.../repos/centos6

The actual URL is http://public-repo-1.hortonworks.com/HDP-1.1.1.16/repos/centos6, but wget doesn't fetch anything from it either, so I visited the URL in a browser and it resolved to the first (S3) URL, which is where the content shows up.

Thanks for your help.
 
Old 10-24-2012, 09:51 PM   #2
cbtshare
Member
 
Does wget work with any other URL?
 
Old 10-24-2012, 09:52 PM   #3
zulkifal
LQ Newbie
 
Original Poster
Yes, I tried it with other sites like archive.cloudera.com and many others without any issues.
 
Old 10-24-2012, 10:15 PM   #4
unSpawn
Moderator
 
Quote:
Originally Posted by zulkifal View Post
I am trying to download rpm packages using the following command but not getting anything except index.html
Their documentation clearly describes fetching the release package or repo file (http://public-repo-1.hortonworks.com...ntos6/hdp.repo), so why not use that? It saves time and effort.
 
Old 10-24-2012, 10:16 PM   #5
cbtshare
Member
 
http://s3.amazonaws.com/public-repo-.../repos/centos6

is not the correct link; running wget on that link will only get you the index.html page.
 
Old 10-25-2012, 09:12 AM   #6
zulkifal
LQ Newbie
 
Original Poster
@unSpawn: I don't have access to the internet, so I am making a local repository.
@cbtshare: That's correct, but if you click the link they have provided in their docs, it goes nowhere, so I am confused.

Thanks
 
Old 10-25-2012, 10:30 AM   #7
unSpawn
Moderator
 
Quote:
Originally Posted by zulkifal View Post
I dont have access to internet so i am making a local repository.
If your goal is to get those packages one way or another, and Yum access is the only interface they provide, then I can't see how that would stop you from installing their repo release package and downloading the packages that way.
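If you do go that way, a rough sketch would be something like the following (the hdp.repo path is assembled from the URLs earlier in this thread and the repo id is a guess, so check the downloaded .repo file for the real id):
$ # fetch the repo definition, mirror the repo with yum-utils, then turn the copy into a local repo
$ wget -O /etc/yum.repos.d/hdp.repo http://public-repo-1.hortonworks.com/HDP-1.1.1.16/repos/centos6/hdp.repo
$ yum install yum-utils createrepo
$ reposync --repoid=HDP-1.1.1.16 --download_path=/var/local/hdp-mirror   # repo id is a guess; see the .repo file
$ createrepo /var/local/hdp-mirror/HDP-1.1.1.16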
 
Old 10-25-2012, 11:50 AM   #8
mmheera
Member
 
Hi Zulkifal, the link you have provided is the repository homepage; it lists the downloads but does not itself contain any direct download link, so it's no surprise that you only get an index.html page. For an actual download, for example the "HDP-1.1.1.16-centos6.tar.gz" file, you need to append the exact file name to the URL. For example, use the following link with wget and see:

http://s3.amazonaws.com/public-repo-...centos6.tar.gz
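(If that S3 link maps to the same path as the Hortonworks URL from your first post, the full command would be roughly the one below; the exact path is an assumption, so adjust it if the real layout differs.)
$ wget http://public-repo-1.hortonworks.com/HDP-1.1.1.16/repos/centos6/HDP-1.1.1.16-centos6.tar.gz   # assumed path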

Thanks!
 
Old 10-25-2012, 12:58 PM   #9
zulkifal
LQ Newbie
 
Original Poster
Thanks a lot, guys. @mmheera: I understand what you mentioned, but with 61 files you can't do this for each and every one unless you spend an hour writing a script for it.
 
Old 10-25-2012, 01:09 PM   #10
mmheera
Member
 
Well, if you were using a browser and downloading the files manually, you would have to do it one by one anyway; I think that's simply how they are offered there. If the whole repository were offered as a single .tar.gz (or any other archive), then of course you could download it with one click. This has nothing to do with wget or with the site in question. The point is that wget needs a file name as an argument and will try to download exactly that; it cannot just grab every file on a page in one shot, any more than you could with a single click in a browser. I hope that makes things clear.
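That said, if you already had a list of the 61 file names, a small shell loop would feed them to wget one at a time. Just a sketch: filelist.txt is a hypothetical file with one name per line, and the base URL is the one from the first post.
$ while read f; do wget "http://public-repo-1.hortonworks.com/HDP-1.1.1.16/repos/centos6/$f"; done < filelist.txt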
 
Old 10-25-2012, 01:31 PM   #11
mmheera
Member
 
Now, here are two options for you to try, quoted from the following guide:

The Ultimate Wget Download Guide With 15 Awesome Examples

Quote:
Download a Full Website Using wget --mirror

Following is the command line you want to execute when you want to download a full website and make it available for local viewing.
$ wget --mirror -p --convert-links -P ./LOCAL-DIR WEBSITE-URL
--mirror : turn on options suitable for mirroring.
-p : download all files that are necessary to properly display a given HTML page.
--convert-links : after the download, convert the links in the documents for local viewing.
-P ./LOCAL-DIR : save all the files and directories to the specified directory.
Quote:
Download Only Certain File Types Using wget -r -A

You can use this in the following situations:
Download all images from a website
Download all videos from a website
Download all PDF files from a website

$ wget -r -A.pdf http://url-to-webpage-with-pdfs/
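Applied to your repository, the two can be combined into one command. This is only a sketch: the directory URL is the one from your first post, and it assumes the packages all end in .rpm.
$ # recurse, don't climb to the parent directory, keep only .rpm files, save everything flat under ./hdp-local
$ wget -r -np -nd -A '*.rpm' -P ./hdp-local http://public-repo-1.hortonworks.com/HDP-1.1.1.16/repos/centos6/
wget still has to fetch the index pages in order to follow the links, but with -A it deletes them again afterwards, so you should end up with only the RPM files.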
Hope it helps!
 
Old 10-26-2012, 09:33 PM   #12
zulkifal
LQ Newbie
 
Original Poster
Thanks a lot, mmheera. I will give it a try and let you know the result.
 
  

