Reposync fails when trying to download packages from Red Hat
I am using a RHEL 5 x86_64 server to download the "rhel-x86_64-server-5" repository from Red Hat using reposync, and it keeps failing with errors. I have tried it multiple times; each run downloads a few more packages and then fails again. I have pasted the error I got below. Please help.
Traceback (most recent call last):
  File "/usr/bin/reposync", line 291, in ?
    main()
  File "/usr/bin/reposync", line 268, in main
    path = repo.getPackage(pkg)
  File "/usr/lib/python2.4/site-packages/yum/yumRepo.py", line 853, in getPackage
    cache=cache
  File "/usr/lib/yum-plugins/rhnplugin.py", line 311, in _getFile
    start, end, copy_local, checkfunc, text, reget, cache)
  File "/usr/lib/yum-plugins/rhnplugin.py", line 404, in _noExceptionWrappingGet
    timeout=self.timeout
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 936, in urlgrab
    return self._retry(opts, retryfunc, url, filename)
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 854, in _retry
    r = apply(func, (opts,) + args, {})
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 922, in retryfunc
    fo = URLGrabberFileObject(url, filename, opts)
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 1010, in __init__
    self._do_open()
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 1093, in _do_open
    fo, hdr = self._make_request(req, opener)
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 1202, in _make_request
    fo = opener.open(req)
  File "/usr/lib64/python2.4/urllib2.py", line 358, in open
    response = self._open(req, data)
  File "/usr/lib64/python2.4/urllib2.py", line 376, in _open
    '_open', req)
  File "/usr/lib64/python2.4/urllib2.py", line 337, in _call_chain
    result = func(*args)
  File "/usr/lib64/python2.4/site-packages/M2Crypto/m2urllib2.py", line 82, in https_open
    h.request(req.get_method(), req.get_selector(), req.data, headers)
  File "/usr/lib64/python2.4/httplib.py", line 813, in request
    if v[0] != 32 or not self.auto_open:
IndexError: tuple index out of range
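
Reading the traceback: the crash happens inside httplib's broken-pipe recovery code. Line 813 of httplib.py compares v[0] against errno 32 (EPIPE), but the socket.error raised up through M2Crypto carries an empty args tuple, so indexing it fails with IndexError before httplib can report the real network error. In other words, the SSL connection to RHN is being dropped mid-transfer, which often points at a proxy or network device cutting off long downloads. Since reposync skips packages that already exist in the download directory, one workaround is to simply re-run it until it completes. A minimal sketch, assuming the channel label above; /var/repos is a placeholder for your own download directory:

# Re-run reposync until it succeeds; packages already on disk are skipped,
# so each pass picks up where the previous one died.
# (-l loads the yum plugins so rhnplugin can see the RHN channel;
#  -p is the local download directory, a placeholder here.)
until reposync -l -r rhel-x86_64-server-5 -p /var/repos; do
    echo "reposync failed, retrying in 60 seconds..." >&2
    sleep 60
done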
I have opened a case with Red Hat; they have no clue either. They suggested that there could be a problem with my proxy server, but I checked it and found nothing wrong.
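
One thing worth double-checking on the proxy front: because the download goes through rhnplugin, the proxy settings that matter are the RHN client's (/etc/sysconfig/rhn/up2date), not yum.conf or the http_proxy environment variable. A quick sanity check, with proxy.example.com:3128 standing in for your real proxy host and port:

# Show the proxy settings rhnplugin will actually use
# (enableProxy/httpProxy/proxyUser are the usual keys):
grep -E '^(enableProxy|httpProxy|proxyUser)' /etc/sysconfig/rhn/up2date

# Confirm HTTPS through the proxy reaches the RHN servers at all:
curl -x http://proxy.example.com:3128 -I https://xmlrpc.rhn.redhat.com/XMLRPC

If transfers through the proxy also stall or reset outside of yum, the problem is in the proxy or network path rather than in reposync itself.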
Thanks for the help, I appreciate your time. This is a very peculiar problem; I have done this before when I was maintaining depot servers, and I never ran into these issues.