LinuxQuestions.org
Linux - General This Linux forum is for general Linux questions and discussion.
If it is Linux Related and doesn't seem to fit in any other forum then this is the place.

Old 07-04-2012, 12:38 PM   #1
AKviking
LQ Newbie
 
Registered: Jun 2009
Location: Palmer, AK
Distribution: Ubuntu, Debian
Posts: 21

Rep: Reputation: 0
Question: using wget to authenticate to an SSL web page


I have a web page that a script logs on to at midnight to extract information for a database.

The maintainers of the page have upgraded and (good for them) made it more secure with SSL. However, now I'm trying to rewrite my script, and am unable to get past the SSL. The closest I get is a message that it cannot verify the certificate, so I know it's trying.


Code:
wget --secure-protocol=SSLv2 --http-user=USERNAME --http-password=PASSWORD https://usageinfo.website.com --no-check-certificate
This returns me to the logon screen instead of logging me in. If I remove the "--no-check-certificate" option, I get:

Code:
https://usageinfo.website.com/
Resolving usageinfo.website.com... xxx.xxx.xxx.xxx
Connecting to usageinfo.website.com|xxx.xxx.xxx.xxx|:443... connected.
ERROR: cannot verify usageinfo.website.com's certificate, issued by "/C=US/O=VeriSign, Inc./OU=VeriSign Trust Network/OU=Terms of use at https://www.verisign.com/rpa (c)10/CN=VeriSign Class 3 Secure Server CA - G3"
                Unable to locally verify the issuer's authority.
To connect to usageinfo.website.com insecurely, use "--no-check-certificate".
 
Old 07-04-2012, 10:04 PM   #2
MisterBark
Member
 
Registered: Jul 2012
Location: Republic of Krakozhia
Distribution: Slackware & Zenwalk core + compile
Posts: 104

Rep: Reputation: 6
First, I would put --no-check-certificate before the URL, but I don't think that's the problem.

- Are you sure they didn't change anything else on the server?
- What is the HTTP status code returned?
- Maybe you need to use Basic authentication (encode "login:password" in base64)
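
For example, the Basic auth header can be built by hand and passed explicitly (USERNAME:PASSWORD below is a placeholder pair):

```shell
# Encode "user:password" for an HTTP Basic "Authorization" header.
# printf avoids the trailing newline that echo would sneak into the encoding.
TOKEN=$(printf '%s' 'USERNAME:PASSWORD' | base64)
echo "$TOKEN"    # VVNFUk5BTUU6UEFTU1dPUkQ=

# Then send it yourself instead of relying on --http-user/--http-password:
# wget --header="Authorization: Basic $TOKEN" https://usageinfo.website.com/
```

This also makes it easy to check with a packet capture or server log whether the header is actually being sent.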

Last edited by MisterBark; 07-04-2012 at 10:07 PM.
 
Old 07-05-2012, 10:29 AM   #3
AKviking
LQ Newbie
 
Registered: Jun 2009
Location: Palmer, AK
Distribution: Ubuntu, Debian
Posts: 21

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by MisterBark View Post
First, I would put --no-check-certificate before the URL, but I don't think that's the problem.
Tried that, but no change.

Quote:
- Are you sure they didn't change anything else on the server?
I have no idea, and I don't know how to find out.

Quote:
- What is the HTTP status code returned?
302. When it fails on login, it just returns the main page, so it'll always be 302.

Quote:
- Maybe you need to use Basic authentication (encode "login:password" in base64)
That did not work.

It appears as though I'm unable to verify the VeriSign CA certificate. There are options in wget to specify a file or directory of CA certificates to check against, but searching online, I haven't found how to obtain such files.
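
As a local illustration of what wget is complaining about: verification succeeds only when the issuer's certificate is available in a file. The toy example below builds its own CA (standing in for the VeriSign intermediate) and verifies a cert against it; in practice the real issuer cert would go in the file passed to wget's --ca-certificate option, or come from the distro's ca-certificates package.

```shell
# Toy CA, standing in for the intermediate CA wget cannot find locally
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.pem \
    -days 1 -subj "/CN=Toy CA" 2>/dev/null

# Server key + CSR, then a cert signed by the toy CA
openssl req -newkey rsa:2048 -nodes -keyout srv.key -out srv.csr \
    -subj "/CN=usageinfo.website.com" 2>/dev/null
openssl x509 -req -in srv.csr -CA ca.pem -CAkey ca.key -CAcreateserial \
    -out srv.pem -days 1 2>/dev/null

# Verification only works once the issuer's cert is supplied:
openssl verify -CAfile ca.pem srv.pem    # srv.pem: OK
```

With the real issuer certificate saved as ca.pem, the equivalent wget call would be `wget --ca-certificate=ca.pem https://...`.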
 
Old 07-05-2012, 10:48 AM   #4
MisterBark
Member
 
Registered: Jul 2012
Location: Republic of Krakozhia
Distribution: Slackware & Zenwalk core + compile
Posts: 104

Rep: Reputation: 6
For the certificate try curl
 
Old 07-06-2012, 10:27 AM   #5
AKviking
LQ Newbie
 
Registered: Jun 2009
Location: Palmer, AK
Distribution: Ubuntu, Debian
Posts: 21

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by MisterBark View Post
For the certificate try curl
I've gone down that road as well. I suspect I may be missing something, but I get similar results as with wget.

Code:
curl -uUSERNAME:PASSWORD https://web.domain.com -1 --cert-type DER --cacert /etc/ssl/certs/website.cer

curl: (77) error setting certificate verify locations:
  CAfile: /etc/ssl/certs/website.cer
  CApath: /etc/ssl/certs
I've saved the cert from my web browser, but I'm unsure whether its format is correct. Still playing with that.
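
One thing worth checking: browsers often save certs in DER (binary) format, while curl's --cacert expects PEM. openssl can convert between the two. The round trip below uses a throwaway self-signed cert, since the real website.cer isn't at hand:

```shell
# Throwaway self-signed cert, standing in for the one saved from the browser
openssl req -x509 -newkey rsa:2048 -nodes -keyout demo.key -out demo.pem \
    -days 1 -subj "/CN=demo" 2>/dev/null

# What a browser's "save as DER" would produce
openssl x509 -in demo.pem -outform DER -out website.cer

# The fix: convert DER to PEM so --cacert can read it
openssl x509 -inform DER -in website.cer -outform PEM -out website.pem

# PEM files are plain text and start with this line:
head -n 1 website.pem    # -----BEGIN CERTIFICATE-----
```

If `head -n 1` on the saved file shows binary garbage instead of a BEGIN CERTIFICATE line, it's DER and needs this conversion before curl can use it.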

Thanks for your suggestions; they're keeping me motivated.
 
Old 07-08-2012, 01:50 AM   #6
AKviking
LQ Newbie
 
Registered: Jun 2009
Location: Palmer, AK
Distribution: Ubuntu, Debian
Posts: 21

Original Poster
Rep: Reputation: 0
Update: I believe I've got the proper certificate.

I used openssl to obtain the cert, then ran it through openssl again to verify, receiving an OK status.

So now it appears that my --post-data "UserName=user&Password=password" does not work. I'm thinking that's because the site now requires encryption, so perhaps a different approach is needed?

I'm still trying with both wget & curl to see which one wins.
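
If the login is a web form rather than HTTP auth, the POST usually has to be paired with a cookie jar so the session survives into the next request. A sketch with wget; the host is the placeholder from above, and the login path and field names are guesses, so the real ones have to be read out of the form's HTML (the guard keeps the commands from aborting on the placeholder host):

```shell
# Placeholder host; the login URL and field names below are hypothetical.
BASE=https://usageinfo.website.com

# Step 1: POST the credentials and keep any session cookie the site sets.
wget --tries=1 --timeout=5 \
     --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'UserName=user&Password=password' \
     "$BASE/login" -O login_result.html || echo "login fetch failed (placeholder host)"

# Step 2: reuse the cookie jar to request the protected page.
wget --tries=1 --timeout=5 --load-cookies cookies.txt \
     "$BASE/usage" -O usage.html || echo "usage fetch failed (placeholder host)"
```

curl does the same with `-c cookies.txt` (write) and `-b cookies.txt` (read) plus `-d` for the form data.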
 
  

