Using wget to authenticate to an SSL web page
I have a web page that a script logs on to at midnight to extract information and put it into a database.
The maintainers of the page have upgraded and (good for them) made it more secure with SSL. However, now I'm trying to rewrite my script, and am unable to get past the SSL. The closest I get is a message that it cannot verify the certificate, so I know it's trying. Code:
wget --secure-protocol=SSLv2 --http-user=USERNAME --http-password=PASSWORD https://usageinfo.website.com --no-check-certificate
First, I would put --no-check-certificate before the URL, but I don't think that is the problem.
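For reference, a minimal sketch of the same call with that option moved in front of the URL (USERNAME, PASSWORD, and the URL are the placeholders from the original post):
Code:
wget --no-check-certificate --http-user=USERNAME --http-password=PASSWORD https://usageinfo.website.com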
- Are you sure they didn't change anything else on the server?
- What is the HTTP code returned?
- Maybe you need to use Basic authentication (encode "login:password" in base64; see the sketch below).
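If the server actually wants an explicit Authorization header rather than wget's --http-user handling, a rough sketch of the base64 approach (credentials and URL are the placeholders from this thread):
Code:
# Encode the credentials; printf avoids a trailing newline in the token
TOKEN=$(printf 'USERNAME:PASSWORD' | base64)
# Send the Basic auth header explicitly
wget --no-check-certificate --header="Authorization: Basic $TOKEN" https://usageinfo.website.com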
It appears as though I'm unable to verify the Verisign CA certificate. There are options in wget to specify a certificate file or directory to check against, but searching online, I've not found how to obtain such a file that I could reference.
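In case it helps, one common way to pull the certificate the server presents, so it can be handed to wget's --ca-certificate option, is openssl s_client; a sketch, assuming the host name from the original post:
Code:
# Grab the certificate chain presented by the server and keep only the PEM blocks
echo | openssl s_client -connect usageinfo.website.com:443 -showcerts 2>/dev/null \
  | sed -n '/BEGIN CERTIFICATE/,/END CERTIFICATE/p' > website.pem
# Point wget at the saved certificate instead of skipping verification
wget --ca-certificate=website.pem --http-user=USERNAME --http-password=PASSWORD https://usageinfo.website.com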
For the certificate, try curl.
Code:
curl -uUSERNAME:PASSWORD https://web.domain.com -1 --cert-type DER --cacert /etc/ssl/certs/website.cer
Thanks for your suggestions, it's keeping me motivated.
Update: I believe I've got the proper certificate.
I used openssl to obtain the cert, and then ran it through openssl again to verify, receiving an OK status. So now it appears as though my --post-data "UserName=user&Password=password" does not work, and I'm thinking it's because the login now requires encryption, so perhaps a different approach is needed? I'm still trying with both wget & curl to see which one wins.
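If the login is really a form POST that hands back a session cookie, one thing worth trying is saving and reusing cookies; a rough sketch, where the /login and /usage paths are guesses and the form field names come from the --post-data string above:
Code:
# Log in over HTTPS and keep the session cookie (login path is an assumption)
wget --ca-certificate=website.pem --keep-session-cookies --save-cookies=cookies.txt \
     --post-data='UserName=user&Password=password' \
     'https://usageinfo.website.com/login' -O /dev/null
# Reuse the saved cookie to fetch the page you actually want (path is an assumption)
wget --ca-certificate=website.pem --load-cookies=cookies.txt \
     'https://usageinfo.website.com/usage' -O usage.html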