HTTPS and proxy
If I understood correctly, a proxy is useless for an HTTPS site, since every user gets differently encrypted data?
More and more sites seem to use HTTPS, but is that a good option in all cases? Could there be some other method for 'signing' websites, so they are not encrypted, only verified with some hashes or something? If not, unicast seems to be 'enforced', and I think multicast has many advantages over it. Thanks. |
Squid proxies support HTTPS by decrypting and re-encrypting the traffic. The browser must be configured to trust Squid's certificate.
https://wiki.squid-cache.org/Features/SslBump |
But isn't there a way to add a 'digital signature' without encrypting everything?
Like in PGP for email, where we can choose whether to encrypt a message or just sign/verify it. Many sites, such as this forum or news sites, don't really need full encryption. They could sign only the SHA-512 hashes of all files and send just that small amount of data using SSL, or encrypt only the 'upstream' data. Encryption takes CPU cycles and isn't supported on many old systems. |
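To illustrate the idea above, here is a minimal Python sketch of signing content hashes instead of encrypting the transport. It uses stdlib HMAC with a hypothetical pre-shared key as a stand-in for a real public-key signature (PGP-style signing would use RSA/ECDSA, which the stdlib alone can't do); the point is only that the expensive step touches a small digest, not the whole page.

```python
import hashlib
import hmac

# Assumption for this sketch: a pre-shared key stands in for the server's
# private signing key. A real scheme would sign with a public-key algorithm.
SERVER_KEY = b"demo-signing-key"

def sign_content(content: bytes) -> str:
    """Hash the full content, then sign only the small SHA-512 digest."""
    digest = hashlib.sha512(content).digest()
    return hmac.new(SERVER_KEY, digest, hashlib.sha256).hexdigest()

def verify_content(content: bytes, signature: str) -> bool:
    """Recompute the digest and check the signature in constant time."""
    return hmac.compare_digest(sign_content(content), signature)

page = b"<html>public news article, sent in the clear</html>"
sig = sign_content(page)
assert verify_content(page, sig)             # untampered page verifies
assert not verify_content(page + b"!", sig)  # any modification is caught
```

The content itself travels unencrypted and stays cacheable; only the signature needs a trusted channel.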
The certificate for a web site provides trust so that the transport can be encrypted; whether the content itself is trusted is a separate matter. Some sites use dynamic content, and you can't pre-hash or verify dynamic content because it doesn't really exist on disk.
|
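A small Python sketch of the point above about dynamic content: a page rendered on the fly hashes differently on every request, so there is no fixed on-disk hash to publish ahead of time.

```python
import hashlib
import time

def render_dynamic_page() -> bytes:
    # Dynamic content: embeds the current time, so the bytes differ
    # from one request to the next and never exist as a fixed file.
    return f"<html>now: {time.time()}</html>".encode()

h1 = hashlib.sha512(render_dynamic_page()).hexdigest()
time.sleep(0.05)
h2 = hashlib.sha512(render_dynamic_page()).hexdigest()
assert h1 != h2  # each render hashes differently; no precomputed hash fits
```

Any hash-based verification for such content would have to be computed per response, at which point per-request signing costs come back.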
As I understand it, HTTPS is about encryption, not about "signing" or "verifying" anything. That's why a self-signed certificate will still encrypt the data and still "work", although most modern browsers issue warnings because the signer (the Certificate Authority) is not known. Still, a self-signed cert will encrypt the data in transit.
|
Quote:
|
Even when dynamic content is used, most of it probably doesn't change very often.
And the data that takes most of the bandwidth, like images and such, is usually 'static'. |
It might be interesting to design a hybrid protocol which encrypts private data but not things like shared images. That would save CPU overhead and allow caching to work. Each request could decide to do either an HTTP GET, or an HTTPS GET depending on the link. It would require the browser to have two connections, one encrypted and one not, but would not require major protocol changes. An eavesdropper could still see what sites you visit, but could not access your private data.
|
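The hybrid protocol described above can be sketched in a few lines of Python. The path prefixes and the routing rule are assumptions for illustration only: the client keeps two connections and picks a scheme per resource, encrypting private data while leaving shared static assets cacheable by proxies.

```python
# Hypothetical routing rule for a hybrid client: private paths go over
# the encrypted connection, shared static assets over the plain one.
PRIVATE_PATH_PREFIXES = ("/login", "/account", "/api")  # assumed for this sketch

def choose_scheme(path: str) -> str:
    """Return which of the two connections a hybrid client would use."""
    if path.startswith(PRIVATE_PATH_PREFIXES):
        return "https"  # private data: encrypted connection
    return "http"       # shared content: cacheable by intermediate proxies

assert choose_scheme("/login") == "https"
assert choose_scheme("/images/logo.png") == "http"
```

As the post notes, an eavesdropper still sees which sites (and here, which public resources) you fetch, but the private paths stay protected.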
Quote:
I get that there is a need to establish that the site I am hitting is exactly the site I intended, but I'm not sure there is any other way besides the web server certificate that enables HTTPS. Any file-level encryption or signing/validation is going to take a great deal more resources than a simple HTTPS connection. HTTPS is dead easy to implement and does not take very many resources. |