Old 04-18-2019, 07:43 AM   #1
qrange
Senior Member
 
Registered: Jul 2006
Location: Belgrade, Yugoslavia
Distribution: Debian stable/testing, amd64
Posts: 1,061

Rep: Reputation: 47
https and proxy


If I understood correctly, a caching proxy is useless for an HTTPS site, since every user gets differently encrypted data?

More and more sites seem to use HTTPS, but is it a good option in all cases?

Could there be some other method for 'signing' websites, so that pages are not encrypted, only verified with hashes or something similar?

If not, unicast seems to be 'enforced'; I think multicast has many advantages over it.

Thanks.
 
Old 04-19-2019, 10:00 AM   #2
smallpond
Senior Member
 
Registered: Feb 2011
Location: Massachusetts, USA
Distribution: Fedora
Posts: 4,140

Rep: Reputation: 1263
Squid proxies support HTTPS by decrypting and re-encrypting the traffic. The browser must be configured to trust Squid's certificate.

https://wiki.squid-cache.org/Features/SslBump
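
A minimal squid.conf sketch of SslBump, assuming a locally generated CA at /etc/squid/ca.pem that every client imports as trusted; the certificate-generator helper's name and path vary by Squid version and distro:

Code:
# Re-sign server certificates with a local CA so Squid can decrypt HTTPS.
http_port 3128 ssl-bump cert=/etc/squid/ca.pem generate-host-certificates=on dynamic_cert_mem_cache_size=4MB

# Helper that mints per-host certificates on the fly.
sslcrtd_program /usr/lib/squid/security_file_certgen -s /var/lib/squid/ssl_db -M 4MB

acl step1 at_step SslBump1
ssl_bump peek step1   # read the client's SNI first
ssl_bump bump all     # then decrypt and re-encrypt everything else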
 
1 member found this post helpful.
Old 04-19-2019, 10:19 AM   #3
qrange
Senior Member
 
Registered: Jul 2006
Location: Belgrade, Yugoslavia
Distribution: Debian stable/testing, amd64
Posts: 1,061

Original Poster
Rep: Reputation: 47
But isn't there a way to add a 'digital signature' without encrypting everything?
Like PGP for email, where we can choose whether to encrypt a message or just sign/verify it.

Many sites, such as this forum or news sites, don't really need full encryption.
They could send just the SHA-512 hashes of all files over SSL, so only that small amount of data is encrypted?
Or encrypt only the 'upstream' data?

Encryption takes some CPU cycles and isn't supported on many old systems.
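
Purely as a sketch of that idea in Python: the bulk file travels over plain HTTP (cacheable by any proxy), and only its expected SHA-512 digest comes over TLS. The /hashes/ endpoint is invented for illustration; no real site serves one.

Code:
import hashlib
import urllib.request

def fetch_verified(host: str, path: str) -> bytes:
    # Bulk data over plain HTTP: any proxy can cache it, anyone can read it.
    body = urllib.request.urlopen(f"http://{host}{path}").read()
    # Only the small digest travels encrypted (hypothetical endpoint).
    with urllib.request.urlopen(f"https://{host}/hashes{path}") as resp:
        expected = resp.read().decode().strip()
    if hashlib.sha512(body).hexdigest() != expected:
        raise ValueError(f"hash mismatch for {path}: possible tampering")
    return body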
 
Old 04-19-2019, 10:27 AM   #4
sevendogsbsd
Senior Member
 
Registered: Sep 2017
Distribution: FreeBSD
Posts: 2,252

Rep: Reputation: 1011
The certificate for a web site is there to provide trust and so that the transport can be encrypted; whether the content itself is trusted is irrelevant. Some sites use dynamic content, and you can't hash or verify dynamic content ahead of time because it doesn't really exist on disk.
 
Old 04-19-2019, 03:51 PM   #5
scasey
LQ Veteran
 
Registered: Feb 2013
Location: Tucson, AZ, USA
Distribution: CentOS 7.9.2009
Posts: 5,727

Rep: Reputation: 2211
As I understand it, HTTPS is about encryption, not about "signing" or "verifying" anything. That's why a self-signed certificate will still encrypt the data and still "work", although most modern browsers issue warnings because the signer (the Certificate Authority) is not known.
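
For instance, Python's ssl module will happily encrypt a session with a self-signed server once verification is switched off (the host name below is a placeholder):

Code:
import socket
import ssl

ctx = ssl.create_default_context()
ctx.check_hostname = False        # don't match the host name
ctx.verify_mode = ssl.CERT_NONE   # don't validate the CA chain

# The channel is still fully encrypted, just not authenticated.
with socket.create_connection(("self-signed.example", 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname="self-signed.example") as tls:
        print(tls.version(), tls.cipher())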
 
Old 04-20-2019, 07:16 AM   #6
sevendogsbsd
Senior Member
 
Registered: Sep 2017
Distribution: FreeBSD
Posts: 2,252

Rep: Reputation: 1011
Quote:
Originally Posted by scasey
As I understand it, HTTPS is about encryption, not about "signing" or "verifying" anything. That's why a self-signed certificate will still encrypt the data and still "work", although most modern browsers issue warnings because the signer (the Certificate Authority) is not known.
Exactly.
 
Old 04-21-2019, 01:44 AM   #7
qrange
Senior Member
 
Registered: Jul 2006
Location: Belgrade, Yugoslavia
Distribution: Debian stable/testing, amd64
Posts: 1,061

Original Poster
Rep: Reputation: 47
Even when dynamic content is used, most of it probably doesn't change very often.
And the data that takes most of the bandwidth, like images, is usually 'static'.
 
Old 04-21-2019, 10:32 AM   #8
smallpond
Senior Member
 
Registered: Feb 2011
Location: Massachusetts, USA
Distribution: Fedora
Posts: 4,140

Rep: Reputation: 1263
It might be interesting to design a hybrid protocol which encrypts private data but not things like shared images. That would save CPU overhead and allow caching to work. Each request could do either an HTTP GET or an HTTPS GET, depending on the link. The browser would need two connections, one encrypted and one not, but no major protocol changes would be required. An eavesdropper could still see what sites you visit, but could not access your private data.
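
A rough client-side sketch of that idea in Python; the private-path convention is invented here, and current browsers would of course flag the plain-HTTP half as mixed content:

Code:
import urllib.request

# Invented convention: the site declares which paths carry private data.
PRIVATE_PREFIXES = ("/account/", "/messages/")

def fetch(host: str, path: str) -> bytes:
    # Private data goes over HTTPS; shared, cacheable assets over plain HTTP.
    scheme = "https" if path.startswith(PRIVATE_PREFIXES) else "http"
    with urllib.request.urlopen(f"{scheme}://{host}{path}") as resp:
        return resp.read()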
 
Old 04-22-2019, 08:46 AM   #9
sevendogsbsd
Senior Member
 
Registered: Sep 2017
Distribution: FreeBSD
Posts: 2,252

Rep: Reputation: 1011
Quote:
Originally Posted by qrange
Even when dynamic content is used, most of it probably doesn't change very often.
And the data that takes most of the bandwidth, like images, is usually 'static'.
True, but most dynamic content doesn't actually exist on disk; it is generated on the fly, so I'm not sure how "signing" it would work. Currently the only way to assure that the site you are visiting really is the site you intended is the certificate that enables HTTPS. Encrypting the content stored on the web server only protects against physical theft of the disk it lives on, which becomes problematic in a cloud or multi-tenant environment because no one really knows which physical disk holds the content.

I get that there is a need to establish that the site I am hitting is exactly the site I intended, but I'm not sure there is any way to do that besides the web server certificate that enables HTTPS. Any file-level encryption or signing/validation is going to take a great deal more resources than a simple HTTPS connection.

HTTPS is dead easy to implement and does not take very many resources.
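
That identity check is exactly what a default TLS client performs. In Python, for example, the default context refuses the connection unless the certificate chain and the host name both validate:

Code:
import socket
import ssl

ctx = ssl.create_default_context()  # verifies the CA chain and the host name

with socket.create_connection(("www.linuxquestions.org", 443)) as sock:
    # wrap_socket raises ssl.SSLCertVerificationError if validation fails,
    # so reaching the print means the site matched its certificate.
    with ctx.wrap_socket(sock, server_hostname="www.linuxquestions.org") as tls:
        print(tls.getpeercert()["subject"])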
 
  

