Publishing intranet sites securely on the internet
Hello all,
We have about six web services that we use at our company on the intranet only. Now we would like to make them accessible over the internet using SSL. I know how to set up the certificates and configure the webservers (thanks to the HowTo Secure Websites using SSL and certificates). What I'd like to know is whether there is a way to provide this access through a sort of 'web relay' server placed in our company's DMZ, instead of opening direct access to the individual servers through our firewall. The goal is to get the highest possible level of security.
Hmm, you might be able to do this by having the system in the DMZ connect via FTP from a PHP script at regular intervals and download an updated version of the site. Or you can do it the other way around, if the DMZ'ed system is low-spec or doesn't have PHP on it, i.e. have a secured server "push" the files to it via FTP whenever the secured server "wants" to.
That way, external browsers hitting the DMZ'ed server will see the files, but have no access to the actual source files and don't "know" where they come from.
I do something similar with one of my sites: every 24 hours a cron job runs a bash script and, after that, a PHP script. The bash script calls tar and 7zip to create a backup of the site, and then the PHP script connects via FTP to my backup server and transmits the .7z files containing the website files and graphics. Any web language, or even a binary program, that can make an FTP connection and transmit files should suffice; you don't necessarily need CLI PHP.
I.e. you might be able to do exactly the same using PHP and bash scripting. The DMZ'ed server doesn't even need to have PHP installed; all it needs is a working and correctly configured FTP server. You can then set up one of the secured servers to transmit, via FTP, whatever files you need to the target hosting server.
This also has the benefit of not loading any of your intranet-serving servers with external hosting traffic, and it keeps any kind of DoS attack away from them.
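A minimal sketch of that nightly "push" job, assuming a tar+gzip archive and curl for the FTP upload (the paths, the DMZ host name and the credentials below are hypothetical placeholders):

#!/bin/bash
# Sketch of the "push" approach described above; run on the secured server.
set -eu

SITE_DIR=/var/www/intranet-site            # source files on the secured server
ARCHIVE=/tmp/site-$(date +%Y%m%d).tar.gz   # tar+gzip here instead of tar+7zip

# 1. Archive the site
tar -czf "$ARCHIVE" -C "$SITE_DIR" .

# 2. Push the archive to the DMZ host over FTP
#    (curl can upload over FTP with -T/--upload-file)
curl -T "$ARCHIVE" "ftp://dmz-host.example.com/upload/" --user "ftpuser:secret"

# 3. Clean up the local copy
rm -f "$ARCHIVE"

A crontab entry such as 0 3 * * * /usr/local/bin/push-site.sh would run it once a night.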
Use squid proxy to allow access from the Internet to your intranet. Squid should be configured to sanitize requests and only allow access to specific URIs on your internal servers to prevent abuse and unauthorized access.
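A minimal sketch of that kind of Squid reverse-proxy (accelerator) setup, forwarding only specific URL paths to the internal servers; the IP addresses, site name and paths are hypothetical:

# squid.conf - reverse proxy in the DMZ, in front of two internal web services
http_port 80 accel defaultsite=www.example.com

# One internal origin server per application
cache_peer 10.0.0.10 parent 80 0 no-query originserver name=app1
cache_peer 10.0.0.11 parent 8080 0 no-query originserver name=app2

# Only the published URL paths are forwarded
acl path_app1 urlpath_regex ^/app1/
acl path_app2 urlpath_regex ^/app2/
cache_peer_access app1 allow path_app1
cache_peer_access app1 deny all
cache_peer_access app2 allow path_app2
cache_peer_access app2 deny all

http_access allow path_app1
http_access allow path_app2
http_access deny all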
Thanks for your reply. I've been reading a bit about the Squid proxy and at first glance it seems very interesting. If I understand it correctly, I would need the reverse proxy configuration, since traffic will be coming in from the internet. The only thing that puzzles/worries me a bit is that dynamic content is not supported in this setup, or does that only apply to the caching option?
Dynamic vs. static content shouldn't matter - it's all static from the point of view of the user's browser: the user requests a URL from the proxy server, the proxy server passes it along to the backend, gets whatever that server renders, and the (now static) content is returned to the user for rendering in their browser.
Caching shouldn't cause any problems on the client side either, as long as content expiration is reasonable and accurate.
OK, sounds logical. At the moment I'm looking into the whole SSL story, certificates and so on. Currently I have one HTTPS site, one HTTP site, and one Tomcat application that I'd like to migrate. If I change them all to use HTTPS, would that be possible using Squid, i.e. can Squid be configured to use more than one certificate? Or are certificates handled between the client and the final server?
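For what it's worth, Squid can present a different certificate per published site by declaring one https_port per listening address, terminating SSL at the proxy and talking plain HTTP to the internal servers; a rough sketch with hypothetical addresses, host names and paths:

# squid.conf - one https_port per published site, each with its own certificate
https_port 192.0.2.10:443 accel cert=/etc/squid/certs/site1.crt key=/etc/squid/certs/site1.key defaultsite=site1.example.com
https_port 192.0.2.11:443 accel cert=/etc/squid/certs/site2.crt key=/etc/squid/certs/site2.key defaultsite=site2.example.com

# Backend connections can stay plain HTTP on the internal network
cache_peer 10.0.0.10 parent 80 0 no-query originserver name=site1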
I know this post is a bit older, but I wrote a post on using nginx as a reverse proxy to do what you're wanting to do. All you would need to add is SSL to the nginx config. (You don't need SSL over your LAN unless you really want it.)
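A minimal sketch of such an nginx vhost with SSL terminated on the DMZ host and plain HTTP to the intranet server (the server name, certificate paths and upstream address are hypothetical):

# /etc/nginx/sites-available/app.example.com
server {
    listen 443 ssl;
    server_name app.example.com;

    ssl_certificate     /etc/ssl/certs/app.example.com.crt;
    ssl_certificate_key /etc/ssl/private/app.example.com.key;

    location / {
        # forward to the intranet web service over plain HTTP
        proxy_pass http://10.0.0.10:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}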