
danielakkerman 11-11-2011 05:07 AM

Chrooting & Apache
Hello everyone!
I've recently set up a web server for users to upload files to and run .php and CGI/Perl scripts; in other words, a regular web-hosting service.
The problem I've run into is creating a chrooted jail for each user, to prevent them from writing scripts that could compromise the server, or even browse through it (I've got nothing to hide, but still...).
The site runs on a single virtual host, and each user's root is at /$username.
Normally, if each document root sat directly at the root of its own virtual host, I'd have no difficulty creating such a jail; but my configuration uses a subdirectory for each user, partly to avoid creating a vhost for every user.
In this case, I am entirely lost!
Is such a thing even feasible?
How can it be done?
Very thankful for your attention,

vickyk 11-11-2011 09:46 AM

Have you tried installing/configuring an FTP server? You can create users and point each one at the $username path in the FTP configuration file.
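To illustrate the suggestion, here is a minimal sketch of what that could look like in vsftpd; the path is a placeholder for the poster's real document root, and the surrounding options are only one reasonable baseline:

```
# /etc/vsftpd.conf (excerpt) -- illustrative sketch
local_enable=YES
write_enable=YES
chroot_local_user=YES              # jail each user inside their own tree
user_sub_token=$USER               # substitute the login name into local_root
local_root=/var/www/site/$USER     # placeholder path, one subdir per user
```

With `chroot_local_user=YES`, each logged-in user sees `local_root` as `/` and cannot escape it over FTP.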

bathory 11-11-2011 10:21 AM


You can use mod_rewrite for this. Take a look at the example here
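(The linked example is not reproduced here, but a per-user rewrite of this general shape is presumably what is meant; the URL prefix and directory layout below are assumptions, not the original poster's actual setup:)

```
# Apache vhost excerpt -- illustrative sketch
RewriteEngine On
# Map /~alice/page.html to /users/alice/page.html under the DocumentRoot
RewriteRule ^/~([^/]+)/(.*)$ /users/$1/$2 [L]
```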


danielakkerman 11-11-2011 11:42 AM

Thanks for your replies, a few more clarifications needed...
Thank you both for truly helpful suggestions.
@vickyk, I'm afraid an FTP server is out of the question at the moment, as I don't have enough external IPs (yet) and have already used up the ones I do have. I'd very much like to bind a future FTP host to a separate IP, to avoid misuse.
@bathory, your advice works wonders, but calling abs_path in Perl still returns the real path, i.e. /var/www/.../$username; meaning that a non-benevolent user could manipulate this path and browse or "cd" wherever he likes.
So this still persists... sadly :(.
What else can be done?
Thanks again,

bathory 11-12-2011 04:27 AM


For PHP you can use safe mode, but for Perl I don't have any idea, sorry.

You might take a look at Apache's mod_security.
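(For the PHP side, a confinement sketch in php.ini might look like this; the paths are placeholders. `safe_mode` was current at the time of this thread but was later removed in PHP 5.4, and `open_basedir` restricts which directory trees PHP scripts may open:)

```
; php.ini (excerpt) -- illustrative sketch
safe_mode = On
open_basedir = /var/www/site/users/:/tmp/
```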


danielakkerman 11-12-2011 07:15 AM

Still, a "no-go"...
First of all guys, I wanted to thank you for all your tremendous aid, it's been absolutely invaluable!
@bathory, I've looked into it, but I couldn't make any headway with mod_security; I'm not familiar enough with the module, I guess. If you could give me a starting point, i.e. what my condition for denying a request should be in this case, and how it would incorporate the physical path, I'd really appreciate it!
How do I go forward?

bathory 11-12-2011 10:21 AM


Have a look at this generic howto. It's written for Debian, but it applies to all distros.
Or visit this site and find the rule(s) that match your situation.
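(As a starting point, a denial rule of the kind being asked about might look like the following ModSecurity 2.x sketch; the rule id and message are made up for illustration:)

```
# ModSecurity excerpt -- illustrative sketch
SecRuleEngine On
# Deny any request whose arguments mention /etc/passwd
SecRule ARGS "/etc/passwd" "id:900001,phase:2,deny,status:403,msg:'path traversal probe'"
```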


danielakkerman 11-14-2011 12:34 PM

A useful approach, but...
Hi there!
Firstly, let me thank you for a very handy tool you've granted me. I've added quite a few security arrangements via "mod_security", and it works like a charm, thanks!
However, it is impractical in the broader sense of my query, because it can't see inside an executing CGI script; so if anyone chooses to call something like

open(my $fh, '<', '/etc/passwd')

they'll still be able to do it, regardless of my attempts to block it via Apache.
So I've settled on setting new permissions, in the form of 740/750, chowning each CGI script to its respective user, and then running them as that user via suexec (in Apache).
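The scheme described above can be sketched as follows; the username and paths are placeholders, and the chown/suexec steps are shown as comments because they need root and a real vhost:

```shell
# Sketch of the 750-permissions scheme (all names here are illustrative).
DOCROOT=/tmp/demo-docroot            # stand-in for the real document root
mkdir -p "$DOCROOT/alice"            # one subdirectory per user
touch "$DOCROOT/alice/app.cgi"

# 750: owner may read/write/execute, the web server's group may
# read/execute, everyone else gets nothing.
chmod 750 "$DOCROOT/alice"
chmod 750 "$DOCROOT/alice/app.cgi"

# On the real server you would additionally (as root):
#   chown alice:www-data /var/www/.../alice/app.cgi
# and enable per-vhost suexec in the Apache config:
#   SuexecUserGroup alice www-data

stat -c '%a' "$DOCROOT/alice/app.cgi"   # prints: 750
```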
I've also thought of creating a chrooted jail for the entire HTTP server, but that seems like quite a hassle.
So what do you think, is this secure enough?
How do other hosting servers get by?
Very grateful for your help,

bathory 11-14-2011 01:58 PM


As I've already said, I have no idea how to lock down such Perl scripts. Maybe taint mode would do, but you'd have to test every script and see how it behaves.
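For reference, a minimal shell demo of what taint mode does (the EVIL variable name is made up). One caveat worth knowing: taint checks guard shell commands and file writes, not plain file reads, so taint mode alone would not stop a script from merely reading /etc/passwd:

```shell
# Perl's -T switch marks all external input (environment, arguments,
# file contents) as "tainted"; using tainted data in a shell command
# makes perl abort before the command ever runs.
EVIL='; cat /etc/passwd' perl -T -e 'system("ls $ENV{EVIL}")' 2>/dev/null \
  || echo "taint check blocked the call"
```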

If you are that concerned about security, you should use a RHEL-based distro with SELinux mandatory access control. Debian-based distros have AppArmor, which does much the same thing.

