File permissions on an Ubuntu Server.
Hey all,
This is a weird problem I'm facing on my Ubuntu 8.04 server on my network: it seems like file permissions are not being enforced. Every user on the machine is given read access to all the files. For example, I created an unprivileged user and logged in via SSH, and I could cat all the files under /var/www/ and /etc/, which is really unwanted. Why is this happening? What's wrong here? What can be done to overcome this issue? |
Show the ls -l output on the files, and the user ID of the running user (the id command).
|
Code:
asheshambasta@india:~$ id
Code:
asheshambasta@india:/$ ls -l
Code:
asheshambasta@india:/$ ls -l /etc/apache2/
Another question: if I do chmod -R a+rwX /someDirectory, does it change the permissions recursively for the contents of the directory being operated on, or recursively outside it? I've always believed the former to be true. |
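An aside on the chmod -R question: your belief is correct. chmod -R only descends into the directory you name; it never touches the parent. A quick sketch you can try yourself (the paths here are made up for the demo):

```shell
# Scratch setup (hypothetical paths)
mkdir -p /tmp/permtest/inner
touch /tmp/permtest/inner/file.txt
chmod 600 /tmp/permtest/inner/file.txt

# Recursive chmod: changes /tmp/permtest and everything inside it,
# but never the parent /tmp
chmod -R a+rwX /tmp/permtest

# The nested file is now mode 666; the capital X adds execute only
# to directories (and to files that already had an execute bit), so
# the plain file stays non-executable
ls -l /tmp/permtest/inner/file.txt
```
|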
Another interesting observation: over SFTP I can only browse the restricted folders, e.g. /etc/apache2/, but cannot display the file contents. However, if I SSH to the same machine as the same non-administrative user, I can cat the files and read them.
|
Can anyone please respond? I need to get this server back up soon.
|
OK, I now see that this is common to all Ubuntu systems; my laptop behaves the same way. So I'd like to create a new group for the network users who have to SSH to this machine, and then deny that group access to system directories.
Does anyone know how to do that? |
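If you do go that route, here's one way to sketch it with only standard tools (group, user, and directory names below are hypothetical; run as root). Rather than loosening anything, the trick is to give a directory to the group you want to lock out and then grant that group fewer bits than "other":

```shell
# Create the group and a network user belonging to it (example names)
groupadd netusers
useradd -m -G netusers alice

# A directory the netusers group cannot enter, while everyone
# else keeps read access
mkdir -p /srv/restricted
chgrp netusers /srv/restricted
chmod 705 /srv/restricted   # owner rwx, group (netusers) ---, other r-x
```

Note that doing this to system directories like /etc is risky: many programs expect world-readable configuration files. |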
There is no security risk in anyone being able to read standard system files.
Only files that contain confidential information, or user-personal files, require tighter security. |
Quote:
I found out that my laptop also allowed such access even by unprivileged users. My laptop is also running Ubuntu 8.04. |
Check which user your Apache server runs as, chown the web files to that user, and then chmod go-rwx /you/web/dir/. After that, ordinary users on the system can't do anything with the files directly; but if they already know the directory structure, they could still run a PHP script that does something like readfile("/i/know/where/the/file/is"), because the script executes as the Apache user. You could also implement suPHP, which runs PHP scripts as their owner; that might help with security.
Also, a lot of files in Debian are visible to normal users. CentOS is a bit different, and so is SUSE; on SUSE, for example, normal users can't even run crontab (I'm talking about SLES, not openSUSE). |
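Spelling that first suggestion out as commands (assuming the Ubuntu default Apache user www-data and the docroot /var/www; check yours with ps aux | grep apache). Run as root:

```shell
# Make sure the www-data account exists (it will on any Ubuntu box
# with Apache installed; this guard is only for demo machines)
id www-data >/dev/null 2>&1 || useradd -r www-data

# Give the web tree to the Apache user and shut everyone else out;
# Apache keeps access through the owner bits
mkdir -p /var/www
chown -R www-data:www-data /var/www
chmod -R u+rwX,go-rwx /var/www
```
|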
Quote:
The way to get around the phpMyAdmin problem is to enable cookie authentication in phpMyAdmin. This is safer anyway, because it means that if your web server were compromised, the attacker wouldn't have access to your SQL user info. Denying access outright is a problem, because you need to ensure that your Apache server still has sufficient privileges to serve your sites. |
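For reference, that setting lives in phpMyAdmin's config.inc.php (on the Ubuntu package that's /etc/phpmyadmin/config.inc.php; verify the path on your install). With cookie auth, phpMyAdmin prompts for the MySQL login instead of reading stored credentials:

```php
// config.inc.php: change the auth_type for your server from
// 'config' (stored credentials) to 'cookie' (login prompt)
$cfg['Servers'][$i]['auth_type'] = 'cookie';
```
|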
Quote:
The .htaccess and .htpasswd files would fall under what I termed "confidential information, or user-personal files". |
Hi, algogeek -
Files like .htaccess should definitely be chmod 644. Universal read access is OK (arguably necessary); the important thing is to restrict web access to them (e.g. HTTP and/or FTP access). Here are a couple of links:
http://www.petefreitag.com/item/505.cfm
http://www.itc.virginia.edu/unixsys/sec/
This site has a lot of great examples:
http://perishablepress.com/press/200...access-tricks/
And finally, this site has some good background info you might find useful:
http://www.psychocats.net/ubuntu/permissions
Hope that helps.
PSM |
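To make the "restrict web access" part concrete: stock Apache configurations already refuse to serve files whose names begin with .ht, but if you ever need to add that protection yourself, a typical Apache 2.2-era stanza looks like this (a sketch; it would go in your server or vhost configuration):

```apache
# Refuse to serve any file whose name begins with .ht
<Files ~ "^\.ht">
    Order allow,deny
    Deny from all
</Files>
```
|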
That is not a problem with your Ubuntu.
What you're experiencing is perfectly ordinary!
Look at your post with the ls -l output: all the files have the read permission set for everyone, and that's why you can read them all with cat. You cannot edit those files, though. If you want files to be inaccessible to other users, you should remove the read permission from the desired files/folders, which means you won't use chown; you'll have to use chmod. |
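To put that last answer into commands (the file path here is just an example):

```shell
# Demo file in a scratch location
touch /tmp/secrets.txt
chmod 644 /tmp/secrets.txt   # rw-r--r--: world-readable, like your /etc files

# Remove the read bit for group and others; the owner (and root)
# can still read the file
chmod go-r /tmp/secrets.txt
ls -l /tmp/secrets.txt       # now -rw-------
```
|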