script needs su commands but is run by user
My system is as follows:
System = SunOS  Node = pinn810-1  Release = 5.10  KernelID = Generic_118855-36  Machine = i86pc
I need to write a script that runs as a cron job but executes commands that require root (su) privileges, OR a script that any (non-root) user logging on can run from a terminal shell, with the script elevating privileges on the fly to do the task. For example: if there are files with a certain extension in the login home directory, I would like any user logging on to run a script that deletes those files. The script would have to be executable by the user but have root rights to delete the files. I know I can use the chmod command to set file permissions, but I believe the script needs to change to a privileged user, do the job, then change back to the ordinary user. Thanks and regards |
Hi,
In general, if files are being left in a user's home directory with permissions such that the user cannot delete them, the real problem is that the files are being created with the wrong permissions in the first place. However, if you just want to work around the problem as you described, then what you want, I believe, is "sudo" (http://www.gratisoft.us/sudo/). Sudo is a standard part of Linux distros nowadays, but it is also supported on Solaris/SunOS. It gives you very fine-grained control over who can run privileged commands, which commands they can run, and whether they must authenticate with a password. The sudo documentation explains the setup better than I can, so I won't go into that here. Hope that helps. TIM |
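As a sketch of that sudo approach (the username and paths below are hypothetical examples, not taken from the original posts), a sudoers entry added with visudo can allow one specific user to run only the cleanup command as root:

```shell
# Hypothetical /etc/sudoers entry (always edit with visudo), letting user
# "bob" remove *.tar files from his home directory as root, no password:
#
#   bob ALL = NOPASSWD: /usr/bin/rm -f /export/home/bob/*.tar
#
# The user, or an entry in the user's own crontab, would then run:
#
#   sudo /usr/bin/rm -f /export/home/bob/*.tar
```

Restricting the entry to one exact command line (rather than granting ALL) keeps the privilege escalation as narrow as possible.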
It doesn't matter what the permissions are on files in a user's home directory. If the user owns his home directory and is able to write to it, he can delete ANY file in it, no matter who owns the file (unless the sticky bit is set on the directory, which is unusual for a home directory). Deletion is controlled by the directory's permissions, not the file's.
Try this:
Code:
% ls -la asdf.txt
Forrest |
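The point above can be demonstrated without root at all, since the same rule applies to any directory you own: write permission on the directory, not on the file, is what governs deletion. A minimal sketch in a throwaway temp directory:

```shell
# Demonstrate that deleting a file depends on directory permissions,
# not file permissions: a read-only file in a writable directory we own
# can still be removed.
dir=$(mktemp -d)
touch "$dir/asdf.txt"
chmod 444 "$dir/asdf.txt"     # the file itself is read-only
rm -f "$dir/asdf.txt"         # succeeds anyway: we can write the directory
[ ! -e "$dir/asdf.txt" ] && echo "deleted"
rmdir "$dir"
```

With a root-owned file the mechanics are identical, which is why the original poster's users can delete the leftover files themselves.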
Thanks for your replies. You are right about the owner being able to delete the files. The tar files are generated when the user restores archived data from an external RAID; the application then converts each file into useful info for analysis. The issue has been that these files never get deleted, which has on occasion pushed the home folder to 100% full, causing major issues. I did some further testing yesterday and think I have it beaten. My problem was that I wasn't putting the full path in the rm command within the cron job script. I ran the command manually with the full path and it worked, so my cron job will run Monday night and I will check on Tuesday morning. Hope this will be the end of it.
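The full-path point matters because cron runs jobs with a minimal environment (a short PATH and no login shell setup), so every command and file in the script should be spelled out absolutely. A minimal sketch of such a cleanup script, with a hypothetical home directory path:

```shell
#!/bin/sh
# Sketch of a cron-run cleanup script (the path is a hypothetical example).
# Cron provides a minimal environment, so use absolute paths throughout.
HOME_DIR=${1:-/export/home/someuser}

# Remove leftover tar files; the [ -f ] guard skips the literal "*.tar"
# pattern when no such files exist.
for f in "$HOME_DIR"/*.tar; do
    [ -f "$f" ] && /usr/bin/rm -f "$f"
done
```

A corresponding crontab entry for Monday night might look like `30 23 * * 1 /export/home/someuser/cleanup.sh`, again using the absolute path to the script itself.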
|