Job submission on cluster
I am working on a Linux cluster, writing a CGI script to automatically submit jobs.
But the user 'apache', who runs the CGI, does not have the privileges to submit jobs.
Could anyone suggest how I can get around this problem?
I'm assuming that you are trying to use the "at" command to submit a job, and that you want the job to run on the same node as the process that submits it.
First you have to configure your system to run the "atd" daemon. Its init script is in the /etc/init.d directory, probably named atd or at.
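A quick way to check whether the daemon is already up, sketched as a small shell function (the init-script path is typical for SysV-style systems but varies by distribution):

```shell
# Check whether the atd daemon is running on this host.
atd_status() {
  if pgrep -x atd >/dev/null 2>&1; then
    echo "running"
  else
    # If not running, start it as root, e.g.: /etc/init.d/atd start
    echo "not running"
  fi
}

atd_status
```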
Then check whether your system has a file called at.deny, and possibly another called at.allow; if they exist, they are usually in the /etc directory. If those files don't exist then DON'T CREATE THEM. If either or both exist, look at the entries in them; these are user account names. Make sure that the user account you want to allow to use the at command is not in at.deny, and, if at.allow exists, that it is listed there.
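The access rule described above can be sketched as a shell function. The file paths are parameters here so you can try it against copies of the files; on a real system they would be /etc/at.allow and /etc/at.deny (the exact default when neither file exists varies by system, so that branch is only a hedged assumption):

```shell
# Decide whether a user may use "at", following the usual at.allow/at.deny
# rule: if at.allow exists, only users listed in it are allowed; otherwise
# users listed in at.deny are refused.
may_use_at() {
  user=$1; allow=$2; deny=$3
  if [ -f "$allow" ]; then
    grep -qx "$user" "$allow"        # allowed only if listed in at.allow
  elif [ -f "$deny" ]; then
    ! grep -qx "$user" "$deny"       # allowed unless listed in at.deny
  else
    return 0                         # neither file: behavior varies by system
  fi
}
```

For example, `may_use_at apache /etc/at.allow /etc/at.deny && echo allowed` would report whether the apache account passes the check.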
Make sure that the file you want to submit to a queue has its executable bit set.
Make sure that the file you want to submit to a queue is not located on a partition mounted with the "noexec" option.
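Both checks can be sketched together; the function below sets the executable bit and verifies it (the path you pass in stands for your actual job script), and the comment shows one way to inspect the mount options:

```shell
# Ensure a job script is executable before handing it to "at".
check_job() {
  job=$1
  chmod +x "$job" || return 1        # set the executable bit
  [ -x "$job" ] || return 1          # verify it took effect
  echo "ok: $job is executable"
}
# To check for a "noexec" mount, inspect the options of the filesystem
# holding the script, e.g.:  grep " $(df -P "$job" | awk 'NR==2{print $6}') " /proc/mounts
```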
Report back here with what happens and whether you are successful.