SGE job stalls under qsub but works in interactive qrsh. Why?
I am using a piece of software called Mothur to process some DNA data.

First I qrsh onto a node of the cluster. I write a batch script for Mothur and run it from the command line like this... Code:

>mothur mothurBatchFile.mt

That works fine. However, when I qsub the job like this... Code:

qsub -cwd submit_mothur.sh

where submit_mothur.sh runs the same command... Code:

>mothur mothurBatchFile.mt

the job stalls at the same point every time. I will be processing 50 such DNA files separately, and I would like to submit them all at once with qsub rather than open 50 different terminals and process each one in an interactive session. The RAM needed is 214 MB; the file size is 218 MB. Thanks in advance.
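For the "50 files at once" part, an SGE array job avoids writing 50 separate submit scripts. The sketch below is hedged: it assumes the 50 batch files follow a numbered naming scheme (mothurBatch1.mt ... mothurBatch50.mt, a hypothetical convention) and that bash is available on the compute nodes.

```shell
#!/bin/bash
# Sketch of an SGE array-job submit script (submit_mothur_array.sh).
# ASSUMPTION: the 50 Mothur batch files are named mothurBatch1.mt .. mothurBatch50.mt.
#$ -cwd                # run each task from the submission directory
#$ -S /bin/bash        # interpret this script with bash
#$ -l h_vmem=1G        # request memory headroom above the ~214 MB the job needs
#$ -t 1-50             # array job: SGE starts 50 tasks, one per file

# SGE sets $SGE_TASK_ID to 1..50, one value per task
mothur "mothurBatch${SGE_TASK_ID}.mt"
```

A single `qsub submit_mothur_array.sh` then queues all 50 tasks, and the scheduler tracks each one under the same job ID.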
Possibly it is the difference between an interactive shell and a batch shell.

The environments may differ. Try submitting a qsub job that runs "printenv", and run "printenv" in your qrsh session as well; the differences between the two outputs may point to why one works and the other doesn't. One other possibility, depending on what "job stalls at the same point" means: the queue behind a default qsub may not allow enough memory or enough CPU time, so the job gets suspended, or aborted, when a limit is reached.
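The environment comparison above can be sketched as a pair of dumps plus a diff. This assumes a bash-capable SGE setup; the file names are illustrative.

```shell
# Batch side: submit a one-line job that dumps its environment to a file.
echo 'printenv | sort > batch_env.txt' | qsub -cwd -S /bin/bash

# Interactive side: run the same dump inside a qrsh session.
qrsh
printenv | sort > qrsh_env.txt
exit

# Once the batch job has finished, compare the two environments.
diff batch_env.txt qrsh_env.txt
```

Differences in variables like PATH or LD_LIBRARY_PATH are common culprits when a program runs interactively but hangs in batch; `qsub -V` (export the submitting shell's environment to the job) is one way to test that theory.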