I'm trying to get SGE to run array job tasks concurrently, weighted by the job shares (`-js`) parameter of qsub, but it doesn't behave as expected. Is there a way to enable concurrent task execution based on shares?
I have a script that sleeps to simulate long-running tasks, and I submit it to a small SGE cluster (26 slots) as three separate job arrays:
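For reference, the sleep script looks roughly like this (a minimal sketch; the actual sleep duration and SGE directives in my script may differ):

```shell
#!/bin/sh
#$ -S /bin/sh
#$ -cwd
# SGE sets SGE_TASK_ID for each array task; default to 0 when run outside SGE
task="${SGE_TASK_ID:-0}"
echo "task $task starting"
sleep 5   # stand-in for long-running work; the real script sleeps much longer
echo "task $task done"
```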
qsub -t 1-201 -js 100 sge_longRunning.sh
qsub -t 1-202 -js 101 sge_longRunning.sh
qsub -t 1-203 -js 102 sge_longRunning.sh
I would expect the tasks to be distributed almost equally across the cluster over time, but instead the last submitted array runs to completion (all 203 tasks), then the second one (202 tasks), and finally the first one (201 tasks).
The cluster operates under a functional policy with 1,000,000 tickets and a weight of 0.9 for functional-policy tickets.
Any hints on how to get the tasks of the different job arrays to run concurrently, sharing the available slots roughly equally? Or any idea what might be wrong with the configuration/test setup above?