I want to submit an R script, `myjob.R`, that takes two arguments for which I have several scenarios (only a few are shown here as an example). I want to pass these arguments by looping through `scens` and `sets`.

To avoid overloading the `squeue` on the cluster, I don't want to submit the whole loop at once. Instead, I want to wait 1 h between individual job submissions, so I included a `sleep 1h` command after each iteration.

I used to launch the bash script via `bash mybash.sh`; however, this requires keeping the terminal open until all jobs have been submitted. My solution was then to launch `mybash.sh` via `sbatch mybash.sh`, which effectively nests two `sbatch` commands. It seems to work very well.

My question is simply whether there is any reason against submitting nested `sbatch` commands.
Thanks!
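(For context, one detail I considered: the terminal can also be detached without a second `sbatch`, e.g. with `nohup`. A minimal sketch; it creates a stand-in `mybash_demo.sh` so it is self-contained, whereas in practice you would point it at the real `mybash.sh`:)

```shell
# Stand-in for mybash.sh so this sketch is runnable on its own;
# in practice, use your real submission script instead.
printf '%s\n' '#!/bin/bash' 'echo "submitting jobs..."' > mybash_demo.sh

# Detach from the terminal: the loop keeps running after logout,
# and its output is collected in mybash.log.
nohup bash mybash_demo.sh > mybash.log 2>&1 &
wait $!          # only for this demo; normally you would just log out
cat mybash.log
```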
Here is the `mybash.sh` script:
#!/bin/bash

scens=('AAA' 'BBB')
sets=('set1' 'set2')
wd=/projects/workdir

for sc in "${!scens[@]}"; do
  for se in "${!sets[@]}"; do
    echo "SCENARIO: ${scens[sc]} --- SET: ${sets[se]}"
    sbatch -t 00:05:00 -J myjob --workdir="${wd}" -e myjob.err -o myjob.out R --file=myjob.R --args "${scens[sc]}" "${sets[se]}"
    # My solution is to include the following line & run this bash script via sbatch
    sleep 1h
  done
done
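An alternative worth comparing against the nested-`sbatch` approach: `sbatch` has a `--begin` option, so all jobs could be submitted immediately but scheduled to start at staggered times, with no long-running submitter job at all. A sketch under that assumption (the `DRYRUN=echo` line makes it print the commands instead of submitting, so the schedule can be inspected first; remove it to actually submit):

```shell
#!/bin/bash
# Sketch: stagger job starts with --begin instead of sleeping between
# submissions. Uses the same scens/sets arrays as mybash.sh above.
scens=('AAA' 'BBB')
sets=('set1' 'set2')

delay=0
DRYRUN=echo   # set DRYRUN= (empty) to really submit
for sc in "${scens[@]}"; do
  for se in "${sets[@]}"; do
    # Each job is deferred by one more hour than the previous one.
    $DRYRUN sbatch -t 00:05:00 -J myjob --begin=now+${delay}hour \
      -e myjob.err -o myjob.out R --file=myjob.R --args "$sc" "$se"
    delay=$((delay + 1))
  done
done
```

This keeps the queue pressure low without tying up a node (or a terminal) just to run `sleep`.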