I have to run multiple sbatch Slurm scripts on a cluster. Say I have 50 sbatch files and I am submitting them sequentially in the terminal (I am using Ubuntu) as follows:
sbatch file1.sbatch
sbatch file2.sbatch
.
.
.
sbatch file50.sbatch
I want to simplify these 50 commands into a single command. As I am new to both the terminal and the cluster, I don't know how to approach this problem. Kindly suggest a solution (I guess I need a for loop, but I am unsure of the syntax). Pointers to relevant documentation would also be helpful.
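For reference, a plain bash for loop is the usual approach here; a minimal sketch, assuming the files really are named file1.sbatch through file50.sbatch in the current directory (the `submit_all` name is just illustrative):

```shell
#!/bin/bash
# submit_all: call sbatch once per file, so every file becomes its own job.
submit_all() {
    local i
    for i in {1..50}; do
        sbatch "file$i.sbatch"
    done
}
```

Saving this in a script, adding a final line that calls `submit_all`, and running it with bash would fire 50 separate sbatch submissions, one per file.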
Thank you.
Update: I tried the following script:
#!/bin/bash
for i in {1..3}
do
    sbatch "layer$i.sbatch"
done
But it didn't create separate jobs; only a single job was submitted for the whole script, so this didn't work for me:
~/Marabou% sbatch call_sbatch.sbatch
Submitted batch job 4576049
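Presumably the wrapper needs to be run with bash rather than submitted with sbatch: `sbatch call_sbatch.sbatch` queues the loop itself as one batch job, while running the wrapper directly executes the loop immediately, and each inner sbatch call then submits its own job. A sketch (`submit_layers.sh` is an assumed name):

```shell
# Write a hypothetical wrapper (submit_layers.sh is an assumed name)
# that submits layer1..layer3 as three separate jobs.
cat > submit_layers.sh <<'EOF'
#!/bin/bash
for i in {1..3}; do
    sbatch "layer$i.sbatch"
done
EOF
chmod +x submit_layers.sh

# Run it with bash (or ./submit_layers.sh) -- NOT `sbatch submit_layers.sh`,
# which would queue the whole loop as a single batch job:
#   bash submit_layers.sh
```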
Thanks.
Update:
The following Python script works:
import os
os.system("sbatch filename1.sbatch")
os.system("sbatch filename2.sbatch")
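For many files, the same idea works with a loop instead of one line per file; a minimal sketch using subprocess rather than os.system (the filenameN.sbatch pattern and the `submit_all` name are assumptions, and the `runner` parameter exists only so the loop can be dry-run without Slurm installed):

```python
import subprocess

def submit_all(n, runner=subprocess.run):
    """Submit filename1.sbatch .. filename<n>.sbatch, one sbatch call per file."""
    for i in range(1, n + 1):
        # check=True raises if a submission fails instead of silently continuing
        runner(["sbatch", f"filename{i}.sbatch"], check=True)
```

Calling `submit_all(50)` on the cluster would then submit all 50 files as separate jobs.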