I have a simple bash script that I use to submit multiple Python scripts to the queue; they need to run sequentially. This used to work with qsub, but now that I am running the job on a cluster with Slurm, the Python scripts run simultaneously instead of sequentially.
I have been trying the following:
#!/bin/bash
#
#SBATCH -J scrna-seq-pipeline3
#SBATCH -o scrna-seq-pipeline3.out
#SBATCH -e scrna-seq-pipeline3.err
module load python
python trimming.py -o options
python mapping.py -o options
python quality-filtering.py -o options
python feature-counting.py -o options
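For completeness, here is the same script with `set -e` added (a sketch only, not yet tested on the cluster): it does not change the ordering, but it at least makes the job abort at the first failing step instead of continuing with the remaining scripts:

```shell
#!/bin/bash
#
#SBATCH -J scrna-seq-pipeline3
#SBATCH -o scrna-seq-pipeline3.out
#SBATCH -e scrna-seq-pipeline3.err

# Abort the whole job as soon as any step exits with a non-zero status.
set -e

module load python
python trimming.py -o options
python mapping.py -o options
python quality-filtering.py -o options
python feature-counting.py -o options
```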
Each Python script has multiple parameters to set individually, which is why I like to submit them this way.
Is there an easy way to submit them with sbatch so that they run sequentially, i.e., so that mapping.py only starts after trimming.py is done?
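One approach I have seen suggested (untested on my setup) is to submit each step as its own job and chain them with Slurm's `--dependency=afterok`, so each job starts only after the previous one exits successfully. The sketch below assumes the same script names and options as above; `--parsable` is a standard sbatch flag that makes sbatch print only the job ID, so it can be captured and fed to the next submission:

```shell
#!/bin/bash
# Submit the four steps as separate Slurm jobs, chained with afterok
# dependencies: each job starts only if the previous one completed
# successfully. (Sketch only; script names/options are from my pipeline.)

jid1=$(sbatch --parsable -J trim  --wrap="module load python; python trimming.py -o options")
jid2=$(sbatch --parsable -J map   --dependency=afterok:$jid1 --wrap="module load python; python mapping.py -o options")
jid3=$(sbatch --parsable -J filt  --dependency=afterok:$jid2 --wrap="module load python; python quality-filtering.py -o options")
jid4=$(sbatch --parsable -J count --dependency=afterok:$jid3 --wrap="module load python; python feature-counting.py -o options")
```

The downside is that each step becomes its own job with its own output files, but the upside is that a failed step (afterok) keeps the downstream jobs from ever starting.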