
I have a simple bash script that I use to submit multiple Python scripts to the queue; they need to run sequentially. This used to work with qsub, but now that I am running the job on a cluster with Slurm, the Python scripts run simultaneously instead of sequentially.

I was trying the following:

#!/bin/bash 
#
#SBATCH -J scrna-seq-pipeline3
#SBATCH -o scrna-seq-pipeline3.out
#SBATCH -e scrna-seq-pipeline3.err

module load python

python trimming.py -o options
python mapping.py -o options
python quality-filtering.py -o options
python feature-counting.py -o options

Each Python script has multiple parameters that need to be set individually, which is why I like to submit them this way.

Is there an easy way to submit them with sbatch so that they run sequentially, i.e., so that mapping.py only starts after trimming.py is done?
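
One workaround I can think of is submitting each step as its own job and chaining them with Slurm job dependencies. A rough sketch of what I mean (the job names here are made up; `--parsable`, `--wrap`, and `--dependency=afterok` are standard sbatch flags, but whether `module load` works inside the shell that `--wrap` spawns depends on the site setup):

#!/bin/bash
# Submit each step as its own job; --parsable makes sbatch print only the job ID,
# and --dependency=afterok:<id> holds a job until the previous one exits successfully.
jid1=$(sbatch --parsable -J trimming --wrap "module load python; python trimming.py -o options")
jid2=$(sbatch --parsable -J mapping --dependency=afterok:$jid1 --wrap "module load python; python mapping.py -o options")
jid3=$(sbatch --parsable -J filtering --dependency=afterok:$jid2 --wrap "module load python; python quality-filtering.py -o options")
sbatch -J counting --dependency=afterok:$jid3 --wrap "module load python; python feature-counting.py -o options"

But that means four separate jobs instead of one, so a single-script solution would still be preferable.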

  • If you submit that script, the `*.py` scripts will be run sequentially (unless the `*.py` scripts contain code that make them 'daemonize'). What leads you to believe they run simultaneously? – damienfrancois Mar 09 '21 at 12:41

0 Answers