
I have many folders that contain .sh files (as shown below). I need to run each one of them on the cluster. I need help putting the following sbatch commands into a loop.

cd folder1
sbatch run_min.sh
sbatch run_eqbr.sh
sbatch run_prod.sh
cd ..

cd folder2
sbatch run_min.sh
sbatch run_eqbr.sh
sbatch run_prod.sh
cd ..

Folder structure

folder1
    run_min.sh
    run_eqbr.sh
    run_prod.sh

folder2
    run_min.sh
    run_eqbr.sh
    run_prod.sh

folder3
    run_min.sh
    run_eqbr.sh
    run_prod.sh

folder4
    run_min.sh
    run_eqbr.sh
    run_prod.sh
shome

2 Answers


This worked.

for dir in */; do
  cd "$dir"
  sbatch run_min.sh
  sbatch run_eqbr.sh
  sbatch run_prod.sh
  cd ..
done
shome

Try this.

for folder in ./*/; do
  ( cd "$folder"
    for file in ./*; do
      sbatch "$file"
    done )
done

The parentheses make the cd and the commands after it run in a subshell, so you don't have to cd back out after each folder (going back with cd .. can take you to the wrong place in some situations, e.g. when the directory you entered is a symlink).
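If cd into a folder could fail (wrong permissions, say), you can exit the subshell when cd fails, so the loop simply moves on to the next folder; a minimal sketch:

for folder in ./*/; do
  ( cd "$folder" || exit   # exit only leaves the subshell, so the outer loop continues
    for file in ./*; do
      sbatch "$file"
    done )
done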

If the files do not need to be run from a particular directory, maybe simply

for file in ./*/*; do
    sbatch "$file"
done

Competently written scripts do not care which directory they are being run from, but of course, these could be simple ad hoc scripts which e.g. expect a specific input data file to exist in the current directory.
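For instance, a purely hypothetical run_min.sh along these lines would only work when submitted from inside its own folder, because it refers to its input and output by relative paths (the program name and file names here are made-up placeholders):

#!/bin/bash
#SBATCH --job-name=min
# Relative paths, so the job's working directory must be the folder
# that actually contains min.in.
some_minimizer < min.in > min.out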

If you can't just run all the files in all directories, maybe change the wildcard to ./*/*.sh or explicitly loop over the specific file names:

for folder in ./*; do
  for name in min eqbr prod; do
    sbatch "$folder/run_$name.sh"
  done
done

Again, if the scripts require you to cd into each folder, you can combine this with the code in the first example.
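Combined, that could look something like this (the same subshell idea as in the first snippet, just with the explicit file names):

for folder in ./*/; do
  ( cd "$folder"             # subshell, so there is no need to cd back out
    for name in min eqbr prod; do
      sbatch "run_$name.sh"  # submit each script from inside its own folder
    done )
done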

tripleee