
I want to run N files (N jobs) that sit inside N folders in my pwd, laid out like this:

Folder_1
   contains file_1
Folder_2
   contains file_2
...
Folder_N
   contains file_N

For one file, file_1, I just have to do: `sbatch script.sh ./folder1/file_1`.

But is there a way to make a loop that runs the N files, something like:

    for i in range(N):
        sbatch script.sh ./folder_i/file_i
  • `for i in {1..666}`, or `for i in $(seq 1 "$n")` – Poshi Apr 06 '22 at 14:52
  • That's all? I will try it tomorrow. How do I run the file where I put the command? – haswellrefresh Apr 06 '22 at 17:44
  • I was assuming `bash`, you will need a little bit of bash knowledge to make it work as I only gave you the basic bits of information. If you put the loop (with the full syntax) in a file, you can run it by adding a shebang and making it executable or by passing the script as a parameter to bash. – Poshi Apr 06 '22 at 19:14
  • You asked for the loop and I gave you the loop. It's your responsibility to perform the variable substitutions to create the adequate names and to polish the details. Also, a SLURM job array could be a good fit for your problem, but that needs a bit more knowledge to set up (not much, but some more); a job-array sketch follows these comments. – Poshi Apr 06 '22 at 19:17
  • OK, thank you. I only know bash from writing my aliases. I will try to learn more. – haswellrefresh Apr 06 '22 at 20:53
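
For reference, here is a minimal sketch of the job-array approach mentioned above. It assumes the folders are literally named folder_1 … folder_N with N = 100 (adjust `--array` to your real N), and that the work done by `script.sh` can simply be re-run with the file path as its first argument; if `script.sh` contains its own `#SBATCH` directives, those are ignored when it is called this way.

    #!/bin/bash
    #SBATCH --array=1-100          # one array task per folder; replace 100 with your N

    # SLURM runs one copy of this script per index in the array;
    # SLURM_ARRAY_TASK_ID tells each copy which folder/file it owns.
    i=$SLURM_ARRAY_TASK_ID
    bash script.sh "./folder_${i}/file_${i}"

Submitted once with `sbatch array_job.sh` (the file name is arbitrary), this queues all N tasks in a single call instead of N separate sbatch invocations.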

2 Answers


Create a bash file like this:

    #!/bin/bash
    # Loop over the folder indices (use whatever range you need).
    for i in {305..595..5}
    do
        sbatch script.sh ./folder_$i/file_$i
    done

Make it executable (see https://www.andrewcbancroft.com/blog/musings/make-bash-script-executable/) and run it.
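
In practice (assuming the file above is saved as run_all.sh, a name chosen here only for illustration), that comes down to:

    chmod +x run_all.sh    # make the loop script executable
    ./run_all.sh           # submits one sbatch job per folder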


I found a solution and just want to share it here:

    #!/bin/bash
    ls ./folder*/file* > file        # store the paths in a file
    while read p; do                 # read it line by line
        sbatch script.sh "$p"
    done < file
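
A note on the design: parsing the output of ls can break on unusual file names, so an equivalent sketch that lets the shell expand the glob directly (same folder layout assumed) needs no temporary file:

    #!/bin/bash
    # Loop over the matching paths directly; no intermediate file needed.
    for p in ./folder*/file*; do
        sbatch script.sh "$p"
    done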