
I am trying to parallelize a process by executing `subprocess.Popen` in order to submit jobs with sbatch, following this scheme:

    for i in jobs:
        cmd = (f'sbatch --job-name={i} --mem=3G '
               f'--cpus-per-task=4 --output=logs/output/{i}.out '
               f'--error=jobs/error/{i}.err parallel.sh {i}')
        process = subprocess.Popen(cmd, shell=True,
                                   stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        stdout, stderr = process.communicate()

How could I write the following code so that it waits for all the processes to end without destroying the parallel execution?
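To illustrate what I mean by keeping the execution parallel, here is a minimal sketch of starting every process first and only then waiting on each one (with `echo` standing in for my actual sbatch command line, and made-up job names):

```python
import subprocess

jobs = ["a", "b", "c"]  # placeholder job identifiers

# Start every process first; do NOT call communicate() inside this loop,
# or each submission would block until the previous one finished.
processes = []
for i in jobs:
    cmd = f"echo submitted {i}"  # stand-in for the sbatch command line
    p = subprocess.Popen(cmd, shell=True,
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    processes.append(p)

# Only now wait on each one; all of them have already been launched,
# so they ran concurrently.
results = [p.communicate() for p in processes]
for out, err in results:
    print(out.decode().strip())
```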

Thanks in advance!!

I have been trying to have each `parallel.sh` execution append a line to a file, so that the file ends up containing as many lines as processes executed, and then loop until the file contains the expected number of lines. This seems very rudimentary to me, which is why I am asking for help.
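On the Python side, that rudimentary scheme looks roughly like this (the helper name and path are just placeholders I made up, and I added a timeout so the loop is not truly infinite):

```python
import time
from pathlib import Path


def wait_for_lines(path, expected, poll_seconds=1.0, timeout=None):
    """Poll `path` until it holds at least `expected` lines.

    Each finished job is assumed to append one line to the file, so
    the line count tells us how many jobs have completed.
    """
    start = time.monotonic()
    while True:
        f = Path(path)
        lines = f.read_text().splitlines() if f.exists() else []
        if len(lines) >= expected:
            return lines
        if timeout is not None and time.monotonic() - start > timeout:
            raise TimeoutError(f"only {len(lines)}/{expected} lines after {timeout}s")
        time.sleep(poll_seconds)
```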

  • Pass a `timeout` to [`communicate`](https://docs.python.org/3/library/subprocess.html#subprocess.Popen.communicate), catch the (mostly-)expected [`TimeoutExpired`](https://docs.python.org/3/library/subprocess.html#subprocess.TimeoutExpired), and then continue with the next one. – Lenormju Mar 06 '23 at 14:19
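If I understand that suggestion correctly, it would look roughly like the sketch below: give each `communicate()` a timeout, treat `TimeoutExpired` as "still running", and move on (`echo`/`sleep` stand in for the real sbatch commands; this assumes a POSIX shell):

```python
import subprocess

# Two stand-in commands: one finishes immediately, one takes too long.
procs = [subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE)
         for cmd in ("echo fast", "sleep 5 && echo slow")]

finished, pending = [], []
for p in procs:
    try:
        out, _ = p.communicate(timeout=1)   # wait at most 1 second
        finished.append(out.decode().strip())
    except subprocess.TimeoutExpired:
        # Not done yet: the child keeps running; re-check it on a
        # later pass (or kill it and communicate() again to clean up).
        pending.append(p)
```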

0 Answers