
I now have this script:

#!/bin/bash
#SBATCH -t 12:00:00
#SBATCH -N 1
#SBATCH --tasks-per-node 81
#SBATCH -p partition
#SBATCH -A user
#SBATCH -a 0-3

module load gcc/9.3.0
module load intel/2021.2
module load impi/2021.2
module load cp2k/9.1

source $CP2K_TOOLCHAIN/setup

STRING="supercell"

for dir in */;do
   cd $dir
   for input_file in *.inp;do
     if [[ "$input_file" == *"$STRING"* ]];then
        srun cp2k.popt ${input_file[$SLURM_ARRAY_TASK_ID]} > $(basename ${input_file} .inp).log
     fi
   done
   cd ..
done

and I then submit it using sbatch.

The result is that each of my CP2K simulations ends up running 3 times, and I don't know why. The geometry optimization completes but then starts another optimization, and this repeats until the wall time runs out. I don't understand what is happening.

I have tried nearly all the solutions for array jobs that I could find online, and the script above was the best of them. The only problem is that the calculations are being repeated over and over again. I have 4 folders and 4 array tasks in that script.

My end goal is to use an array job to run 84 different input files located in different directories.
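For illustration, this is a minimal sketch of the direction I imagine: map each array task to a single directory via SLURM_ARRAY_TASK_ID instead of looping over all directories in every task. This assumes the directory names glob in a consistent order; the dirs variable is just a name I made up for the sketch.

#!/bin/bash
#SBATCH -a 0-3
# (same -t/-N/--tasks-per-node/-p/-A lines and module loads as above)

STRING="supercell"

# Put the subdirectories into a bash array; each array task then
# works only in the one directory selected by its task ID.
dirs=( */ )
cd "${dirs[$SLURM_ARRAY_TASK_ID]}" || exit 1

# Run every matching input file in this one directory.
for input_file in *"$STRING"*.inp; do
   srun cp2k.popt "$input_file" > "$(basename "$input_file" .inp).log"
done

Is something along these lines the right way to extend this to the 84-file case?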
