I'm working on a Nextflow pipeline that uses a different Docker container in each process.
It looks something like this:
params.exeA = "$projectDir/bin/exeA"     // lives inside the Docker volume of containerA
params.exeB = "$projectDir/bin/1.0/exeB" // lives inside the Docker volume of containerB
params.inA = "$projectDir/TestData/A/*"
params.outA = "$projectDir/TestData/A_Out/"
params.outB = "$projectDir/TestData/B_Out/*"
process A {
    container 'link.to.container.registryA'

    input:
    path t_path
    path out_path_A
    // some more parameters

    output:
    path "${out_path_A}"

    script:
    """
    ${params.exeA} --input "${t_path}" --output "${out_path_A}" # some more parameters
    """
}
process B {
    container 'link.to.container.registryB'

    input:
    path resultOfA

    output:
    path ...

    script:
    """
    ${params.exeB} --someVal "${params.someVal}" --pathA "${resultOfA}"
    """
}
workflow {
    files_ch = Channel.fromPath(params.inA, type: 'dir')
    a_ch = A(files_ch, params.outA)
    b_ch = B(a_ch, params.outB)
}
When I try to run it, it throws this error:

Error executing process > 'A (1)'

Caused by:
  Process `A (1)` terminated with an error exit status (127)

Command executed:
  /home/projects/myProject/bin/exeA ...(some parameters)

Command exit status:
  127

Command output:
  (empty)

Command error:
  .command.sh: line 2: /home/projects/myProject/bin/exeA: No such file or directory
Confusingly, the output file I was expecting from exeA still gets created inside the task's work directory, but the workflow stops at that point and never continues with process B. I don't understand how this is possible: Nextflow claims it can't find the executable, yet it was apparently still able to call it?
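For reference, exit status 127 is the code bash returns when it cannot resolve the command itself (not when the command runs and fails), so the error seems to come from the shell inside the container not finding that hardcoded host path. A minimal sketch outside Nextflow, with a made-up path, reproduces the same status:

```shell
# Invoke a nonexistent binary the way .command.sh would:
bash -c '/definitely/not/a/real/path/exeA --input x' 2>&1
echo "exit status: $?"   # prints: exit status: 127
```

This is why I suspect the path resolution, not the tool itself, but the partially produced output confuses me.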
I have already tested both processes individually and got the same error.