
I have a parallel declaration like so:

parallel(first: {
    step(...)
}, second: {
    step(...)
})

Which works fine. Now, if I extend that with a function call:

def myFunc(num) {
    sh """\
mkdir -p ${num}
"""
}
node('myspecialslave') {
    parallel(first: {
        step(...)
        myFunc(1)
    }, second: {
        step(...)
        myFunc(2)
    })
}

I no longer see first and second being executed in parallel at all. So my question is - what criteria must be met for the closures to be executed in parallel?

abergmeier

1 Answer


You seem to be running all of this on the same node: the outer `node('myspecialslave')` block allocates a single executor, and both branches then run inside it.

Instead, you want to allocate an executor within each of the parallel branches:

parallel(first: {
    node {
      step(...)
      myFunc(1)
    }
}, second: {
    node {
      step(...)
      myFunc(2)
    }
})
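If both branches must still run on the same labeled agent (as raised in the comments), one possible variation is to pass the label from the question to each inner `node` step. Note this is a sketch, not something from the original answer, and it only overlaps if that agent has at least two executors; otherwise the second branch simply queues behind the first:

```groovy
// Sketch: keep both branches on the agent labeled 'myspecialslave'.
// Requires >= 2 executors on that agent, or the branches will
// serialize instead of running in parallel.
parallel(first: {
    node('myspecialslave') {
        step(...)
        myFunc(1)
    }
}, second: {
    node('myspecialslave') {
        step(...)
        myFunc(2)
    }
})
```

Each `node` context gets its own workspace, so the two `myFunc` invocations will not clash even on the same machine.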

See also this article.
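A quick way to verify that the branches actually overlap is a parallel sleep: if the build takes roughly 30 seconds rather than 60, they ran concurrently. A rough sketch (the timing check is illustrative, not part of the original answer):

```groovy
// If the branches run concurrently, total wall time is ~30s, not ~60s.
def t0 = System.currentTimeMillis()
parallel(first: {
    node { sleep(30) }   // the Pipeline 'sleep' step takes seconds by default
}, second: {
    node { sleep(30) }
})
echo "Elapsed: ${(System.currentTimeMillis() - t0) / 1000}s"
```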

StephenKing
  • But I do want both `myFunc` calls executed on the current node. From the article in question the first example does it similar. – abergmeier Feb 02 '17 at 10:19
  • Not sure then, if all things can run in parallel. Did you try a parallel `sleep(30)`? – StephenKing Feb 02 '17 at 11:48
  • Maybe you can increase the number of executors on the node so that multiple jobs can run in parallel – tarunkt Feb 03 '17 at 06:02
  • As github.com/jenkinsci/pipeline-plugin/blob/master/TUTORIAL.md states: "Many steps (such as: git and sh in this example) can only run in the context of a node". Even if you run on the same machine (which can be achieved with more executors plus labels) but in parallel, different workspaces will be assigned, so there won't be any clash between the executions. In fact, the first example does not necessarily run on the same node; it runs on lightweight executors that are spawned on the master agent (the Jenkins server). – Rafał S Apr 17 '17 at 22:03