4

I have three pipeline projects, project-a, project-b and project-c. project-c takes a parameter. On successful completion of either project-a or project-b I want to trigger a build of project-c with a parameter.

I can do this in project-a and project-b with this code in the pipeline:

stage('trigger-project-c') {
    def job = build job: 'project-c', parameters: [[$class: 'StringParameterValue', name: 'MY_PARAM', value: 'somevalue']]
}  

But this ties up two executors: the upstream job holds its executor while waiting, and project-c needs a second one. I want project-a or project-b to finish completely and release its executor before project-c runs with its parameter.

Mark Allison

2 Answers

5

Your pipeline most likely looks like this:

node {
  stage('build') {
    // sh "make"
  }

  // ...

  stage('trigger-project-c') {
      def job = build job: 'project-c', parameters: [[$class: 'StringParameterValue', name: 'MY_PARAM', value: 'somevalue']]
  }
}

By wrapping everything inside the node closure, the downstream job project-c is triggered inline, while the upstream job keeps holding its executor instead of releasing it.

Therefore, steps that do essentially nothing for a long time should not be wrapped within a node step, so that they do not block an executor. A very similar case is using the input step to wait for user feedback.
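
For instance, an approval gate via input can sit between two node blocks so no executor is held while waiting (the stage names and shell commands here are only illustrative):

stage('build') {
  node {
    // sh "make"
  }
}

stage('approve') {
  // no node here: waiting for user input does not hold an executor
  input message: 'Deploy to production?'
}

stage('deploy') {
  node {
    // sh "make deploy"
  }
}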

Instead, your pipeline should look like one of the following, which is considered best practice (you don't block an executor while waiting):

stage('build') {
  node {
    // sh "make"
  }
}

// or 

node {
  stage('build') {
    // sh "make"
  }

  stage('unit') {
    // sh "make"
  }
} // node

// note: the following code is _not_ wrapped inside a `node` step 
stage('trigger-project-c') {
  def job = build job: 'project-c', parameters: [[$class: 'StringParameterValue', name: 'MY_PARAM', value: 'somevalue']]
}

There is no need to wrap the build step within a node, i.e., to block an executor for it. For other steps (such as sh), the pipeline would fail with an error reminding you that they cannot run outside of a node allocation.
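
As a side note, when run with the default wait: true the build step returns a handle to the downstream run, so the upstream pipeline can inspect its build number or result afterwards (this sketch relies on the RunWrapper properties exposed to Pipeline scripts):

// not wrapped in a `node` step
stage('trigger-project-c') {
  // `build` returns a RunWrapper describing the downstream run
  def downstream = build job: 'project-c',
      parameters: [[$class: 'StringParameterValue', name: 'MY_PARAM', value: 'somevalue']]
  echo "project-c finished build #${downstream.number} with result ${downstream.result}"
}

In more recent Pipeline versions the parameter can also be written with the shorthand string(name: 'MY_PARAM', value: 'somevalue') instead of the $class map.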

StephenKing
2

Add the parameter wait: false to the build step to continue pipeline execution without waiting for the downstream job.
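
A minimal sketch, reusing the job and parameter names from the question:

stage('trigger-project-c') {
  // fire and forget: do not wait for project-c to finish
  build job: 'project-c',
      wait: false,
      parameters: [[$class: 'StringParameterValue', name: 'MY_PARAM', value: 'somevalue']]
}

Note that with wait: false the build step returns nothing, so the downstream result is not available to this pipeline.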

EDIT: this helps if you don't care about the result of the downstream job in this pipeline. If you need to wait until it finishes and then continue the upstream job's pipeline, see my other answer.

StephenKing
  • The downstream job continues execution with this, but does not release the executor to run the next project. So if I set Jenkins to only have one executor and run `project-a`, the job never completes because `project-c` requires its own executor. – Mark Allison Jan 07 '17 at 23:06
  • Then you need to change the order of `node` and `stage`. Don't run `build` within a `node` closure, but within a flyweight executor. – StephenKing Jan 08 '17 at 06:55
  • Thanks, this works fine. I'm a little worried that I'm breaking best practices by running a project on a flyweight executor. Do you know of another way to achieve this without the flyweight executor? I guess I'm really looking for something like a parameterized trigger plugin - the ones I've tried so far don't work. – Mark Allison Jan 08 '17 at 12:21
  • It's the other way around ;-) See my other answer. – StephenKing Jan 08 '17 at 13:39