I'm in the process of setting up Jenkins Pipeline jobs for our builds. We previously used Freestyle jobs to build each of our repos, with various extra manually triggered jobs at the packaging and deployment stages.
We have the following Pipeline setup so far:
- repo A/Jenkinsfile - Pipeline A - builds, runs unit tests, and produces an artifact
- repo B/Jenkinsfile - Pipeline B - builds, runs unit tests, and produces an artifact
Whenever Pipeline A or B completes, I want to run Pipeline C.
- Pipeline C should take the artifacts from the latest A and B builds, package them together as a Docker image, run integration tests, and deploy to the staging environment (roughly what I picture is sketched below).
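This is roughly what I imagine Pipeline C's Jenkinsfile looking like, wherever it ends up living (just a sketch; the job names pipeline-A/pipeline-B, the Copy Artifact plugin step, the image name, and the shell scripts are placeholders I've made up):

```groovy
// Hypothetical Jenkinsfile for Pipeline C
pipeline {
    agent any
    triggers {
        // run whenever either upstream pipeline completes successfully
        upstream(upstreamProjects: 'pipeline-A,pipeline-B',
                 threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Collect artifacts') {
            steps {
                // grab the latest successful artifacts from A and B
                copyArtifacts projectName: 'pipeline-A', selector: lastSuccessful()
                copyArtifacts projectName: 'pipeline-B', selector: lastSuccessful()
            }
        }
        stage('Package') {
            steps {
                // placeholder image name and Dockerfile
                sh 'docker build -t myorg/myapp:${BUILD_NUMBER} .'
            }
        }
        stage('Integration tests') {
            steps {
                sh './run-integration-tests.sh'
            }
        }
        stage('Deploy to staging') {
            steps {
                sh './deploy-staging.sh'
            }
        }
    }
}
```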
Where should I store the Jenkinsfile for Pipeline C, since it does not have its own source repo?
To me this logically sits as a separate pipeline that is downstream from A and B, so I assume it belongs in its own Jenkinsfile, but then where would that Jenkinsfile go in source control? To clarify, I would like to have all my pipeline files under source control.
Alternatively, should I try to define the whole pipeline (including integration, etc.) in just one of the parent Pipelines, skipping the build stages when it is triggered from the other pipeline, and leave the other as a simple build? For example:
Pipeline A (full pipeline)
- build
- unit tests
- artifact
- (get latest B artifact)
- integration test
- deploy to staging
Pipeline B (simple build that then triggers the full pipeline)
- build
- unit tests
- artifact
- trigger Pipeline A, skipping everything up to the integration test step (potentially using `when` blocks on those stages that look at the build trigger? see the sketch below)
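To illustrate that second option, this is the kind of thing I have in mind (again just a sketch; instead of inspecting the build cause it uses a made-up INTEGRATION_ONLY parameter that B would set when triggering A, and the job names and shell scripts are placeholders):

```groovy
// Hypothetical excerpt of Pipeline A's Jenkinsfile for the "full pipeline" option
pipeline {
    agent any
    parameters {
        booleanParam(name: 'INTEGRATION_ONLY', defaultValue: false,
                     description: 'Skip build/unit-test stages (set when triggered by Pipeline B)')
    }
    stages {
        stage('Build') {
            when { expression { !params.INTEGRATION_ONLY } }
            steps { sh './build.sh' }
        }
        stage('Unit tests') {
            when { expression { !params.INTEGRATION_ONLY } }
            steps { sh './unit-tests.sh' }
        }
        stage('Get latest B artifact') {
            steps { copyArtifacts projectName: 'pipeline-B', selector: lastSuccessful() }
        }
        stage('Integration test') {
            steps { sh './integration-tests.sh' }
        }
        stage('Deploy to staging') {
            steps { sh './deploy-staging.sh' }
        }
    }
}
```

and Pipeline B would then end with something like:

```groovy
stage('Trigger full pipeline') {
    steps {
        build job: 'pipeline-A',
              parameters: [booleanParam(name: 'INTEGRATION_ONLY', value: true)]
    }
}
```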
I can find resources on building or triggering one pipeline from another, e.g. https://metamorphant.de/blog/posts/2019-03-11-jenkins-101-downstream-projects/, but no information on how to structure the above kind of dependency between pipelines in Jenkins Pipeline.