I have a series of pipeline stages, with only one job per stage.

What is the best practice for having only one job per stage? Below I have my example yml setup:

trigger:
- main

resources:
- repo: self

stages:

# Test
##########################

- stage: Run_Tests
  displayName: Run Tests
  jobs:
  - job: Run_Tests
    displayName: Run Tests
    pool:
      vmImage: 'ubuntu-18.04'
    steps:
    # Testing Steps ...
  
# Build
##########################
- stage: Build
  displayName: Build
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: 'ubuntu-18.04'
    steps:
    # Build Steps ...

# Deploy
##########################
- stage: Deploy
  displayName: Deploy
  jobs:
  - deployment: VMDeploy
    displayName: Deploy
    # Deploy Steps ...

I have the below multiple times throughout the file.

jobs:
- job:

It seems so unnecessary and cluttered to me.

Am I just being pedantic, or is there a better way to do this?

1 Answer

This can make sense when you use a deployment job, because environment restrictions applied at the job level are evaluated at the stage level. So if you combine the Build, Test and Deploy stages into one, and you have an approval configured on the environment used by the Deploy job, you will be asked for approval before the first job starts.
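
For illustration, here is a minimal sketch of a Deploy stage kept on its own, assuming a hypothetical environment named 'production' with an approval check configured on it (the approval is then evaluated before this stage starts, not before the Build or Test stages):

- stage: Deploy
  displayName: Deploy
  jobs:
  - deployment: VMDeploy
    displayName: Deploy
    pool:
      vmImage: 'ubuntu-18.04'
    # 'production' is a hypothetical environment name; the approval check
    # configured on it gates this whole stage.
    environment: 'production'
    strategy:
      runOnce:
        deploy:
          steps:
          # Deploy Steps ...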

For me, Build and Test could go together, and in fact they should be part of the same job. Why? Because is a change really valid when the build succeeds but the tests fail? You can also take advantage of the fact that you have already downloaded the dependencies for your whole project. Having a separate job for tests makes sense to me when we are talking about integration tests. A sketch of the combined stage follows below.
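
As a rough sketch, assuming your test steps can simply run after the build steps on the same agent, the first two stages could collapse into one stage with one job:

- stage: Build_and_Test
  displayName: Build and Test
  jobs:
  - job: Build_and_Test
    displayName: Build and Test
    pool:
      vmImage: 'ubuntu-18.04'
    steps:
    # Build Steps ...
    # Testing Steps ... (reusing the dependencies already downloaded for the build)

The Deploy stage stays separate so that the environment approval still gates only the deployment.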

Krzysztof Madej