4

I have a pipeline in Azure DevOps that I'm currently trying to speed up. At the moment it is one job that runs a bunch of steps/tasks. The basic outline is: 1: build, 2: tests run, 3: symbol files published to the symbol server, 4: NuGet packages packed and pushed, 5: artifacts published.

Steps 3, 4, and 5 depend only on a successful build and could technically run concurrently. The issue is that when I try to break them up into separate jobs (that depend on completion of the build), it doesn't work, because the directories created during the build aren't available when the other jobs run.

Is there a way to make the working directories persist until all jobs in the pipeline are finished?

cybersnow1989

2 Answers

2

Is there a way to make the working directories persist until all jobs in the pipeline are finished?

No if you're using a Microsoft-hosted agent, and yes if you're using a self-hosted agent.

For hosted agents: the documentation states that each time you run a pipeline, you get a fresh virtual machine. In practice, each time you run a job you get a fresh virtual machine. That's why the second job can't access the working directory of the first job: they run on different VMs.

So if you want both jobs to share a working directory, you can only use self-hosted agents. You can check this issue for more details. Since the two jobs must run on the same self-hosted agent, I suggest using demands to specify the agent you want.
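For example, a `pool` section with a `demands` entry can pin jobs to one specific agent (the pool and agent names below are illustrative, not from the question):

```yaml
pool:
  name: MyPrivatePool                    # self-hosted agent pool
  demands:
  - Agent.Name -equals MyBuildAgent      # pin the job to this one agent
```

Putting the same `pool` block on every job keeps them on the same machine, so the working directory from the first job is still there for the later ones (subject to workspace clean settings, as noted in the comments).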

Note: we can't keep the working directories when using hosted agents, but we can use the Publish Build Artifacts and Download Build Artifacts tasks to share files between jobs running on different agents. Sample:

jobs:
- job: MyJob1
  continueOnError: true
  steps:
  # Publish the job's working directory as a build artifact named 'drop'
  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(System.DefaultWorkingDirectory)'
      ArtifactName: 'drop'
      publishLocation: 'Container'
- job: MyJob2
  continueOnError: true
  dependsOn: MyJob1        # wait for MyJob1 so the artifact exists
  steps:
  - download: none         # skip any automatic artifact download
  # Pull the 'drop' artifact back into this job's working directory
  - task: DownloadBuildArtifacts@0
    inputs:
      buildType: 'current'      # artifacts from the current run
      downloadType: 'single'    # download a single named artifact
      artifactName: 'drop'
      downloadPath: '$(System.DefaultWorkingDirectory)'

I have a pipeline in Azure Devops and I'm currently trying to speed it up.

It's not recommended to break them up into separate jobs; I don't think it will actually speed the process up. Your steps 3, 4, and 5 depend on completion of steps 1 and 2, so if you put steps 1 and 2 in Job1 and steps 3, 4, and 5 in Job2, parallel jobs won't help, because Job2 has to wait until Job1 finishes.
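For reference, the fan-out shape the question describes would look roughly like the sketch below (job names are illustrative). Each downstream job depends on `Build` and re-downloads the artifact, so whether this is actually faster depends on whether the publish/download overhead outweighs the concurrency gain:

```yaml
jobs:
- job: Build                # steps 1 and 2: build and test
  steps:
  - script: echo "build and run tests here"
  # Share the build output with the downstream jobs
  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(System.DefaultWorkingDirectory)'
      ArtifactName: 'drop'
      publishLocation: 'Container'

# Steps 3, 4, and 5 each become a job that fans out from Build
# and can run in parallel (given enough agents/parallel jobs).
- job: PublishSymbols
  dependsOn: Build
  steps:
  - task: DownloadBuildArtifacts@0
    inputs:
      buildType: 'current'
      downloadType: 'single'
      artifactName: 'drop'
      downloadPath: '$(System.DefaultWorkingDirectory)'
  - script: echo "publish symbols here"

- job: PushNuGet
  dependsOn: Build
  steps:
  - script: echo "download 'drop', then nuget pack and push here"

- job: PublishArtifacts
  dependsOn: Build
  steps:
  - script: echo "download 'drop', then publish artifacts here"
```

Note that each extra job pays VM spin-up plus artifact download time on hosted agents, which is the overhead this answer is warning about.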

LoLance
  • I think even with self-hosted agents it won't work as you described. If the whole point is to re-use the created outputs, a workspace clean: outputs setting will wipe that created directory away. You might be on the same server in the same _work/1/ folder, but anything you wanted to keep will have been removed. – Matt Sep 11 '20 at 13:45
  • That's true; this is very close to what I needed, but it looks like I may have to wait until Azure allows parallel tasks to be run, which is an issue already raised (https://developercommunity.visualstudio.com/idea/365659/ability-to-run-tasks-in-parallel.html) – cybersnow1989 Sep 14 '20 at 16:27
  • @cybersnow1989 It looks like we'll have to wait some time before this feature request comes true. – LoLance Sep 16 '20 at 10:04
  • @cybersnow1989 Since the feature above is not available for now, you can try using a self-hosted agent to run the pipeline, so that your outputs are on the same machine. – LoLance Sep 17 '20 at 02:02
0

I don't think the parallel jobs would necessarily be guaranteed to run on the same agent that processed steps 1 and 2. They might not even be on the same server, so they won't be able to reference what was done in those steps. Usually, if you want to do something in parallel, you publish the pipeline or build artifacts and then have a step in the subsequent dependent job pull them down. Since the tasks you want to parallelize are themselves the publishing steps, I'm not sure you're really going to be able to run this in parallel.

Matt