Is there a way to make the working directories persist until all jobs
in the pipeline are finished?
No if you're using a Microsoft-hosted agent, and yes if you're using a self-hosted agent.
For hosted agents: the documentation states that each time you run a pipeline, you get a fresh virtual machine. In fact, you get a fresh virtual machine each time you run a job. That's why the second job can't access the first job's working directory: the two jobs run on different VMs.
So if you want a working directory shared by both jobs, you can only use self-hosted agents. You can check this issue for more details. Since the two jobs need to run on the same self-hosted agent, I suggest using demands to specify which agent to pick.
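As a minimal sketch, pinning both jobs to one specific self-hosted agent could look like the snippet below. The pool name "Default" and the agent name "MyAgent" are placeholders; substitute your own pool and agent names.

```yaml
jobs:
- job: MyJob1
  pool:
    name: Default                       # your self-hosted pool
    demands:
    - Agent.Name -equals MyAgent        # hypothetical agent name
  steps:
  - script: echo hello > sharedfile.txt # written into the working directory
- job: MyJob2
  dependsOn: MyJob1
  pool:
    name: Default
    demands:
    - Agent.Name -equals MyAgent        # same demand, so same agent
  steps:
  - script: cat sharedfile.txt          # working directory persists on the agent
```

Because both jobs demand the same agent, the second job lands on the machine where the first job left its files.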
Note: we can't keep the working directories when using hosted agents, but we can use the Publish Build Artifacts and Download Build Artifacts tasks to share files between jobs running on different agents. Sample:
jobs:
- job: MyJob1
  continueOnError: true
  steps:
  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(System.DefaultWorkingDirectory)'
      ArtifactName: 'drop'
      publishLocation: 'Container'
- job: MyJob2
  continueOnError: true
  dependsOn: MyJob1
  steps:
  - download: none
  - task: DownloadBuildArtifacts@0
    inputs:
      buildType: 'current'
      downloadType: 'single'
      artifactName: 'drop'
      downloadPath: '$(System.DefaultWorkingDirectory)'
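One detail worth verifying in your own pipeline: with downloadType 'single', the Download Build Artifacts task places the files in a subfolder named after the artifact, so MyJob2 would read them from a 'drop' subfolder of the working directory. A hypothetical follow-up step in MyJob2 could check this:

```yaml
  # Assumption: the artifact lands under <downloadPath>/<artifactName>
  - script: ls '$(System.DefaultWorkingDirectory)/drop'
    displayName: 'List downloaded files'
```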
I have a pipeline in Azure DevOps and I'm currently trying to speed it up.
It's not recommended to break them up into separate jobs; I don't think it will actually speed the process up. Your steps 3, 4, and 5 depend on the completion of steps 1 and 2, so if you put steps 1 and 2 in Job1 and steps 3, 4, and 5 in Job2, running the jobs in parallel still won't help, because Job2 has to wait until Job1 finishes.
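Parallel jobs only pay off when the jobs are genuinely independent of each other. As a sketch with hypothetical step names: jobs with no dependsOn between them are scheduled concurrently, while a dependsOn forces them back into sequence, which is exactly your situation.

```yaml
jobs:
- job: Lint        # independent work: no dependsOn, runs in parallel with Tests
  steps:
  - script: echo "run lint"
- job: Tests       # also independent, scheduled concurrently with Lint
  steps:
  - script: echo "run tests"
- job: Package
  dependsOn:       # depends on both, so it waits, just like your steps 3-5
  - Lint
  - Tests
  steps:
  - script: echo "package the build"
```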