
Original question (see the update in the next section)

I would like to download files that are produced by multiple jobs into one folder on Azure Pipelines. Here is a sketch of what I'd like to accomplish:

jobs:
  - job: job1
    pool: {vmImage: 'Ubuntu-16.04'}
    steps:
      - bash: |
          printf "Hello from job1\n" > $(Pipeline.Workspace)/file.1
      - task: PublishPipelineArtifact@1
        inputs:
          targetPath: $(Pipeline.Workspace)/file.1

  - job: job2
    pool: {vmImage: 'Ubuntu-16.04'}
    steps:
      - bash: |
          printf "Hello from job2\n" > $(Pipeline.Workspace)/file.2
      - task: PublishPipelineArtifact@1
        inputs:
          targetPath: $(Pipeline.Workspace)/file.2

  - job: check_prev_jobs
    dependsOn: "all other jobs"
    pool: {vmImage: 'Ubuntu-16.04'}
    steps:
      - bash: |
          mkdir -p $(Pipeline.Workspace)/previous_artifacts
      - task: DownloadPipelineArtifact@2
        inputs:
          source: current
          path: $(Pipeline.Workspace)/previous_artifacts

where the directory $(Pipeline.Workspace)/previous_artifacts contains only file.1 and file.2, rather than subdirectories job1 and job2 holding file.1 and file.2 respectively.
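In other words, the layout I'm after:

previous_artifacts/
├── file.1
└── file.2

and not:

previous_artifacts/
├── job1/
│   └── file.1
└── job2/
    └── file.2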

Thanks!

Update

Using @Yujun Ding-MSFT's answer, I created the following azure-pipelines.yml file:

stages:
- stage: generate
  jobs:
    - job: Job_1
      displayName: job1
      pool:
        vmImage: ubuntu-20.04
      variables:
        JOB_NAME: $(Agent.JobName)
        DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time form job1\n" > $JOB_NAME.time
          printf "Hash form job1\n" > $JOB_NAME.hash
          printf "Raw form job1\n" > $JOB_NAME.raw
          printf "Nonesense form job1\n" > $JOB_NAME.nonesense
        displayName: Generate files
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact
        inputs:
          path: $(DIR)
          artifactName: job1
      
    - job: Job_2
      displayName: job2
      pool:
        vmImage: ubuntu-20.04
      variables:
        JOB_NAME: $(Agent.JobName)
        DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time form job2\n" > $JOB_NAME.time
          printf "Hash form job2\n" > $JOB_NAME.hash
          printf "Raw form job2\n" > $JOB_NAME.raw
          printf "Nonesense form job2\n" > $JOB_NAME.nonesense
        displayName: Generate files
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact copy
        inputs:
          path: $(DIR)
          artifactName: job2

- stage: analyze
  jobs:
    - job: download_display
      displayName: Download and display
      pool:
        vmImage: ubuntu-20.04
      variables:
        DIR: $(Pipeline.Workspace)/artifacts
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
      - task: DownloadPipelineArtifact@2
        displayName: Download Pipeline Artifact
        inputs:
          path: $(DIR)
          patterns: '**/*.time'
      - bash: |
          ls -lR $DIR
          cd $DIR
        displayName: Check dir content

However, as shown in the screenshot below, I still get each .time file in a separate job-specific directory:

[screenshot: each .time file sits under its own job1/ or job2/ subdirectory]

Unfortunately, it seems that what I would like may not be possible with Pipeline.Artifacts, as explained in this Microsoft discussion. This would be a bummer, given that Build.Artifacts are deprecated at this point.
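If flattening after the download is acceptable, a bash step appended to the analyze job would do it. A minimal sketch, assuming GNU findutils/coreutils on the Ubuntu agent and unique file names across artifacts:

      - bash: |
          # Move every downloaded file up into $DIR, then delete the
          # now-empty per-artifact directories left by the download task.
          find "$DIR" -mindepth 2 -type f -exec mv -t "$DIR" {} +
          find "$DIR" -mindepth 1 -type d -empty -delete
        displayName: Flatten downloaded artifacts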


1 Answer


In your current situation, we recommend adding the artifactName keyword to your PublishPipelineArtifact task. I modified your script and tested it on my side. Hope this helps:

trigger: none

# pool:
#   vmImage: ubuntu-latest

jobs:
- job: Job_1
  displayName: job 1
  pool:
    vmImage: ubuntu-20.04
  steps:
  - checkout: self
    persistCredentials: True
  - task: Bash@3
    displayName: Bash Script
    inputs:
      targetType: inline
      script: 'printf "Hello form job1\n" > $(Pipeline.Workspace)/file.1 '
  - task: PublishPipelineArtifact@1
    displayName: Publish Pipeline Artifact
    inputs:
      path: $(Pipeline.Workspace)/file.1
      artifactName: job1
  
- job: Job_2
  displayName: job2
  pool:
    vmImage: ubuntu-20.04
  steps:
  - checkout: self
  - task: Bash@3
    displayName: Bash Script copy
    inputs:
      targetType: inline
      script: 'printf "Hello form job1\n" > $(Pipeline.Workspace)/file.2 '
  - task: PublishPipelineArtifact@1
    displayName: Publish Pipeline Artifact copy
    inputs:
      path: $(Pipeline.Workspace)/file.2
      artifactName: job2
      
- job: Job_3
  displayName: Agent job
  dependsOn:
  - Job_1
  - Job_2
  pool:
    vmImage: ubuntu-20.04
  steps:
  - checkout: self
  - task: Bash@3
    displayName: Bash Script
    inputs:
      targetType: inline
      script: 'mkdir -p $(Pipeline.Workspace)/previous_artifacts'
  - task: DownloadPipelineArtifact@2
    displayName: Download Pipeline Artifact
    inputs:
      path: '$(Pipeline.Workspace)/previous_artifacts'

My test result: [screenshots of the two published artifacts and the resulting download layout]
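Note that because the download task does not name a specific artifact, DownloadPipelineArtifact@2 fetches every artifact of the current run into its own subdirectory, so previous_artifacts ends up as:

previous_artifacts/
├── job1/
│   └── file.1
└── job2/
    └── file.2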

Update: Because the jobs run in different sessions, we cannot simply copy files between them or rely on publishing alone to merge both jobs' artifacts. I modified your YAML file, and this might help you:

stages:
- stage: generate
  jobs:
    - job: Job_1
      displayName: job1
      pool:
        vmImage: ubuntu-20.04
      variables:
        JOB_NAME: $(Agent.JobName)
        DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time form job1\n" > $JOB_NAME.time
          printf "Hash form job1\n" > $JOB_NAME.hash
          printf "Raw form job1\n" > $JOB_NAME.raw
          printf "Nonesense form job1\n" > $JOB_NAME.nonesense
        displayName: Generate files
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact
        inputs:
          path: $(DIR)
          artifactName: job1

    - job: Job_2
      displayName: job2
      dependsOn: 
      - Job_1
      pool:
        vmImage: ubuntu-20.04
      variables:
        JOB_NAME: $(Agent.JobName)
        DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time form job2\n" > $JOB_NAME.time
          printf "Hash form job2\n" > $JOB_NAME.hash
          printf "Raw form job2\n" > $JOB_NAME.raw
          printf "Nonesense form job2\n" > $JOB_NAME.nonesense
        displayName: Generate files
      - task: DownloadPipelineArtifact@2
        displayName: Download Pipeline Artifact
        inputs:
          buildType: 'current'
          artifactName: 'job1'
          targetPath: '$(DIR)'
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact copy
        inputs:
          path: $(DIR)
          artifactName: job2
 
- stage: analyze
  jobs:
    - job: download_display
      displayName: Download and display
      pool:
        vmImage: ubuntu-20.04
      variables:
        DIR: $(Pipeline.Workspace)/artifacts
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
      - task: DownloadPipelineArtifact@2
        displayName: Download Pipeline Artifact
        inputs:
          buildType: 'current'
          artifactName: 'job2'
          itemPattern: '**/*.time'
          targetPath: '$(DIR)'
          
      - bash: |
          ls -lR $DIR
          cd $DIR
          cd $(System.ArtifactsDirectory)
        displayName: Check dir content
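
The key point is that Job_2 first downloads the job1 artifact into its own $(DIR) and then republishes the merged directory as artifact job2 (at the cost of Job_2 now depending on Job_1). And because the analyze stage names a specific artifact (artifactName: 'job2'), DownloadPipelineArtifact@2 extracts its contents directly into targetPath with no per-artifact subdirectory, so after the itemPattern filter the directory should look like:

artifacts/
├── job1.time
└── job2.time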

The build result: [screenshot]
