
I have a multi-stage Azure build pipeline for my application. In the first stage, I build the source code and execute test.sh; this shell script creates a new shell script file called data.sh. In my second stage, I execute run.sh, which runs this newly created data.sh, but I get a "No such file or directory" error and my pipeline fails in the second stage.

The newly created file data.sh is not carried over from the previous stage. How can I get data.sh into my second stage? Please find my code below:

I would much appreciate it if someone could help me resolve this issue. I need the data.sh file created in stage 1 to be available in stage 2.

azure-pipeline.yml

trigger:
  - none

pool:
  vmImage: 'ubuntu-latest'

stages:
  - stage: TestFilecopy
    displayName: Execute test script
    jobs:
      - job: myJob1
        displayName: Execute myJob1
        steps:
        - task: Bash@3
          inputs:
            targetType: filePath
            filePath: 'myApp/test.sh'
            workingDirectory: myApp

  - stage: TestFilecopy1
    displayName: Execute new script
    jobs:
      - job: myJob2
        displayName: Execute myJob2
        steps:
        - task: Bash@3
          inputs:
            targetType: filePath
            filePath: 'myApp/run.sh'
            workingDirectory: myApp

test.sh

#!/bin/bash
echo "hello word"
echo $PWD
ls -la

echo "#!/bin/bash
      echo 'Hello World'" > data.sh

ls -la

run.sh

echo "hello run file"
echo $PWD
ls -la
./data.sh

Folder Structure

product-pipeline-project/
├─ myApp/
│  ├─ run.sh
│  ├─ test.sh
├─ azure-pipeline.yml

Pipeline console output: Stage 1 log

Generating script.
Formatted command: exec bash '/home/vsts/work/1/s/myApp/test.sh'
========================== Starting Command Output ===========================
/usr/bin/bash --noprofile --norc /home/vsts/work/_temp/ef037ffc-0ea8-4366-b074-10120c9b1434.sh
hello word
/home/vsts/work/1/s/myApp
total 16
drwxr-xr-x 2 vsts docker 4096 Jul 11 18:25 .
drwxr-xr-x 4 vsts docker 4096 Jul 11 18:25 ..
-rw-r--r-- 1 vsts docker   48 Jul 11 18:25 run.sh
-rw-r--r-- 1 vsts docker  109 Jul 11 18:25 test.sh
total 20
drwxr-xr-x 2 vsts docker 4096 Jul 11 18:25 .
drwxr-xr-x 4 vsts docker 4096 Jul 11 18:25 ..
-rw-r--r-- 1 vsts docker   37 Jul 11 18:25 data.sh
-rw-r--r-- 1 vsts docker   48 Jul 11 18:25 run.sh
-rw-r--r-- 1 vsts docker  109 Jul 11 18:25 test.sh

Stage 2 log

Generating script.
Formatted command: exec bash '/home/vsts/work/1/s/myApp/run.sh'
========================== Starting Command Output ===========================
/usr/bin/bash --noprofile --norc /home/vsts/work/_temp/087e12fd-0b7b-4bd5-a4a8-7b7879255f3f.sh
hello run file
/home/vsts/work/1/s/myApp
total 16
drwxr-xr-x 2 vsts docker 4096 Jul 11 18:25 .
drwxr-xr-x 4 vsts docker 4096 Jul 11 18:25 ..
-rw-r--r-- 1 vsts docker   48 Jul 11 18:25 run.sh
-rw-r--r-- 1 vsts docker  109 Jul 11 18:25 test.sh
/home/vsts/work/1/s/myApp/run.sh: line 4: ./data.sh: No such file or directory
##[error]Bash exited with code '127'.
Finishing: Bash
Datadog Learning

1 Answer


Each job runs on a separate agent machine, so a file generated in one job is not accessible in another job. To make it available across jobs (and stages), you need to publish it as a pipeline artifact in the first stage and then download it in the second. The pipeline artifacts documentation covers this.

In your case it would be:

steps:
- publish: $(System.DefaultWorkingDirectory)/product-pipeline-project/myApp/data.sh
  artifact: WebApp

and then in the other stage:

- task: DownloadPipelineArtifact@2
  inputs:
    artifact: 'WebApp'
    path: $(System.DefaultWorkingDirectory)/product-pipeline-project/myApp
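Putting it together, here is a minimal sketch of the question's pipeline with the publish and download steps wired in (stage, job, and file names taken from the question; the artifact name `WebApp` is arbitrary). Note that the stage 1 log shows the repo checked out at `/home/vsts/work/1/s/myApp`, i.e. directly under `$(System.DefaultWorkingDirectory)`, so the path on the agent is `myApp/data.sh` rather than `product-pipeline-project/myApp/data.sh`. Also, the `ls -la` output shows data.sh is created as `-rw-r--r--` (not executable), so run.sh would still need a `chmod +x data.sh` first, or to invoke it as `bash data.sh`:

```yaml
stages:
  - stage: TestFilecopy
    displayName: Execute test script
    jobs:
      - job: myJob1
        steps:
        - task: Bash@3
          inputs:
            targetType: filePath
            filePath: 'myApp/test.sh'
            workingDirectory: myApp
        # Publish the generated file so later stages can download it
        - publish: $(System.DefaultWorkingDirectory)/myApp/data.sh
          artifact: WebApp

  - stage: TestFilecopy1
    displayName: Execute new script
    jobs:
      - job: myJob2
        steps:
        # Restore data.sh into the same relative location before run.sh runs
        - task: DownloadPipelineArtifact@2
          inputs:
            artifact: 'WebApp'
            path: $(System.DefaultWorkingDirectory)/myApp
        - task: Bash@3
          inputs:
            targetType: filePath
            filePath: 'myApp/run.sh'
            workingDirectory: myApp
```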

But consider whether you really need a separate stage for that.

Krzysztof Madej
  • Thank you for the explanation @Krzysztof Madej. I have some sensitive PII data in my 'data.sh' file, so publishing an artifact is not a good option for my use case, I think. Please suggest an alternative if you can think of one. Thanks once again! – Datadog Learning Jul 11 '21 at 22:48
  • Well, if you don't want to persist this file, please consider running everything in one single job. Why do you need separate stages and jobs? – Krzysztof Madej Jul 11 '21 at 23:07
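
Following the suggestion in the comments, here is a minimal sketch of the single-job alternative (reusing the question's names): both scripts run as consecutive steps on the same agent, so data.sh stays on that machine's disk between steps and is never persisted as an artifact.

```yaml
stages:
  - stage: TestFilecopy
    displayName: Execute both scripts
    jobs:
      - job: myJob1
        steps:
        # Step 1 creates data.sh in the agent's workspace
        - task: Bash@3
          inputs:
            targetType: filePath
            filePath: 'myApp/test.sh'
            workingDirectory: myApp
        # Step 2 runs on the same agent, so data.sh is still present
        - task: Bash@3
          inputs:
            targetType: filePath
            filePath: 'myApp/run.sh'
            workingDirectory: myApp
```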