Problem:

`$(Build.ArtifactStagingDirectory)` is empty, even though the published Artifacts are visible in the Azure DevOps Pipelines UI, and this causes the pipeline to fail.
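A quick way to see what the agent actually has in that directory is a listing step dropped in right before the Docker task. This is only a diagnostic sketch, not part of my actual pipeline:

```yaml
# Diagnostic only: list whatever the agent has in the artifact staging directory
- script: ls -laR "$(Build.ArtifactStagingDirectory)"
  displayName: Inspect Build.ArtifactStagingDirectory
```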
What I'm trying to do:

- Build a microservice (e.g., `/api`).
- Run unit tests.
- If the unit tests pass, publish the build as an Artifact.
- Dockerize the build Artifact using `buildContext`.

This is based on advice here, here, and here.
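At a high level the pipeline is laid out roughly like the sketch below (stage, job, and step names are placeholders for illustration, not my exact templates):

```yaml
# Rough shape of the pipeline; names are illustrative placeholders.
stages:
  - stage: BuildAndTest
    jobs:
      - job: UnitTests
        steps:
          - script: echo "build the /api service and run its unit tests"
          - publish: $(System.DefaultWorkingDirectory)/api   # publish the tested build
            artifact: api
            condition: succeeded()

  - stage: Docker
    dependsOn: BuildAndTest
    jobs:
      - job: BuildImage
        steps:
          - script: echo "docker buildAndPush using the published artifact"
```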
Publish Stage Config

My config for publishing after the unit tests have passed is the following:

```yaml
- publish: $(System.DefaultWorkingDirectory)/${{ parameters.pathName }}
  artifact: ${{ parameters.pathName }}
  condition: succeeded()
```

- `$(System.DefaultWorkingDirectory)` should be `/home/vsts/work/1/s/`, from what I gather.
- `${{ parameters.pathName }}` is just `api`.
- I can see the correct artifacts are generated in the Azure DevOps Pipelines UI.
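For context, this publish step lives in a template that is invoked per service, along these lines (the template file name here is an illustrative placeholder):

```yaml
# Illustrative template invocation; the file name is a placeholder.
- template: templates/publish-service.yml
  parameters:
    pathName: api
```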
Docker `buildAndPush` Stage Config

My config for grabbing the artifact and using it in a Docker `buildAndPush` step is the following:

```yaml
- task: Docker@2
  condition: contains(variables['servicesChanged'], '${{ parameters.serviceName }}')
  displayName: Build and Push ${{ parameters.pathName }} Docker image
  inputs:
    command: buildAndPush
    repository: $(imageRepository)-${{ parameters.pathName }}
    dockerfile: $(dockerfilePath)/${{ parameters.pathName }}/Dockerfile
    buildContext: $(Build.ArtifactStagingDirectory)
    containerRegistry: $(dockerRegistryServiceConnection)
    tags: |
      ${{ parameters.tag }}-${{ parameters.tagVersion }}
```

- From what I gather, `$(Build.ArtifactStagingDirectory)` should be `/home/vsts/work/1/a/`.
- However, it is empty and this stage fails.
- `$(dockerfilePath)` is equal to `$(Build.SourcesDirectory)`.
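For reference, the variables referenced above are declared roughly like this (`dockerfilePath` is the only value I have stated exactly; the others are illustrative placeholders):

```yaml
# Variable declarations; only dockerfilePath is exact, the rest are placeholders.
variables:
  dockerfilePath: $(Build.SourcesDirectory)
  imageRepository: myproject                          # placeholder
  dockerRegistryServiceConnection: my-registry-conn   # placeholder
```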
Dockerfile Config

Informational, but this is what the Dockerfile contains:

```dockerfile
FROM python:3.8-slim
WORKDIR /app
EXPOSE 5000
COPY . .
RUN pip install -r requirements.txt
CMD ["gunicorn", "-b", ":5000", "--log-level", "info", "config.wsgi:application", "-t", "150"]
```
Project Structure

```
/project-root
  /admin
    package.json
    Dockerfile
  /api
    requirements.txt
    Dockerfile
  /client
    package.json
    Dockerfile
```
What I've Tried

Attempt 1:

```yaml
dockerfile: $(dockerfilePath)/${{ parameters.pathName }}/Dockerfile
buildContext: $(Build.ArtifactStagingDirectory)
```

fails with:

```
Step 5/17 : RUN pip install -r requirements.txt
 ---> Running in 277ce44b61cf
ERROR: Could not open requirements file: [Errno 2] No such file or directory: 'requirements.txt'
The command '/bin/sh -c pip install -r requirements.txt' returned a non-zero code: 1
##[error]The command '/bin/sh -c pip install -r requirements.txt' returned a non-zero code: 1
##[error]The process '/usr/bin/docker' failed with exit code 1
```

Attempt 2:

```yaml
dockerfile: $(dockerfilePath)/${{ parameters.pathName }}/Dockerfile
buildContext: $(Build.ArtifactStagingDirectory)/${{ parameters.pathName }}
```

fails with:

```
unable to prepare context: path "/home/vsts/work/1/a/api" not found
##[error]unable to prepare context: path "/home/vsts/work/1/a/api" not found
##[error]The process '/usr/bin/docker' failed with exit code 1
```

Attempt 3:

```yaml
dockerfile: $(Build.ArtifactStagingDirectory)/${{ parameters.pathName }}/Dockerfile
buildContext: $(Build.ArtifactStagingDirectory)
```

fails with:

```
##[error]Unhandled: No Dockerfile matching /home/vsts/work/1/a/api/Dockerfile was found.
```

Attempt 4:

```yaml
dockerfile: $(Build.ArtifactStagingDirectory)/Dockerfile
buildContext: $(Build.ArtifactStagingDirectory)
```

fails with:

```
##[error]Unhandled: No Dockerfile matching /home/vsts/work/1/a/Dockerfile was found.
```
What Has Worked

```yaml
dockerfile: $(System.DefaultWorkingDirectory)/${{ parameters.pathName }}/Dockerfile
buildContext: $(System.DefaultWorkingDirectory)/${{ parameters.pathName }}
```

But doing this seems to negate the need to `publish` as an Artifact. Maybe this is the "correct" way, I don't know. It seems to accomplish what I want by `COPY`ing what was built for unit testing into the Docker image instead of using a different version. Still, I'm pretty sure this isn't what I'm after, since it looks like it is just checking out the repo again into `$(System.DefaultWorkingDirectory)` (`/home/vsts/work/1/s/`) at the beginning of this stage.
Question(s)

- Why is `$(Build.ArtifactStagingDirectory)` empty? Is it a deprecated environment variable?
- Is what I have in "What Has Worked" the correct way to handle this? (I don't think it is.)
- How should I persist the tested build between the unit testing stage and the Docker stage, so that the Docker stage uses the exact build that was unit tested? (A sketch of my current guess follows below.)
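For example, I'm wondering whether I'm simply missing an explicit `download` step in the Docker stage, something along these lines (just a sketch of my suspicion, reusing my existing parameters; I haven't confirmed this is the right approach):

```yaml
# Sketch of what I suspect might be missing (unconfirmed): explicitly download
# the published artifact in the Docker stage, then point buildContext at it.
- download: current
  artifact: ${{ parameters.pathName }}
- task: Docker@2
  inputs:
    command: buildAndPush
    repository: $(imageRepository)-${{ parameters.pathName }}
    dockerfile: $(Pipeline.Workspace)/${{ parameters.pathName }}/Dockerfile
    buildContext: $(Pipeline.Workspace)/${{ parameters.pathName }}
    containerRegistry: $(dockerRegistryServiceConnection)
```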