
So....

I was wondering if anyone is able to assist me on this one?

Basically, I have created a self-hosted Docker container to use as a build agent (Azure DevOps).

Now that I have started to test the agent, I am having issues publishing artefacts to the destination, because our drop folder is on a Windows file share (domain joined).

Is anyone able to shed some light on how I can access this file share to drop artefacts from the build, and also download them again during the release stage?

We use on-premises Azure DevOps Server (TFS) rather than Azure DevOps Services :(

TIA,

Update

In the end, I was able to resolve this issue.

To resolve it, I created a gMSA account on the domain controller, then created a credential spec (credspec) file on the machine hosting the Docker container, and finally ran the container using the following command:

    docker run --security-opt "credentialspec=file://<credspecfilename>.json" --hostname <hostname> -it <Image-name> PowerShell
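
For anyone else setting this up, here is a rough sketch of the gMSA and credential spec steps. The names (BuildGmsa, contoso.local, DockerHost) are placeholders, not from my actual environment:

    # On the domain controller (the KDS root key must already exist, e.g. via Add-KdsRootKey):
    New-ADServiceAccount -Name BuildGmsa `
        -DNSHostName BuildGmsa.contoso.local `
        -PrincipalsAllowedToRetrieveManagedPassword "DockerHost$"

    # On the Docker host: install the gMSA, then generate the credential spec JSON
    # that the --security-opt flag in the docker run command above points at.
    Install-ADServiceAccount -Identity BuildGmsa
    Install-Module CredentialSpec
    New-CredentialSpec -AccountName BuildGmsa   # written under C:\ProgramData\Docker\CredentialSpecs by default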

Once the container was up and running, I was able to confirm that the directories were available by running:

    dir \\<server>\<share>

I also had to make sure that the newly created gMSA account had permissions to the share.
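
Granting the gMSA access to the share can be scripted as well; this is only a sketch run on the file server, with the share path and account names as placeholders:

    # Share-level permission for the gMSA (note the trailing $ on the account name):
    Grant-SmbShareAccess -Name "Drop" -AccountName "CONTOSO\BuildGmsa$" -AccessRight Change -Force

    # Matching NTFS permission on the underlying folder:
    icacls "D:\Shares\Drop" /grant "CONTOSO\BuildGmsa$:(OI)(CI)M"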

I then joined the container to the agent pool of our DevOps server and ran a test build. As expected, I was able to pull and publish artefacts from our on-prem, domain-joined server.

Again, thanks guys for assisting me on this issue.

Matt Taylor
  • Don't publish to a file share. Publish to the server. The ability to publish artifacts to a file share is mostly included for legacy reasons; there is not a compelling reason to do so. – Daniel Mann Feb 18 '20 at 15:46
  • Hi Matt, glad to hear this. It's always better when you fix it yourself, as then you understand how it works! :) You could move the update part of the question below as a new answer, then *mark your own answer as accepted*, which will also help others in the community. – PatrickLu-MSFT Mar 05 '20 at 12:14

1 Answer


You could use Build.ArtifactStagingDirectory and, just as Daniel mentioned in the comment, publish your build artifacts to the server.

Build.ArtifactStagingDirectory

The local path on the agent where any artifacts are copied to before being pushed to their destination. For example: c:\agent\_work\1\a

A typical way to use this folder is to publish your build artifacts with the Copy files and Publish build artifacts tasks.
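
For example, a minimal YAML sketch of those two tasks (the source paths and the artifact name are assumptions, not from the original question):

    steps:
    - task: CopyFiles@2
      inputs:
        SourceFolder: '$(Build.SourcesDirectory)'
        Contents: '**/bin/Release/**'
        TargetFolder: '$(Build.ArtifactStagingDirectory)'

    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'
        ArtifactName: 'drop'
        publishLocation: 'Container'   # 'Container' = publish to the server rather than a file share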

For detailed steps, you could refer to Thomas F. Abraham's reply to this question: How to copy azure pipeline artifacts to a docker image which is microsoft dotnetcore runtime image

  1. Add a Docker task after the publish, configured to "Build"
  2. Set the "Build Context" in the task to $(Build.ArtifactStagingDirectory). That's the root path Docker will use for commands like COPY in a Dockerfile.
  3. Commit a Dockerfile to your repo, and set the Dockerfile path in the task to match its location

Set up the Dockerfile like this (I'm assuming .NET Core 2.2 here):

    FROM mcr.microsoft.com/dotnet/core/aspnet:2.2
    WORKDIR /app
    COPY . .
    ENTRYPOINT ["dotnet", "myAppNameHere.dll"]

Because you've set the Docker Build Context to $(Build.ArtifactStagingDirectory), where your app has been published, the COPY command will use that as a "current working directory." The translation of the COPY is "copy everything in $(Build.ArtifactStagingDirectory) to the /app folder inside the container."

That'll get you a basic Docker container that simply contains your pre-built and published app files.
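
In YAML form, the Docker build step from the list above might look roughly like this (the repository name and tag are placeholders):

    - task: Docker@2
      inputs:
        command: 'build'
        Dockerfile: 'Dockerfile'                           # the Dockerfile committed to your repo
        buildContext: '$(Build.ArtifactStagingDirectory)'  # root path for COPY in the Dockerfile
        repository: 'myappnamehere'
        tags: '$(Build.BuildId)'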

Besides, you could also choose to publish these artifacts as a NuGet package. For that approach, you could refer to this blog: Accessing Azure Artifacts from a docker container in a Pipelines build.
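
As a rough sketch of the NuGet route (the feed name and project paths are assumptions):

    - task: NuGetCommand@2
      inputs:
        command: 'pack'
        packagesToPack: '**/*.csproj'
        packDestination: '$(Build.ArtifactStagingDirectory)'

    - task: NuGetCommand@2
      inputs:
        command: 'push'
        packagesToPush: '$(Build.ArtifactStagingDirectory)/**/*.nupkg'
        nuGetFeedType: 'internal'
        publishVstsFeed: 'MyFeed'   # an Azure Artifacts feed on the DevOps server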

PatrickLu-MSFT