
Summary

I'm creating a CI/CD provisioning pipeline for a new Azure Storage Account within an Azure DevOps pipeline, and I'm attempting to upload some files into Blob Storage using AzCopy, run from an Azure PowerShell task in the pipeline.

The Error

The script runs successfully from my local machine, but when running in the Azure DevOps pipeline I get a 403 (Forbidden) error (the ErrorDateTime in the message is just an obfuscated ISO 8601 formatted datetime).

Assumptions

  • The storage account has been set up to allow access only from specific VNets and IP addresses (sketched below).
  • It looks as if the firewall or credentials are misconfigured somehow, but the service principal running the script has been used successfully in other sibling pipeline tasks. To rule these problems out, I've temporarily given the service principal Owner permissions on the subscription, and the storage account's Firewall rules tab has "Allow trusted Microsoft Services to access this storage account" enabled.
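
For context, a minimal sketch of roughly this configuration using the AzureRM cmdlets (the resource names, IP range and subnet ID are placeholders, not my real values):

# Sketch only: all names and network values below are placeholders.
$rg      = "my-resource-group"
$account = "mystorageaccount"

# Deny by default, but let "trusted Microsoft services" bypass the firewall.
Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName $rg -Name $account `
    -DefaultAction Deny -Bypass AzureServices

# Allow a specific public IP address and a specific VNet subnet.
Add-AzureRmStorageAccountNetworkRule -ResourceGroupName $rg -Name $account `
    -IPAddressOrRange "203.0.113.10"
Add-AzureRmStorageAccountNetworkRule -ResourceGroupName $rg -Name $account `
    -VirtualNetworkResourceId "/subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.Network/virtualNetworks/<vnet>/subnets/<subnet>"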

What I've tried...

  • I've successfully run the script from my local machine with my IP Address being in the allowed list.

  • If I enable "Allow access from All networks" on the Storage Account Firewall rules then the script runs and the file is uploaded successfully.

  • It appears as if the Azure Pipelines agents, running in their own VNet, don't have access to my storage account, but I would have thought that requirement would be satisfied by enabling "Allow trusted Microsoft Services to access this storage account" in the firewall settings.

I'm using the following line within the Azure PowerShell task, and I'm confident the values are correct because everything works when "All networks" is enabled, or when my IP address is in the allowed list and I run the script locally.

.\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y
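
For reference, a minimal sketch of how the surrounding task script might assemble those values before calling AzCopy, assuming the AzureRM module is available on the agent (the resource names, container and pattern are placeholders, not my real values):

# Sketch only: all names, paths and the pattern are placeholders.
$ResourceGroupName  = "my-resource-group"
$StorageAccountName = "mystorageaccount"
$SourcePath         = "$env:BUILD_ARTIFACTSTAGINGDIRECTORY\files"
$FilenamePattern    = "*.json"
$blobUrlDir         = "https://$StorageAccountName.blob.core.windows.net/mycontainer"

# Fetch the storage account key using the pipeline's service principal context.
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName $ResourceGroupName -Name $StorageAccountName)[0].Value

.\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y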

Any thoughts or guidance would be appreciated.

Thanks,

SJB


4 Answers


People seem to be getting mixed results in this GitHub issue, but the AzureFileCopy@4 task works (at least for us) after adding the "Storage Blob Data Contributor" role to the ARM connection's service principal, scoped to the storage account itself. The step below is the only one needed in a pipeline that deploys a repo as a static website in a blob container:

- task: AzureFileCopy@4
  displayName: 'Copy files to blob storage: $(storageName)'
  inputs:
    SourcePath: '$(build.sourcesDirectory)'
    Destination: AzureBlob
    storage: $(storageName)
    ContainerName: $web
    azureSubscription: 'ARM Connection goes here' # needs a role assignment before it'll work

(Of course, if you're using Azure CDN like we are, the next step is to clear the CDN endpoint's cache, but that has nothing to do with the blob storage error.)
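
For reference, the role assignment described above can also be scripted; a minimal sketch using the Az PowerShell module, where the key point is that -Scope targets the storage account itself rather than its resource group (the IDs and names are placeholders):

# Sketch only: subscription, resource group, account and service principal IDs are placeholders.
$scope = "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>/providers/Microsoft.Storage/storageAccounts/<storageAccountName>"

# Grant the ARM service connection's service principal data-plane access to blobs.
New-AzRoleAssignment -ObjectId "<servicePrincipalObjectId>" `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope $scope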

Matt Wanchap
  • Worked here. Azure DevOps creates an enterprise application user inside Azure named like: **--**. This user had the role _"Storage Blob Data Contributor"_, but scoped to the _resource group_. **It is expected to work for all child resources, but that's not what happens.** **01.** In the Azure Portal, go to the storage account that receives my build artifacts. **02.** Go to Access Control (IAM). **03.** + Add > Role assignment. **04.** Select the role "Storage Blob Data Contributor". **05.** Select the user created by Azure DevOps for your build/release pipeline. – gabrielbarceloscn Oct 22 '20 at 13:16
  • @gabrielbarceloscn so the service principal must have the role on the storage resource specifically, not on a parent of the storage resource? Weird! Thanks for the info. – Matt Wanchap Oct 22 '20 at 19:22
  • Hooray, the above steps from @gabrielbarceloscn added the permissions that Matt calls out in this answer, and the copy to blob storage is working great now! – Josh Wittner May 12 '21 at 01:31
  • I don't know why this is not the accepted answer. Messing with access rights during deployment like SJB suggested should only be used for testing purposes and not in production. – user3838018 May 29 '21 at 07:31

After doing further research I noticed a raised GitHub issue explaining that Azure DevOps isn't considered a trusted Microsoft service from a storage account perspective.

My temporary workaround is to:

  • Set the DefaultAction to Allow, thereby permitting "All networks" access, immediately before the copy.
  • Set the DefaultAction back to Deny after the copy, so that my VNet rules are enforced again.
Try
{
    # Temporarily open the storage account firewall so the hosted build agent can reach it.
    Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Allow

    # Upload the files while the default action is Allow.
    .\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y
}
Catch
{
    #Handle errors...
}
Finally
{
    # Always re-lock the firewall so the VNet/IP rules are enforced again, even if the copy fails.
    Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Deny
}

Thanks,

SJB

  • Thanks... I'm up against the same problem right now. This seems like a decent workaround, but it still feels odd to "make it public" (DefaultAction = Allow) for even a short time. Technically something could fail between the Allow and Deny steps, and then the storage account is left open. Anyone else find another solution? – Schwammy Sep 06 '19 at 15:47
  • @sjb did you ever find a better solution than this? (I have a bunch of small personal React projects, and I can't justify paying to set up private networks just to fix a bug in Azure.) I was hoping you might have found a better way by now? – snowcode May 24 '20 at 22:04
  • Hi, unfortunately I haven't found a better workaround, though I haven't checked whether there is a better way in the last three months. Based on @Schwammy's comments, I've also introduced some "post" checks to ensure that the VNet security is still honouring the intent (a sketch of that kind of check follows below). – SJB May 26 '20 at 09:33
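
A minimal sketch of the kind of post-check SJB mentions above, assuming the AzureRM module (the failure handling is a placeholder):

# Sketch only: verify the firewall was re-locked after the deployment step.
$ruleSet = Get-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName"
if ($ruleSet.DefaultAction -ne "Deny")
{
    # Placeholder: fail the pipeline (or raise an alert) if the account was left open.
    throw "Storage account '$StorageAccountName' firewall DefaultAction is not Deny."
}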

In my case, the service principal from the Azure subscription selected in the pipeline needed to have the Storage Blob Data Contributor role on the storage account I wanted to copy files to.

To do this, follow these steps:

  1. Open the DevOps pipeline, find the Azure subscription field, and click the "Manage" link next to it.
  2. Then click "Manage Service Principal" and note the Display name (optionally change it to something unique, since all the service principals for a project's service connections share the same default name, which can be confusing in the next step).
  3. Open the storage resource (account or container) in the Azure Portal, select "Access Control (IAM)" in the blade on the left, click "Add a role assignment", and add the "Storage Blob Data Contributor" role to the service principal you noted in the previous step.

If the "Add a role assignment" is disabled or greyed out, you need to ask your administrator to grant that access.

This solution was taken and improved from this GitHub comment.
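
Separately from that comment, the same assignment can be scripted; a minimal sketch using the Az PowerShell module, assuming the display name noted in step 2 (all names and IDs are placeholders):

# Sketch only: look up the service principal created for the service connection by display name.
$sp = Get-AzADServicePrincipal -DisplayName "<display name noted in step 2>"

# Assign the data-plane role directly on the target storage account.
New-AzRoleAssignment -ObjectId $sp.Id `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope "/subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>/providers/Microsoft.Storage/storageAccounts/<storageAccountName>"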

B8ightY

Have you considered using the Azure DevOps task "Azure File Copy" instead of a PowerShell script? See: https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-file-copy?view=azure-devops

  • Hi, yes, I did try several tasks (pretty sure including the one you mentioned) before resorting to a PowerShell script, but I faced similar 403 Forbidden errors. I'll need to try it again now that I'm aware of the workaround of temporarily setting the _DefaultAction_ to Allow. I'd prefer to rely on tasks that have been tested more thoroughly. The only downside I imagine is that I'll need three tasks to complete the work: 1) set the DefaultAction to Allow, 2) use the Azure File Copy task, 3) set the DefaultAction back to Deny. – SJB Jul 07 '19 at 13:08