
I am building an Angular 6 application that will perform CRUD operations on Azure Blob Storage. Before implementing the requests in the app, I'm testing them in Postman, copy-pasting the token that Angular acquires for that resource.

When trying to read a file that I have inside the storage for test purposes, I'm getting: `<Code>AuthorizationPermissionMismatch</Code> <Message>This request is not authorized to perform this operation using this permission.</Message>`

  • All in the production environment (although still developing)
  • Token acquired specifically for the storage resource via OAuth
  • Postman uses the "bearer" token strategy
  • Application has the "Azure Storage" delegated permission granted
  • Both the app and the account I'm acquiring the token with are added as "Owner" in Azure Access Control (IAM)
  • My IP is added to the CORS settings on the blob storage
  • StorageV2 (general purpose v2) - Standard - Hot
  • The x-ms-version header used is 2018-03-28, because that's the latest I could find and I just created the storage account
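For reference, the request being tested in Postman corresponds to a raw REST call roughly like the following sketch (the account, container, blob names, and token are all placeholders):

    # Hypothetical Get Blob request with an Azure AD bearer token.
    # <ACCESS_TOKEN> is the OAuth token acquired for the storage resource.
    curl -H "Authorization: Bearer <ACCESS_TOKEN>" \
         -H "x-ms-version: 2018-03-28" \
         "https://<account>.blob.core.windows.net/<container>/<blob>"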
SebastianG
  • So users need IAM permissions on the blob storage? – Thomas Oct 12 '18 at 07:58
  • 1
    @Thomas I have no idea mate, I just added it anyway to be sure that's not the reason. – SebastianG Oct 12 '18 at 09:39
  • 1
    For anyone having trouble with a similar issue and the answers aren't helping, try using the "Diagnose and solve problems" tool in the Azure portal sidebar for your storage account. It will help you look through your logs to see what's going on. With it, I found that I was using a different service principal than I thought I was using. – MakotoE Mar 03 '23 at 05:52

9 Answers


I found that it's not enough for the app and account to be added as owners. Go into your storage account > IAM > Add role assignment, and add the specific data-plane roles required for this type of request:

  • Storage Blob Data Contributor
  • Storage Queue Data Contributor
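The same role assignments can also be scripted. A minimal sketch using the Azure CLI, assuming placeholder subscription, resource group, account, and principal values:

    # Grant the data-plane role to a user or service principal (placeholder values).
    az role assignment create \
      --role "Storage Blob Data Contributor" \
      --assignee "<user-or-app-object-id>" \
      --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"

Repeat with `--role "Storage Queue Data Contributor"` if queue access is also needed.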
JimJohnBobJohnes
  • [Link](https://learn.microsoft.com/en-us/azure/storage/common/storage-auth-aad-app) to the documentation – nldev May 07 '19 at 09:42
  • For me it worked only after adding the role "Storage Queue Data Contributor" – user2526641 Apr 23 '20 at 08:46
  • Depending on what action you're performing, you may need "Storage Blob Data Owner" instead of Contributor. But it does appear that being an Owner of the resource doesn't override this – Liam Sep 17 '20 at 15:21
  • +1 to @user2526641, it worked after adding "Storage Blob Data Contributor" and "Storage Queue Data Contributor" – Drakinite Jan 20 '21 at 00:52
  • If you're not sure **what** you need to add in IAM when deploying from a DevOps pipeline: it should be the service principal name (AAD app name) of your DevOps service connection. – jetnet Mar 09 '21 at 16:28
  • Microsoft really messed up when they migrated from Microsoft.WindowsAzure.Storage to Microsoft.Azure.Storage and then to Azure.Storage. It's like the left hand doesn't know what the right is doing over there in Redmond! Anyway, this helped solve my problem of why my Azure Queues were not working after I migrated to Azure.Storage! – ScottyMacDev Dec 16 '21 at 17:01
  • It worked after I added the "Storage Blob Data Contributor" permission in order to access blob objects. The following article has a detailed explanation of why accessing a blob object works in the Azure portal but not with Azure AD user authentication: https://www.schaeflein.net/understanding-azure-storage-data-access-permissions/ – wei Jan 06 '22 at 08:54
  • One additional thing I ran into and wanted to share: I first added the IAM role assignment for the app but was still getting the error, so I then added it for the user who authorized the app (me in this case) and also clicked "grant admin consent" on the app's API permissions. I'm not sure which of these was the key, but after doing both the error went away and the access now works. – theark40 Mar 16 '22 at 22:40
  • I've recently needed `Storage Blob Data Contributor` to fix two separate issues. Even though my user was the `Owner`(!) of the storage account, I had to add the additional right. Use cases: immutable storage and, separately, user delegation tokens. – JsAndDotNet May 10 '22 at 13:27
  • Any link to justify why users still need those roles even after being Contributors? – Prathamesh dhanawade Sep 08 '22 at 17:54

Make sure to use Storage Blob Data Contributor and NOT Storage Account Contributor; the latter is only for managing the storage account itself, not the data in it.
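If you are unsure which of the two roles is actually assigned, you can check with the Azure CLI (the scope below uses placeholder values):

    # List role assignments on the storage account to see which roles are in effect.
    az role assignment list \
      --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>" \
      --output table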

Fjurg

Be aware that if you want to apply a "Storage Blob Data XXXX" role at the subscription level, it will not work if your subscription contains an Azure Databricks namespace:

If your subscription includes an Azure DataBricks namespace, roles assigned at the subscription scope will be blocked from granting access to blob and queue data.

Source: https://learn.microsoft.com/en-us/azure/storage/common/storage-auth-aad-rbac-portal#determine-resource-scope

sylr

I've just solved this by changing the resource requested in the `GetAccessTokenAsync` method from "https://storage.azure.com" to the URL of my storage blob, as in this snippet:

    // Acquire an AAD token for the storage account's own URL rather than
    // the generic https://storage.azure.com resource.
    public async Task<StorageCredentials> CreateStorageCredentialsAsync()
    {
        var provider = new AzureServiceTokenProvider();
        var token = await provider.GetAccessTokenAsync(AzureStorageContainerUrl);
        var tokenCredential = new TokenCredential(token);
        var storageCredentials = new StorageCredentials(tokenCredential);
        return storageCredentials;
    }

where `AzureStorageContainerUrl` is set to `https://xxxxxxxxx.blob.core.windows.net/`

Liam

Make sure you add the `/Y` flag at the end of the command.

double-beep

I used the following to connect to blob storage using Azure AD. This code uses SDK v11, since v12 still has issues with multiple AD accounts; see this issue: https://github.com/Azure/azure-sdk-for-net/issues/8658. For further reading on the v12 and v11 SDKs:

https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet-legacy

https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet

    using System;
    using System.IO;
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.Azure.Services.AppAuthentication;
    using Microsoft.Azure.Storage.Auth;
    using Microsoft.Azure.Storage.Blob;
    using Xunit;

    [Fact]
    public async Task TestStreamToContainer()
    {
        try
        {
            var accountName = "YourStorageAccountName";
            var containerName = "YourContainerName";
            var blobName = "File1";

            // Acquire an AAD token scoped to this storage account.
            var provider = new AzureServiceTokenProvider();
            var token = await provider.GetAccessTokenAsync($"https://{accountName}.blob.core.windows.net");
            var tokenCredential = new TokenCredential(token);
            var storageCredentials = new StorageCredentials(tokenCredential);

            string containerEndpoint = $"https://{accountName}.blob.core.windows.net";

            var blobClient = new CloudBlobClient(new Uri(containerEndpoint), storageCredentials);
            var containerClient = blobClient.GetContainerReference(containerName);
            var cloudBlob = containerClient.GetBlockBlobReference(blobName);

            // Upload a small in-memory payload to the block blob.
            string blobContents = "This is a block blob contents.";
            byte[] byteArray = Encoding.ASCII.GetBytes(blobContents);

            using (MemoryStream stream = new MemoryStream(byteArray))
            {
                await cloudBlob.UploadFromStreamAsync(stream);
            }
        }
        catch (Exception e)
        {
            Console.WriteLine(e.Message);
            throw;
        }
    }
OmerT

I had this issue when using Terraform in an Azure Pipeline (TerraformTaskV3@3) and got this error:

    │ Error: Failed to get existing workspaces: containers.Client#ListBlobs: Failure responding to request: StatusCode=403 -- Original Error: autorest/azure: Service returned an error. Status=403 Code="AuthorizationPermissionMismatch"

Adding the Storage Blob Data Contributor role to the service principal used to connect Azure DevOps to the Azure subscription fixed the issue.

Related issue: Error: Failed to get existing workspaces: containers.Client#ListBlobs:


I had a similar problem with the Azure Pipeline task AzureFileCopy@5.

To solve this problem, I went to Azure > Storage account > Access Control (IAM) and added the Storage Blob Data Contributor role to the applications that already had the Contributor role.

This role can be added at resource or subscription level (Documentation).

Capture of the status once the roles have been added in Azure Pipeline

  • Similar answer is already posted, and has been accepted by the author. Please edit the answer in case the answer provides extra information. – ranka47 Aug 17 '23 at 17:23

Just adding a visual presentation of the solution that JimJohnBobJohnes answered:

  1. Go to Storage account > IAM > Add role assignment.
  2. In "Add role assignment", select these two roles:
     • Storage Blob Data Contributor
     • Storage Queue Data Contributor
  3. Assign them to your account.
  4. These should then be reflected on your storage account.


SimplifyJS