29

How can I check how much space is used in each of the Azure storage accounts in my subscription, broken down by resource group?

I am not able to find a way to check the space used in an Azure storage account through PowerShell, the CLI, or the portal.

Kirill Kobelev
Suresh Guntha
  • Looking at billing will show you daily charges, which is ultimately what really matters. It may reveal other surprises and you can sort all your resources by cost. Also note that the newer version 2 pricing is (mostly) cheaper than version 1 so check you're on the latest version. – Simon_Weaver Aug 22 '18 at 23:49

10 Answers

21

An Azure Storage account's size is the combined usage of all four services (Blob, Queue, File, Table). To my knowledge, there is currently no single built-in way to calculate the total size across all of them.

However, you can get the blob space used in the portal by using Azure metrics: select Monitor --> Metrics.


For more information about monitoring a storage account in the Azure portal, refer to this link.

Also, you can use PowerShell to get your blob usage; there is a good script available for this.
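The metrics report UsedCapacity in raw bytes. If you script against them, a small helper (my own sketch, not part of any Azure SDK) renders those values in the binary units the portal displays:

```python
def human_size(num_bytes: float) -> str:
    """Render a byte count in binary units (KiB, MiB, ...), as the portal does."""
    units = ["B", "KiB", "MiB", "GiB", "TiB", "PiB"]
    size = float(num_bytes)
    for unit in units:
        if size < 1024 or unit == units[-1]:
            return f"{size:.2f} {unit}"
        size /= 1024

# Example: a UsedCapacity sample of 5,497,558,138,880 bytes
print(human_size(5_497_558_138_880))  # -> 5.00 TiB
```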

Shui shengbao
13

After a lot of searching, I found this article very relevant:

https://techcommunity.microsoft.com/t5/azure-paas-blog/calculate-the-size-capacity-of-storage-account-and-it-services/ba-p/1064046

Use Azure Monitor to check the capacity of the storage accounts. Steps:

  • Navigate to Azure Monitor
  • Click on Storage Accounts
  • Click on Capacity. You can see all the accounts and their used capacity side by side here.


Sunny Sharma
11

Azure Storage Explorer has a 'Directory Statistics' button.

Navigate to a folder, click the button, and the total is shown in the Activities panel.

Simon_Weaver
  • For blobs it's called 'Folder Statistics'. This example is for File Shares. – Simon_Weaver Aug 22 '18 at 23:55
  • If I use this method (Azure Storage Explorer) it shows about 2 TiB, but when using Monitor in the Azure portal (https://stackoverflow.com/a/66605579/1425717) it shows about 6 TiB in the last 4 hours. Why the difference? I guess the latter amount is what I am charged for. – tito.icreativos Jan 11 '22 at 15:50
  • If you see 'in the last 4 hours' that implies to me maybe it's bandwidth you're looking at and not size on disk? Another possibility - and this is a complete guess, but maybe if you have many small files there is a minimum block size that each file can be on disk and that larger size would be what you're charged for. – Simon_Weaver Jan 13 '22 at 21:39
5

Here is a .NET Core script I use to list storage account usage using the average metric value of the last hour.

using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.Monitor;
using Microsoft.Azure.Management.Monitor.Models;
using Microsoft.Rest.Azure.Authentication;

namespace storagelist
{
    class Program
    {
        static async System.Threading.Tasks.Task Main(string[] args)
        {
            // to generate my.azureauth file run the follow command:
            // az ad sp create-for-rbac --sdk-auth > my.azureauth
            var azure = Azure.Authenticate("my.azureauth").WithDefaultSubscription();

            var accounts = azure.StorageAccounts.List();
            // can get values from my.azureauth
            var tenantId = "";
            var clientId = "";
            var clientSecret = "";
            var serviceCreds = await ApplicationTokenProvider.LoginSilentAsync(tenantId, clientId, clientSecret);
            MonitorManagementClient readOnlyClient = new MonitorManagementClient(serviceCreds);

            var oneHour = TimeSpan.FromHours(1);
            var endDate = DateTime.UtcNow;
            var startDate = endDate - oneHour;
            string timeSpan = startDate.ToString("o") + "/" + endDate.ToString("o");

            List<string> fileContents = new List<string>();

            foreach (var storage in accounts)
            {
                var response = await readOnlyClient.Metrics.ListAsync(
                resourceUri: storage.Id,
                timespan: timeSpan,
                interval: oneHour,
                metricnames: "UsedCapacity",
                aggregation: "Average",
                resultType: ResultType.Data,
                cancellationToken: CancellationToken.None);

                foreach (var metric in response.Value)
                {
                    foreach (var series in metric.Timeseries)
                    {
                        foreach (var point in series.Data)
                        {
                            if (point.Average.HasValue)
                            {
                                fileContents.Add($"{storage.Id}, {point.Average.Value}");
                                break;
                            }
                        }
                        break;
                    }
                    break;
                }
            }

            await File.WriteAllLinesAsync("./storage.csv", fileContents);
        }
    }
}
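The `timeSpan` string the script builds is just the `start/end` ISO-8601 pair that Azure Monitor expects. The same construction in Python, as a minimal sketch (the function name is mine):

```python
from datetime import datetime, timedelta, timezone

def last_hour_timespan(now=None):
    # Azure Monitor accepts a timespan as "startISO8601/endISO8601"
    now = now or datetime.now(timezone.utc)
    start = now - timedelta(hours=1)
    return f"{start.isoformat()}/{now.isoformat()}"

print(last_hour_timespan(datetime(2021, 1, 1, 12, 0, tzinfo=timezone.utc)))
# -> 2021-01-01T11:00:00+00:00/2021-01-01T12:00:00+00:00
```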
Red Riding Hood
3

This will give the storage account capacity for each resource group across all subscriptions:

$subs = Get-AzSubscription | Select-Object Name
$subs | ForEach-Object {
    Set-AzContext -Subscription $_.Name | Out-Null
    $currentSub = $_.Name
    Get-AzResourceGroup | ForEach-Object {
        $currentRG = $_.ResourceGroupName
        Get-AzStorageAccount -ResourceGroupName $currentRG | ForEach-Object {
            $storageAccount = $_.StorageAccountName
            # Get-AzMetric returns a series of datapoints; take the most recent one
            $usedCapacity = (Get-AzMetric -ResourceId $_.Id -MetricName "UsedCapacity").Data |
                Select-Object -Last 1
            $usedCapacityInMB = $usedCapacity.Average / 1MB
            "$storageAccount,$usedCapacityInMB,$currentRG,$currentSub" >> ".\storageAccountsUsedCapacity.csv"
        }
    }
}


shankar k
2

You can go to: Home > {storage account} > {container} > Properties. Under Properties you will find a "Calculate size" option, which shows the container size.

1

Using Cloud Shell is one of the best solutions so far:

  • Add a powershell file in Cloud Shell (check out the code below)
  • Run the PS command with Storage-Account and Resource-Group names


Code

param($resourceGroup, $storageAccountName)


# usage
# Get-StorageAccountSize -resourceGroup <resource-group> -storageAccountName <storage-account-name>


# Connect to Azure
Connect-AzureRmAccount


# Get a reference to the storage account and the context
$storageAccount = Get-AzureRmStorageAccount `
-ResourceGroupName $resourceGroup `
-Name $storageAccountName
$ctx = $storageAccount.Context

# Get All Blob Containers
$AllContainers = Get-AzureStorageContainer -Context $ctx
$AllContainersCount = $AllContainers.Count
Write-Host "We found '$($AllContainersCount)' containers. Processing size for each one"

# Zero counters
$TotalLength = 0
$TotalContainers = 0

# Loop to go over each container and calculate size
Foreach ($Container in $AllContainers) {
    $TotalContainers = $TotalContainers + 1
    Write-Host "Processing Container '$($TotalContainers)'/'$($AllContainersCount)'"
    $listOfBlobs = Get-AzureStorageBlob -Container $Container.Name -Context $ctx

    # zero out our running total for this container
    $length = 0

    # loop through the list of blobs and add each blob's length to the total
    $listOfBlobs | ForEach-Object { $length = $length + $_.Length }
    $TotalLength = $TotalLength + $length
}
# end container loop

#Convert length to GB
$TotalLengthGB = $TotalLength /1024 /1024 /1024

# Result output
Write-Host "Total Length = " $TotalLengthGB "GB"

https://gist.github.com/iamsunny/8718fb29146363af11da95e5eb82f245

Sunny Sharma
1

The Portal Storage Browser will show you the total data stored.

  1. Login in to Azure.

  2. Navigate to the Storage Account.

  3. Click on Storage Browser on the left.

Greg Gum
0

Getting this in PowerShell is kind of a pain, but it might be useful for other folks (for example, when cleaning out old backups). Here's what I came up with; it should work with at least AzureRM module 6.13.0:

$azstorcontext = New-AzureStorageContext -StorageAccountName storageaccounthere -StorageAccountKey storageaccountkeyhere
$sizesOverall = @()
$containers = Get-AzureStorageContainer -Context $azstorcontext
foreach ($container in $containers)
{
    Write-Output $container.Name
    $contblobs = Get-AzureStorageBlob -Container $container.Name -Context $azstorcontext
    Write-Output "  Blobs: $($contblobs.Count)"
    $containersize = ($contblobs | Measure-Object -Sum Length).Sum
    Write-Output "    Container Size: $containersize (bytes)"
    $sizesOverall += $containersize
}
$sizesOverall
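The roll-up this loop performs (sum blob lengths per container, then a grand total) can be sketched with mocked data; the mock listing below stands in for the real `Get-AzureStorageBlob` output:

```python
# Mocked listing: container name -> blob sizes in bytes
# (a real run would fetch these via the storage SDK instead)
listing = {
    "backups": [1_073_741_824, 536_870_912],
    "logs": [4096, 8192],
}

def container_sizes(listing):
    """Sum blob lengths per container, like Measure-Object -Sum Length."""
    return {name: sum(sizes) for name, sizes in listing.items()}

sizes = container_sizes(listing)
print(sizes)                # per-container bytes
print(sum(sizes.values()))  # grand total across containers
```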
  • The only problem with that script is that you get all the blobs from the SA containers, and when you do that in multiple SAs with blobs of dozens of terabytes, the script hangs for hours and sometimes it times-out. – silverbackbg Jun 15 '20 at 16:08
0

I have created a Python script to calculate the used storage across all subscriptions. It's not quick:

  • it needs to request all subscriptions using the provided permissions
  • it queries Azure Resource Graph to receive the list /subscription/resourcegroup/storageaccount
  • it generates a list of subscriptions where storage accounts exist
  • it queries Azure Monitor for every /subscription/resourcegroup/storageaccount to receive UsedCapacity
    from azure.mgmt.monitor import MonitorManagementClient
    from azure.mgmt.subscription import SubscriptionClient
    from msrestazure.azure_active_directory import ServicePrincipalCredentials
    from azure.mgmt.resourcegraph import ResourceGraphClient
    from azure.mgmt.resourcegraph.models import QueryRequest
    
    credentials = ServicePrincipalCredentials(client_id, secret, tenant=tenant_id)
    
    sub_object = SubscriptionClient(credentials)
    rgraph_object = ResourceGraphClient(credentials)
    storageaccount_pattern = "resources | where type == 'microsoft.storage/storageaccounts' | project id"
    
    subs = [sub.as_dict() for sub in sub_object.subscriptions.list()]
    
    subs_list = []
    for sub in subs:
        subs_list.append(sub.get('subscription_id'))
    
    request_storageaccount = QueryRequest(subscriptions=subs_list, query=storageaccount_pattern)
    rgraph_storageaccount = rgraph_object.resources(request_storageaccount).as_dict()
    
    resource_ids = []
    
    for element in rgraph_storageaccount['data']:
        resource_ids.append(element['id'])
    
    count_used_storage = 0
    for resource_id in resource_ids:
        sub = (resource_id.split('/'))[2]
        monitor_object = MonitorManagementClient(credentials, subscription_id=sub)
        metrics_data = monitor_object.metrics.list(resource_id)
    
        for item in metrics_data.value:
            for timeserie in item.timeseries:
                for data in timeserie.data:
                    # skip datapoints with no recorded average
                    if data.average is not None:
                        count_used_storage += data.average
    
    print(count_used_storage)

For ~400 subscriptions and ~1100 storage accounts, the script takes about 600 seconds.

For one subscription it's much faster :)
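Because each Monitor call is independent, the per-account requests could also run concurrently, which should cut that runtime considerably. A sketch with a mocked fetch function standing in for the `metrics.list` call in the script above:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_used_capacity(resource_id):
    # Stand-in for the Azure Monitor metrics call; a real version would
    # return the UsedCapacity average in bytes for resource_id.
    return 1024

resource_ids = [
    f"/subscriptions/sub/resourceGroups/rg/providers/Microsoft.Storage/storageAccounts/sa{i}"
    for i in range(8)
]
with ThreadPoolExecutor(max_workers=8) as pool:
    total = sum(pool.map(fetch_used_capacity, resource_ids))
print(total)  # -> 8192 (8 accounts x 1024 bytes of mock data)
```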

Donets