I want to retrieve the size of an Azure storage account without using the portal (metrics). How can I get the storage account's metrics through the Azure CLI or a bash script?
3 Answers
Looking at the Azure CLI commands, I believe there's no command currently available that will give you this information directly.
What you will need to do is make use of az rest to invoke the Metrics - List REST API and parse the response.
Here's the command you would want to execute (the URI must be quoted, since it contains & characters that the shell would otherwise interpret):
az rest --uri "https://management.azure.com/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>/providers/Microsoft.Insights/metrics?api-version=2018-01-01&metricnames=UsedCapacity&aggregation=Average"
You would get a response like the following:
{
"cost": 59,
"interval": "PT1H",
"namespace": "Microsoft.Storage/storageAccounts",
"resourceregion": "resource-location",
"timespan": "2021-10-27T05:12:06Z/2021-10-27T06:12:06Z",
"value": [
{
"displayDescription": "The amount of storage used by the storage account. For standard storage accounts, it's the sum of capacity used by blob, table, file, and queue. For premium storage accounts and Blob storage accounts, it is the same as BlobCapacity or FileCapacity.",
"errorCode": "Success",
"id": "/subscriptions/<subscription-id>/resourceGroups/<resource-group-name>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>/providers/Microsoft.Insights/metrics/UsedCapacity",
"name": {
"localizedValue": "Used capacity",
"value": "UsedCapacity"
},
"resourceGroup": "cerebrata",
"timeseries": [
{
"data": [
{
"average": 9078827149.0, // <-- this is the value you would want to extract
"timeStamp": "2021-10-27T05:12:00Z"
}
],
"metadatavalues": []
}
],
"type": "Microsoft.Insights/metrics",
"unit": "Bytes"
}
]
}
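The average field in the last data point is the value to extract. A minimal sketch of pulling it out with jq (assuming jq is installed; the live az rest call needs a signed-in session and real IDs, so a captured response shaped like the one above is parsed here for illustration):

```shell
#!/bin/bash
# In practice you would pipe az rest straight into jq, e.g.:
#   az rest --uri "https://management.azure.com/.../metrics?api-version=2018-01-01&metricnames=UsedCapacity&aggregation=Average" | jq ...
# For illustration, parse a trimmed-down response like the one shown above:
response='{"value":[{"timeseries":[{"data":[{"average":9078827149.0,"timeStamp":"2021-10-27T05:12:00Z"}]}]}]}'

# Take the "average" field of the most recent (last) data point.
used_bytes=$(echo "$response" | jq -r '.value[0].timeseries[0].data[-1].average')
echo "Used capacity: $used_bytes bytes"
```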

Thanks, @Gaurav Mantri. This is the answer I expected. Btw, can you explain what is meant by the value "Average" in the aggregation parameter? And what other values can be used for it? – Sashin Sahasra Oct 28 '21 at 05:47
@SashinSahasra It means that the capacity value is averaged over the time interval, which in the query is 1h - so you get the average capacity used for that one-hour timespan. Other options available should be `Min` and `Max`, and probably `Sum` (but that's unlikely to be useful). – ChrisWue Dec 16 '21 at 18:29
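For reference, the aggregation types the Metrics REST API accepts are the full names Average, Count, Minimum, Maximum and Total (rather than Min/Max/Sum), and several can be requested in one call. A sketch that only assembles and prints the quoted command, using the same placeholder IDs as the answer above:

```shell
#!/bin/bash
# Placeholder IDs, as in the answer above - substitute real values.
SUB="<subscription-id>"; RG="<resource-group-name>"; SA="<storage-account-name>"
BASE="https://management.azure.com/subscriptions/$SUB/resourceGroups/$RG"
BASE="$BASE/providers/Microsoft.Storage/storageAccounts/$SA/providers/Microsoft.Insights/metrics"

# Requesting several aggregations at once makes each data point carry
# "average", "minimum", "maximum" and "total" fields side by side.
QUERY="api-version=2018-01-01&metricnames=UsedCapacity&aggregation=Average,Minimum,Maximum,Total"

# Print the command rather than run it (running it needs az login + real IDs).
printf 'az rest --uri "%s?%s"\n' "$BASE" "$QUERY"
```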
To calculate the size of a storage account, you need to find the size of each container in the account and then sum those sizes; the total is the size of the storage account.
Example: getting the size of a container using the Azure CLI
#!/bin/bash
export AZURE_STORAGE_ACCOUNT=<storage-account-name>
export AZURE_STORAGE_ACCESS_KEY=<storage-account-key>
# Create a resource group
az group create --name myResourceGroup --location eastus
# Create a container
az storage container create --name mycontainer
# Create sample files to upload as blobs
for i in `seq 1 3`; do
echo $RANDOM > container_size_sample_file_$i.txt
done
# Upload sample files to container
az storage blob upload-batch \
    --pattern "container_size_sample_file_*.txt" \
    --source . \
    --destination mycontainer
# Calculate total size of container. Use the --query parameter to display only
# blob contentLength and output it in TSV format so only the values are
# returned. Then pipe the results to the paste and bc utilities to total the
# size of the blobs in the container.
bytes=`az storage blob list \
    --container-name mycontainer \
    --query "[*].[properties.contentLength]" \
    --output tsv |
    paste --serial --delimiters=+ | bc`
# Display total bytes
echo "Total bytes in container: $bytes"
# Delete the sample files created by this script
rm container_size_sample_file_*.txt
Refer to this document for more details:
Example using PowerShell
Get-AzureStorageBlob -Container "ContainerName" | %{ $_.Length } | measure -Sum
For more details, refer to this SO thread.

Few issues with this approach: 1) It will only give you the size of active blobs in a blob container; it will not include the sizes of blob snapshots or versions in that container. 2) You will need to perform this operation for every blob container in your storage account, which can be error-prone. 3) It will not give you the total size of a storage account, as it does not include the size of file, queue and table storage. – Gaurav Mantri Oct 27 '21 at 06:44
If your account has millions or hundreds of millions of blobs, this will not work in practice. – ChrisWue Dec 16 '21 at 18:30
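On the first point above: az storage blob list only counts base blobs by default; snapshots and versions can be brought into the listing with the --include flag (s = snapshots, v = versions; versions require blob versioning to be enabled on the account). A hedged sketch of the adjusted command, printed rather than executed since it needs a real account:

```shell
#!/bin/bash
# Illustrative only: the --include sv flag adds snapshots (s) and
# versions (v) to the listing, so their sizes are counted too.
cmd='az storage blob list --container-name mycontainer --include sv
    --query "[*].[properties.contentLength]" --output tsv'
echo "$cmd"
```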
You can initiate a delete on the Azure Portal, and it will tell you an approximate amount of data you have in each object. It will not delete anything until you confirm twice.
Scary much?
