Azure Storage accounts are made up of four different types of storage: blobs, fileshares, queues and tables. A different method and/or strategy is required to copy each type to a new storage account.
There are many tools you could use to automate the process, including the Az PowerShell module, the az CLI and the Azure REST API, but not all tools provide methods for working with all four types of storage.
First you will need to create a new storage account to copy your data to. Be aware that account names are globally unique, not just unique within your tenant, are limited to 24 characters in length, and can only be made up of lowercase letters and numbers. You could use the New-AzStorageAccount PowerShell cmdlet to create the new account.
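A minimal sketch of that first step might look like the following; the resource group name, account name, location and SKU are all placeholders you'd replace with your own values, and it assumes you're already signed in with Connect-AzAccount:

```powershell
# Placeholder names - substitute your own values.
$resourceGroup = 'my-resource-group'
$destAccount   = 'mynewstorageacct01'   # lowercase letters/numbers only, max 24 chars

# Create the destination storage account.
New-AzStorageAccount -ResourceGroupName $resourceGroup `
    -Name $destAccount `
    -Location 'uksouth' `
    -SkuName 'Standard_LRS'
```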
Blobs
Blob storage is made up of containers that hold blobs in a flat structure. You will need to recreate each container from your source account, which can be done using the New-AzStorageContainer PowerShell cmdlet. Once the containers have been created you can use Start-AzStorageBlobCopy to copy the blobs from each container in your source account to the matching container in your new account.
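A sketch of the blob copy, assuming both accounts sit in the same resource group (adjust if yours don't). It builds a storage context for each account from its access key, then iterates containers and starts a server-side copy of every blob:

```powershell
# Build contexts for the source and destination accounts.
$srcKey  = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroup -Name $srcAccount)[0].Value
$destKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroup -Name $destAccount)[0].Value
$srcCtx  = New-AzStorageContext -StorageAccountName $srcAccount  -StorageAccountKey $srcKey
$destCtx = New-AzStorageContext -StorageAccountName $destAccount -StorageAccountKey $destKey

# Recreate each container, then start a server-side copy of every blob in it.
foreach ($container in Get-AzStorageContainer -Context $srcCtx) {
    New-AzStorageContainer -Name $container.Name -Context $destCtx | Out-Null
    foreach ($blob in Get-AzStorageBlob -Container $container.Name -Context $srcCtx) {
        Start-AzStorageBlobCopy -SrcContainer $container.Name -SrcBlob $blob.Name -Context $srcCtx `
            -DestContainer $container.Name -DestBlob $blob.Name -DestContext $destCtx | Out-Null
    }
}
```

Note that Start-AzStorageBlobCopy only starts the copy; for a robust script you'd poll Get-AzStorageBlobCopyState to confirm each copy completed.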
Fileshares
The storage account can contain multiple fileshares, each containing a nested folder structure of varying depth. You can create a new fileshare in your destination account using New-AzStorageShare. Once you've recreated your fileshares you'll need to recurse through the folder structure in each share to get all the files. You can then use New-AzStorageDirectory to recreate the folder structure and Start-AzStorageFileCopy to copy the files into the new fileshare.
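One way to sketch that recursion is below, reusing the $srcCtx and $destCtx contexts from earlier. The Copy-AzShareDirectory function name is my own, and the type-name check used to tell subdirectories from files is one common approach that may need adjusting for your Az.Storage version:

```powershell
# Recursively mirror one directory of a share; call with -Path '' for the share root.
function Copy-AzShareDirectory {
    param($ShareName, $Path, $SrcCtx, $DestCtx)

    # Listing the root and listing a subdirectory take slightly different shapes.
    $items = if ($Path) {
        Get-AzStorageFile -ShareName $ShareName -Path $Path -Context $SrcCtx | Get-AzStorageFile
    } else {
        Get-AzStorageFile -ShareName $ShareName -Context $SrcCtx
    }

    foreach ($item in $items) {
        $itemPath = if ($Path) { "$Path/$($item.Name)" } else { $item.Name }
        if ($item.GetType().Name -like '*Directory*') {
            # Recreate the folder, then descend into it.
            New-AzStorageDirectory -ShareName $ShareName -Path $itemPath -Context $DestCtx | Out-Null
            Copy-AzShareDirectory -ShareName $ShareName -Path $itemPath -SrcCtx $SrcCtx -DestCtx $DestCtx
        } else {
            Start-AzStorageFileCopy -SrcShareName $ShareName -SrcFilePath $itemPath -Context $SrcCtx `
                -DestShareName $ShareName -DestFilePath $itemPath -DestContext $DestCtx | Out-Null
        }
    }
}

foreach ($share in Get-AzStorageShare -Context $srcCtx) {
    New-AzStorageShare -Name $share.Name -Context $destCtx | Out-Null
    Copy-AzShareDirectory -ShareName $share.Name -Path '' -SrcCtx $srcCtx -DestCtx $destCtx
}
```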
Queues
As queues use a publisher-subscriber model and the data is transient, it may be easiest to recreate the queue and use a variation on the method your current publisher uses to populate the new queue with data. You can use New-AzStorageQueue to create the new queue. Alternatively, you could create the new queue, repoint the publisher to it, and only repoint the subscribers once the old queue is drained. For your use case the first approach may be more suitable.
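Recreating the queues themselves is short, again assuming the $srcCtx and $destCtx contexts from earlier; note this deliberately copies no messages, since the publisher repopulates the new queues:

```powershell
# Recreate each queue in the destination account (structure only, no messages).
foreach ($queue in Get-AzStorageQueue -Context $srcCtx) {
    New-AzStorageQueue -Name $queue.Name -Context $destCtx | Out-Null
}
```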
Tables
The storage account can contain multiple tables, each containing multiple rows of data. You can use New-AzStorageTable to recreate the tables, but this will not copy over the data they contain. Unfortunately there isn't a cmdlet in the Az module to do this, but the AzTable module contains the Get-AzTableRow and Add-AzTableRow cmdlets, which should allow you to copy the rows to the new table.
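A sketch of the row copy, requiring both the Az.Storage and AzTable modules and the contexts from earlier. The list of metadata fields stripped from each returned row is an assumption about what AzTable adds, so verify it against your module version:

```powershell
# Recreate each table, then copy its rows via the AzTable module.
foreach ($table in Get-AzStorageTable -Context $srcCtx) {
    $destTable = New-AzStorageTable -Name $table.Name -Context $destCtx

    foreach ($row in Get-AzTableRow -Table $table.CloudTable) {
        # Rebuild the row's custom properties, dropping the metadata
        # fields AzTable adds to each returned row (assumed names below).
        $properties = @{}
        $row.PSObject.Properties |
            Where-Object { $_.Name -notin 'PartitionKey','RowKey','TableTimestamp','Etag' } |
            ForEach-Object { $properties[$_.Name] = $_.Value }

        Add-AzTableRow -Table $destTable.CloudTable `
            -PartitionKey $row.PartitionKey -RowKey $row.RowKey `
            -Property $properties | Out-Null
    }
}
```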
Summary
In practice, implementing all of this requires quite a lengthy script, which will only grow if you need the process to handle large volumes of data and handle errors to ensure an accurate copy. I've developed a script that handles blobs and fileshares which was sufficiently robust and quick for the hobby project I needed it for. However, it took several hours to copy around 10 accounts, the largest of which contained less than 1 GB of data, so it probably won't scale well to a commercial environment. The script can be found here if you wish to use it as a starting point.