
What would be the best way to copy a blob from one storage account to another storage account using @azure/storage-blob?

I would imagine using streams would be better than downloading the blob and then re-uploading it, but I would like to know whether the code below is a correct/optimal implementation of the streaming approach.

import { ClientSecretCredential } from "@azure/identity";
import { BlobServiceClient } from "@azure/storage-blob";

const srcCredential = new ClientSecretCredential(<src-ten-id>, <src-client-id>, <src-secret>);
const destCredential = new ClientSecretCredential(<dest-ten-id>, <dest-client-id>, <dest-secret>);

const srcBlobClient = new BlobServiceClient(<source-blob-url>, srcCredential);
const destBlobClient = new BlobServiceClient(<dest-blob-url>, destCredential);

const sourceContainer = srcBlobClient.getContainerClient("src-container");
const destContainer = destBlobClient.getContainerClient("dest-container");

const sourceBlob = sourceContainer.getBlockBlobClient("blob");
const destBlob = destContainer.getBlockBlobClient(sourceBlob.name);

// copy blob: download the source and pipe the stream into an upload on the destination
await destBlob.uploadStream((await sourceBlob.download()).readableStreamBody);
yohaansunnie

1 Answer


Your current approach downloads the source blob and then re-uploads it, which means all of the data flows through your client. That is not really optimal.

A better approach would be to make use of async blob copy. The method you would want to use is beginCopyFromURL(string, BlobBeginCopyFromURLOptions). With it, the copy happens server-side between the two storage accounts, so the data does not pass through your client. You would need to create a Shared Access Signature (SAS) URL for the source blob with at least Read permission. You can use the generateBlobSASQueryParameters SDK method to create it.

const sourceBlob = sourceContainer.getBlockBlobClient("blob");
const destBlob = destContainer.getBlockBlobClient(sourceBlob.name);

const sourceBlobSasUrl = GenerateSasUrlWithReadPermissionOnSourceBlob(sourceBlob);
// copy blob
await destBlob.beginCopyFromURL(sourceBlobSasUrl);
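As a rough sketch, the SAS helper used above (GenerateSasUrlWithReadPermissionOnSourceBlob is just a placeholder name) could look something like the following. Note that generateBlobSASQueryParameters needs a StorageSharedKeyCredential built from the source account's name and key; if you only have an Azure AD credential such as ClientSecretCredential, you would use a user delegation key (BlobServiceClient.getUserDelegationKey) instead. The one-hour expiry here is arbitrary.

import {
  BlobSASPermissions,
  generateBlobSASQueryParameters,
  StorageSharedKeyCredential
} from "@azure/storage-blob";

function GenerateSasUrlWithReadPermissionOnSourceBlob(sourceBlob) {
  // assumes the source account name and key are available, e.g. from configuration
  const sharedKeyCredential = new StorageSharedKeyCredential(<src-account-name>, <src-account-key>);

  const sasToken = generateBlobSASQueryParameters(
    {
      containerName: sourceBlob.containerName,
      blobName: sourceBlob.name,
      permissions: BlobSASPermissions.parse("r"),       // read-only
      expiresOn: new Date(Date.now() + 60 * 60 * 1000)  // valid for 1 hour
    },
    sharedKeyCredential
  ).toString();

  // append the SAS token to the blob URL after a '?'
  return `${sourceBlob.url}?${sasToken}`;
}

Also note that beginCopyFromURL returns a poller; if you want to wait for the copy to complete, you can call pollUntilDone() on it: const poller = await destBlob.beginCopyFromURL(sourceBlobSasUrl); await poller.pollUntilDone();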
Gaurav Mantri
  • That's weird. Can you tell me more about your setup? Please edit your question and include your latest code. Also, you should be able to dig deep into the error message and see which HTTP header is not in the correct format. Please share that as well. – Gaurav Mantri Mar 31 '21 at 10:40
  • 1
    Sorry, I hadn't added the blob URL part before the SAS. However, the issue I have now is that it throws an error saying "RestError: Public access is not permitted on this storage account". The storage account is configured to disable public blob access, but it is configured to allow SAS. 'Allow shared key access' is enabled. Will this only work if the public access is enabled? – yohaansunnie Mar 31 '21 at 10:54
  • Please see this thread: https://learn.microsoft.com/en-us/answers/questions/40660/azure-storage-account-blob-service-sas-connectivit.html and this: https://learn.microsoft.com/en-us/azure/storage/blobs/anonymous-read-access-prevent. It seems for this solution to work, public access should be enabled. – Gaurav Mantri Mar 31 '21 at 11:00
  • 1
    It worked!! Sorry, it was the URL again (missed the '?' before SAS query parameter), which was why the public access error came up thinking it was simply a URL. You don't need to have the 'Public Access' enabled on the storage account, just need the SAS access enabled. I was able to perform a copy on a large file and it completed really fast. Thank you for your answer!! – yohaansunnie Mar 31 '21 at 11:16