
I am using Node.js and the Azure SDK v12. I want to copy an existing blob whose access tier is 'Archive'. To do so, I want to copy the blob and write it to the same container under a different blob name, with a changed (rehydrated) access tier.

I could change the access tier of the existing 'Archive' blob directly, but that is not my goal. I want to keep the original blob with access tier 'Archive' and create a new blob with access tier 'Cool' or 'Hot'.

I am proceeding as per the documentation (https://learn.microsoft.com/en-us/azure/storage/blobs/archive-rehydrate-overview).

The code below works if the blob has access tier 'Cool' or 'Hot'. It fails for blobs with access tier 'Archive', though.

Aside: I think the SDK's 'syncCopyFromURL' and 'beginCopyFromURL' do not work for copying blobs with access tier 'Archive'. I get the following errors when I try: 'syncCopyFromURL' gives me "This operation is not permitted on an archived blob.", and 'beginCopyFromURL' gives me "Copy source blob has been modified" - when I check, the blob has not been modified (its last-modified date is in the past).

How do I copy the archived blob and save a new blob in the same container with a different access tier?

const { BlobServiceClient, generateBlobSASQueryParameters, BlobSASPermissions } = require("@azure/storage-blob");

export default async (req, res) => {
    if (req.method === 'POST') {
      
      const connectionString = 'DefaultEndpointsProtocol=...'
      const containerName = 'container';

      const srcFile='filename' // this is the filename as it appears on Azure portal (i.e. the blob name)
      
      async function getSignedUrl(blobClient, options={}){

          options.permissions = options.permissions || "racwd"
          const expiry = 3600;
          const startsOn = new Date();
          const expiresOn = new Date(new Date().valueOf() + expiry * 1000);
        
        
          const token = await generateBlobSASQueryParameters(
              {
                  containerName: blobClient.containerName,
                  blobName: blobClient.name,
                  permissions: BlobSASPermissions.parse(options.permissions),
                  startsOn, // optional
                  expiresOn, // required unless a stored access policy is used
              },
              blobClient.credential,
          );
      
          return `${blobClient.url}?${token.toString()}`;
      }
      
      (async () => {
          try {
              const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
              const containerClient = blobServiceClient.getContainerClient(containerName);
              const sourceBlobClient = containerClient.getBlockBlobClient(srcFile);
              const targetBlobClient = containerClient.getBlockBlobClient('targetFileName');
      
              const url = await getSignedUrl(sourceBlobClient);
              console.log(`source: ${url}`);
              
              const result = await targetBlobClient.syncCopyFromURL(url);
              // const result = await targetBlobClient.beginCopyFromURL(url);
              
              console.log(result)
          } catch (e) {
              console.log(e);
          }
      })();
    }
}

export const config = {
    api: {
      bodyParser: {
        sizeLimit: '1gb',
      },
    },
}
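As an aside, the time-window arithmetic inside getSignedUrl above can be pulled out into a small pure helper, which makes the expiry explicit and easy to check in isolation. The helper name (sasWindow) is hypothetical, not part of the SDK:

```javascript
// Compute the startsOn/expiresOn pair used when generating a SAS token.
// Mirrors the arithmetic in getSignedUrl: expiry is given in seconds.
function sasWindow(expirySeconds, now = new Date()) {
  return {
    startsOn: now,
    expiresOn: new Date(now.valueOf() + expirySeconds * 1000),
  };
}
```

The returned object can be spread directly into the options passed to generateBlobSASQueryParameters.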
Andres R

1 Answer


The main step is changing the access tier of the blob.

With the code below we can set the access tier from JavaScript (note that downloading a blob immediately after archiving it fails with a 409 Conflict):

// Archive the blob, then log the error codes from a download attempt
await blockBlobClient.setAccessTier("Archive");
try {
  // Downloading an archived blockBlob fails
  console.log("// Downloading an archived blockBlob fails...");
  await blockBlobClient.download();
} catch (err) {
  // BlobArchived Conflict (409): This operation is not permitted on an archived blob.
  console.log(
    `requestId - ${err.details.requestId}, statusCode - ${err.statusCode}, errorCode - ${err.details.errorCode}`
  );
  console.log(`error message - ${err.details.message}\n`);
}
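While a blob is being rehydrated, the properties response exposes an archiveStatus field (surfaced from the service's x-ms-archive-status header). A small helper, sketched below with a hypothetical name (isRehydrating), can poll for this; it takes any client exposing getProperties():

```javascript
// Returns true while the blob's rehydration is still in progress.
// Assumes a @azure/storage-blob v12 BlobClient (or compatible stub).
async function isRehydrating(blobClient) {
  const props = await blobClient.getProperties();
  // During rehydration, archiveStatus reads "rehydrate-pending-to-hot"
  // or "rehydrate-pending-to-cool"; it is absent once rehydration is done.
  return (
    typeof props.archiveStatus === "string" &&
    props.archiveStatus.startsWith("rehydrate-pending")
  );
}
```

This is useful because a rehydrating blob cannot be read until the pending status clears.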

And the rest of the operation can be done with the help of a blob-triggered Azure Function that reads the source and writes a new blob, for example (in Python):

import logging
import azure.functions as func
from azure.storage.blob import BlobServiceClient

def main(myblob: func.InputStream):
    try:
        logging.info("Python blob trigger function processed blob\n")
        CONN_STR = "ADD_CON_STR"
        blob_service_client = BlobServiceClient.from_connection_string(CONN_STR)

        # Map the source file
        blob_client = blob_service_client.get_blob_client(container="newcontainer0805", blob="source.txt")

        # Read the source contents
        content = blob_client.download_blob().content_as_text()

        # Client for the output file
        output_file_dest = blob_service_client.get_blob_client(container="target", blob="target.csv")

        # Build the output: column headers first, then the source contents
        data = [["column1", "column2", "column3", "column4"]]
        output_str = '"' + '","'.join(data[0]) + '"\n'
        output_str += content
        output_file_dest.upload_blob(output_str, overwrite=True)
        logging.info("END OF FILE UPLOAD")
    except Exception as e:
        template = "An exception of type {0} occurred. Arguments:\n{1!r}"
        message = template.format(type(e).__name__, e.args)
        print(message)

if __name__ == "__main__":
    main("source.txt")

This copies the blob's contents and prepends a header row. If you want to save the new blob in the same container, set the destination container to the same one as the source.
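Back in Node.js, the rehydrate-via-copy path from the linked documentation can also be sketched directly: beginCopyFromURL accepts a tier (and rehydratePriority) in its options, which asks the service to land the copy in an online tier while leaving the archived source untouched. This is a sketch assuming @azure/storage-blob v12; the function name (rehydrateByCopy) and blob names are placeholders, and the containerClient is assumed to be an already-constructed ContainerClient:

```javascript
// Rehydrate an archived blob by copying it under a new name in the
// same container. The source stays at the "Archive" tier; the copy
// lands at `tier` ("Hot" or "Cool") once rehydration completes.
async function rehydrateByCopy(containerClient, srcName, dstName, tier = "Hot") {
  const src = containerClient.getBlockBlobClient(srcName);
  const dst = containerClient.getBlockBlobClient(dstName);

  // `tier` requests an online destination tier for the copy;
  // `rehydratePriority` may be "Standard" or "High".
  const poller = await dst.beginCopyFromURL(src.url, {
    tier,
    rehydratePriority: "Standard",
  });
  return poller.pollUntilDone(); // resolves when the copy completes
}
```

Within the same storage account a plain blob URL should suffice as the copy source; across accounts the source URL would still need a SAS, as in the question's getSignedUrl helper.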

SaiKarri-MT
  • Thanks for the comment - I solved this problem using the REST API instead => https://stackoverflow.com/questions/69668784/authorization-of-azure-storage-service-rest-api-for-next-js-with-node-js-server – Andres R Nov 03 '21 at 16:09