
I have an Azure Blob trigger function that reads XML files.

import logging
import xml.etree.ElementTree as ET

import azure.functions as func


def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob\n"
                 f"Name: {myblob.name}\n")
    data = myblob.read()
    logging.info(data)
    data = ET.fromstring(data)
    return

Note: the function was created using the VS Code Azure Functions extension.

I get an error that the data is malformed when it goes through the XML parser.

When I check the Application Insights logs for the Azure Function, it seems that it does not fully read the XML file.

The XML file is around 150 KB (about 4,000 lines). Sorry, I cannot give an example as it exceeds the character limit here.

It looks like the file output is truncated.

Is there any way to fully read the blob file?

Thanks

Yanick

1 Answer


It's late, but I will still write this, as it might help others.

There is another option available for reading files: you can generate a SAS URL for the file when the blob trigger fires, and then use that URL to fetch the file from Blob Storage.

Here is an example:

import os
from datetime import datetime, timedelta

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

containerSas = generate_container_sas(
    account_name=os.environ['remoteStorageAccountName'],
    account_key=os.environ['remoteStorageAccountKey'],
    container_name=os.environ['remoteStorageInputContainer'],
    permission=ContainerSasPermissions(read=True),   # read-only access
    expiry=datetime.utcnow() + timedelta(hours=1)    # token valid for one hour
)

url = 'https://' + os.environ['remoteStorageAccountName'] + '.blob.core.windows.net/' + blobName + '?' + containerSas

You can use the URL to fetch the file. I hope this helps.
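To round this out, here is a minimal sketch of how the SAS URL could be consumed and the file parsed with `ElementTree`. The account name, blob name, and SAS string below are placeholder values, not real credentials, and the helper function name is my own:

```python
import xml.etree.ElementTree as ET
from urllib.request import urlopen

def fetch_and_parse(url: str) -> ET.Element:
    """Download the blob via its SAS URL and parse the complete XML body."""
    with urlopen(url) as resp:
        data = resp.read()       # reads the full response, no truncation
    return ET.fromstring(data)   # raises ParseError only if the XML is invalid

# Assembling the URL from placeholder values (mirrors the snippet above):
account = "mystorageaccount"
blob_name = "input-container/sample.xml"
container_sas = "sv=2021-08-06&se=...&sig=..."  # value from generate_container_sas
url = f"https://{account}.blob.core.windows.net/{blob_name}?{container_sas}"
```

Because the whole body is read into memory before parsing, the 150 KB file from the question is handled in one piece; only the logging output in Application Insights is truncated, not the data itself.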