
I have a problem defining multiple outputs for a blob trigger in Azure Functions using Python. How can I write every iteration of the while loop to the output blob? Currently I only get the last entity (each write replaces the previous one). Here is the code for `__init__.py`:

import logging
import json
import azure.functions as func
from urllib.request import urlopen

#from azure.storage import blob

def main(inputblob: func.InputStream,
        outputblob: func.Out[func.InputStream]):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {inputblob.name}\n"
                 f"Blob Size: {inputblob.length} bytes")
    
    blob = inputblob.read()
    blob_invoices = json.loads(blob)
    Number_of_Invoices = len(blob_invoices)
    j = 0
    while j < Number_of_Invoices:
        Invoice_url = blob_invoices[j]["invoiceFile"]["url"]
        invoice_opening = urlopen(Invoice_url)

        invoice_content = invoice_opening.read()
        outputblob.set(invoice_content)  # each iteration overwrites the previous write
        j += 1


and function.json:

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "inputblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "data4test1/{name}.json",
      "connection": "data4test_STORAGE"
    },
    {
      "name": "outputblob",
      "type": "blob",
      "direction": "out",
      "path": "data4test2/{name}.xml",
      "connection": "data4test_STORAGE"
    }

  ],
  "disabled": false
}
Kave

1 Answer


The following `__init__.py` sample code shows how to set multiple output bindings from an Azure Functions blob trigger using Python:

import logging
import azure.functions as func


def main(inputblob: func.InputStream,
         outputblob1: func.Out[bytes],
         outputblob2: func.Out[bytes],
         outputblob3: func.Out[bytes]):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {inputblob.name}\n"
                 f"Blob Size: {inputblob.length} bytes")
    # Read the stream once; each .set() call writes to its own output binding.
    content = inputblob.read()
    outputblob1.set(content)
    outputblob2.set(content)
    outputblob3.set(content)

function.json file:

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "inputblob",
      "type": "blobTrigger",
      "direction": "in",
      "dataType": "binary",
      "path": "blobcontainer/{name}",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "outputblob1",
      "type": "blob",
      "path": "uploadblobcontainer/{blobtrigger}-first",
      "dataType": "binary",
      "connection": "AzureWebJobsStorage",
      "direction": "out"
    },
    {
      "name": "outputblob2",
      "type": "blob",
      "path": "uploadblobcontainer/{blobtrigger}-second",
      "dataType": "binary",
      "connection": "AzureWebJobsStorage",
      "direction": "out"
    },
    {
      "name": "outputblob3",
      "type": "blob",
      "path": "uploadblobcontainer/{blobtrigger}-third",
      "dataType": "binary",
      "connection": "AzureWebJobsStorage",
      "direction": "out"
    }
  ]
}

Updated Answer

The `list_blobs` method lists the blobs in a container and returns a generator. The code below prints the name of each blob in the container to the console (it assumes the legacy azure-storage-blob v2 SDK; the account name and key are placeholders):

from azure.storage.blob import BlockBlobService

# Placeholder credentials; in a real function, read these from app settings.
block_blob_service = BlockBlobService(account_name='myaccount', account_key='mykey')

generator = block_blob_service.list_blobs('mycontainer')
for blob in generator:
    print(blob.name)

Similarly, define your required functionality inside the for loop.
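Applying the same idea to the original question: since the number of invoices is not known in advance, the blob SDK can write one distinct blob per invoice instead of relying on a fixed set of output bindings. A rough sketch, assuming the legacy azure-storage-blob v2 SDK; the helper names `invoice_urls` and `upload_invoices`, the credentials, and the container name are placeholders, not part of the original code:

```python
import json
from urllib.request import urlopen


def invoice_urls(blob_bytes: bytes) -> list:
    """Extract every invoice URL from the triggering JSON blob."""
    return [item["invoiceFile"]["url"] for item in json.loads(blob_bytes)]


def upload_invoices(blob_bytes: bytes, account_name: str, account_key: str,
                    container: str = 'data4test2') -> None:
    """Download each invoice and upload it as its own blob, so no write
    overwrites a previous one. Requires the legacy azure-storage-blob v2 SDK."""
    from azure.storage.blob import BlockBlobService
    service = BlockBlobService(account_name=account_name, account_key=account_key)
    for j, url in enumerate(invoice_urls(blob_bytes)):
        content = urlopen(url).read()
        # One distinct blob name per invoice instead of a single output binding.
        service.create_blob_from_bytes(container, f'invoice-{j}.xml', content)
```

Inside the function body, `upload_invoices(inputblob.read(), ...)` would replace the `outputblob.set(...)` loop, and the `outputblob` binding could be dropped from `function.json` entirely.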

Delliganesh Sevanesan
  • Many thanks, But what if we don't know how many outputs we have? In my case, it would be from 1 file into 100 maybe. I think we need a loop. Think so? – Kave Oct 28 '21 at 08:10
  • Hi @Kave, I think there are no limits for the number of blob containers, blobs, entities, queues, tables, file shares, or messages. But we have limits to blob storage size and each block size inside the blob, depends on type of blob. Please refer this [Azure storage limits at a glance](https://cloud.netapp.com/blog/azure-anf-blg-azure-storage-limits-at-a-glance) for more information. –  Oct 28 '21 at 08:28
  • Thanks @HariKrishnaRajoli-MT I think so, there is no limit. And the blob size is not a major issue (they're small files). But the problem is, I can't determine how many blobs would be created for output. I'm wondering how can we introduce a loop, instead of defining so many outputblobs. – Kave Oct 28 '21 at 09:35
  • @Kave, please open another thread on this question (how can we introduce a loop, instead of defining so many outputblobs) in SO –  Oct 28 '21 at 10:00
  • @KitKat99 Do you have any idea? I see you had a similar case and managed to solve it. – Kave Nov 04 '21 at 10:37
  • Many thanks for the time. But I think it suits the cases with many blobs in a container. In my case, we have one JSON file that contains unknown Url (link to an XML file). Anyway, I try to check this approach, as well. Now, I'm trying to use `return` instead of `.set(outputblob)`. In this way, we can have many outputs: [Using the Azure Function return value](https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-return-value?tabs=python) – Kave Nov 08 '21 at 10:11