
I'm using Event Hubs Capture to write events from my Event Hub to a storage container. In the Capture settings I chose the blob file path format as

{Namespace}/{EventHub}/{PartitionId}/{Year}/{Month}/{Day}/{Hour}/{Minute}/{Second}

Now, how do I give a dynamic blob path in the blob trigger?

I have written the following:

@FunctionName("AzureBlogTriggerFn5")
    public void blobHandler(
            @BlobTrigger(name = "content", path = "uts-blobcontainer-nb-dev/uts-eventhubns/uts-nb-eventhub/{partition}/{yyyy}/{MM}/{dd}/{HH}/{mm}/{ss}/{fileName}", dataType = "binary", connection = "AzureWebJobsStorage") byte[] content,
            @BindingName("fileName") String fileName,
            final ExecutionContext context
    ) throws StorageException, IOException, URISyntaxException, InvalidKeyException, InterruptedException {
        context.getLogger().info("Java Blob trigger function processed a blob. Name: " + fileName + "\n  Size: " + content.length + " Bytes");
    }

I'm getting the following error:

**2019-12-19T15:05:52.576 [Error] Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.DecompressServiceFunctionNB'. System.Private.CoreLib: An item with the same key has already been added. Key: mm.**

Please suggest how to give a dynamic path for partitionId, year, month, day, hour, minute, and second in a blob trigger in Azure.

1 Answer


I believe the error is because you have both {MM} and {mm} in the path; the binding expression keys appear to be matched case-insensitively, so the two end up as the same key "mm".

When using blob name patterns in a trigger, the keys can be any string. In your case, something like this should work instead:

path = "uts-blobcontainer-nb-dev/uts-eventhubns/uts-nb-eventhub/{partitionId}/{year}/{month}/{day}/{hour}/{minute}/{second}/{fileName}"

Then bind the ones you require in your function, as in the sketch below.
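
For example, here is a rough sketch of how the function could look with the renamed keys. The class name and the particular subset of bound parameters are just illustrative choices; the container, namespace, and connection names are the ones from your own path.

import com.microsoft.azure.functions.ExecutionContext;
import com.microsoft.azure.functions.annotation.BindingName;
import com.microsoft.azure.functions.annotation.BlobTrigger;
import com.microsoft.azure.functions.annotation.FunctionName;

public class CaptureBlobTrigger {

    // Renamed keys ({partitionId}, {year}, ...) avoid the case-insensitive clash between {MM} and {mm}.
    @FunctionName("AzureBlobTriggerFn5")
    public void blobHandler(
            @BlobTrigger(
                    name = "content",
                    path = "uts-blobcontainer-nb-dev/uts-eventhubns/uts-nb-eventhub/{partitionId}/{year}/{month}/{day}/{hour}/{minute}/{second}/{fileName}",
                    dataType = "binary",
                    connection = "AzureWebJobsStorage") byte[] content,
            @BindingName("partitionId") String partitionId, // bind only the path parts you actually need
            @BindingName("year") String year,
            @BindingName("fileName") String fileName,
            final ExecutionContext context) {
        context.getLogger().info("Blob " + fileName + " from partition " + partitionId
                + " (" + year + "), size: " + content.length + " bytes");
    }
}

Any placeholder you leave unbound is still matched against the blob name; you only add a @BindingName parameter for the values you want to use inside the function.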
PramodValavala