I have an Azure Function with a blob trigger. The function should fire once a day, but it only works sometimes. I found that if the function is not awake when a file arrives at the path used as the trigger, it doesn't launch and the file is not processed. If I go to the portal and refresh the function, it starts working and processes all the queued files.
Is there a way to make the function trigger without "refreshing" it?
The function is written in Python and is deployed to Azure using an Azure DevOps pipeline.
I attach the host.json configuration for more details:
{
  "version": "2.0",
  "logging": {
    "fileLoggingMode": "always",
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true
      }
    }
  },
  "extensions": {
    "queues": {
      "maxPollingInterval": "00:00:02",
      "visibilityTimeout": "00:00:30",
      "batchSize": 8,
      "maxDequeueCount": 5,
      "newBatchThreshold": 4,
      "messageEncoding": "base64"
    }
  },
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[3.3.0, 4.0.0)"
  },
  "functionTimeout": "-1",
  "retry": {
    "strategy": "fixedDelay",
    "maxRetryCount": 0,
    "delayInterval": "00:00:05"
  }
}
Below you can find the function.json file with the trigger details:
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "in_str",
      "type": "blobTrigger",
      "direction": "in",
      "path": "container/path_to_file/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
And this is part of the code of the Python function, file __init__.py:
import azure.functions as func


def main(in_str: func.InputStream):
    # The blob URI looks like https://account.blob.core.windows.net/container/path/file;
    # collapsing "//" and splitting on "/" gives [scheme, host, container, *path, file],
    # so index 2 is the container and everything after it is the blob key.
    uri = in_str.uri
    split_uri = uri.replace("//", "/").split("/")
    source_bucket = split_uri[2]
    source_key = '/'.join(split_uri[3:])
    source_directory = '/'.join(split_uri[3:-1])
    file_name = source_key.split('/')[-1]
    print(f"File name is {file_name}")