I want to do the same processing (with a few changes based on message data) for two different event hubs.
Is it possible to attach two consumer groups to a single function?
It did not work even though I added them to function.json.
The short answer is no. You cannot bind multiple input triggers to the same function: https://github.com/Azure/azure-webjobs-sdk-script/wiki/function.json
A function can only have a single trigger binding, and can have multiple input/output bindings.
However, you can call the same "shared" code from multiple functions by either wrapping the shared code in a helper method, or using Precompiled Functions.
Recommended practice here is to share business logic between functions by using the fact that a single function app can be composed of multiple functions.
MyFunctionApp
| host.json
|____ business
| |____ logic.js
|____ function1
| |____ index.js
| |____ function.json
|____ function2
| |____ index.js
| |____ function.json
In "function1/index.js" and "function2/index.js"
var logic = require("../business/logic");
module.exports = logic;
The function.json of function1 and function2 can be configured to different triggers.
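For the event hub scenario in the question, each function.json could point at a different event hub and consumer group. A minimal sketch (the binding name, hub name, connection setting, and consumer group below are illustrative, not taken from the question):

```
{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "direction": "in",
      "name": "event",
      "eventHubName": "eh1",
      "consumerGroup": "$Default",
      "connection": "EH1_CONN_STR"
    }
  ]
}
```

function2/function.json would be identical except for the eventHubName, consumerGroup, and connection values.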
In "business/logic.js":
module.exports = function (context, req) {
    // This is where shared code goes. As an example, for an HTTP trigger:
    context.res = {
        body: "<b>Hello World</b>",
        status: 201,
        headers: {
            'content-type': "text/html"
        }
    };
    context.done();
};
Is it possible to attach two consumer groups to a single function?
This assumes you're looking for a trigger and don't want to do your own polling with EventProcessorClient inside your function. (You could schedule a function to periodically fetch messages from multiple event hubs, but then you'd have to reimplement all the built-in behavior you get for free with triggers: polling, handling multiple partitions, checkpointing, scaling, and so on.)
A couple of workarounds:
First, enable Event Hub Capture on each hub and trigger your function from the resulting Microsoft.EventHub.CaptureFileCreated Event Grid event. The event your function receives looks like this:
{
  "topic": "/subscriptions/9fac-4e71-9e6b-c0fa7b159e78/resourcegroups/kash-test-01/providers/Microsoft.EventHub/namespaces/eh-ns",
  "subject": "eh-1",
  "eventType": "Microsoft.EventHub.CaptureFileCreated",
  "id": "b5aa3f62-15a1-497a-b97b-e688d4368db8",
  "data": {
    "fileUrl": "https://xxx.blob.core.windows.net/capture-fs/eh-ns/eh-1/0/2020/10/28/21/39/01.avro",
    "fileType": "AzureBlockBlob",
    "partitionId": "0",
    "sizeInBytes": 8011,
    "eventCount": 5,
    "firstSequenceNumber": 5,
    "lastSequenceNumber": 9,
    "firstEnqueueTime": "2020-10-28T21:40:28.83Z",
    "lastEnqueueTime": "2020-10-28T21:40:28.908Z"
  },
  "dataVersion": "1",
  "metadataVersion": "1",
  "eventTime": "2020-10-28T21:41:02.2472744Z"
}
Obviously this is not real-time: the minimum capture window you can set is 1 minute, and there may be a small additional delay between the captured avro file being written and your function being invoked.
Second, define two functions in the same function app, one per event hub, and have both delegate to a shared method (the same shared-code idea as above, here in Java):
public class EhConsumerFunctions {

    private void processEvent(String event) {
        // shared processing logic...
    }

    @FunctionName("eh1_consumer")
    public void eh1_consumer(
            @EventHubTrigger(name = "event", eventHubName = "eh1", connection = "EH1_CONN_STR") String event,
            final ExecutionContext context) {
        processEvent(event);
    }

    @FunctionName("eh2_consumer")
    public void eh2_consumer(
            @EventHubTrigger(name = "event", eventHubName = "eh2", connection = "EH2_CONN_STR") String event,
            final ExecutionContext context) {
        processEvent(event);
    }
}
and define EH1_CONN_STR and EH2_CONN_STR in your app settings.
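For local development those settings would live in local.settings.json; in Azure they go in the function app's Application Settings. A sketch with placeholder values:

```
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage connection string>",
    "FUNCTIONS_WORKER_RUNTIME": "java",
    "EH1_CONN_STR": "Endpoint=sb://<namespace-1>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key>",
    "EH2_CONN_STR": "Endpoint=sb://<namespace-2>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key>"
  }
}
```

The `connection` property on each @EventHubTrigger refers to these setting names, not to the connection strings themselves.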