
I want to implement the same functionality (with a few changes based on message data) from two different Event Hubs.

Is it possible to attach two consumer groups to a single function?

It did not work even though I added it to function.json.

halfer
Kannaiyan

3 Answers


The short answer is no. You cannot bind multiple input triggers to the same function: https://github.com/Azure/azure-webjobs-sdk-script/wiki/function.json

A function can only have a single trigger binding, and can have multiple input/output bindings.

However, you can call the same "shared" code from multiple functions by either wrapping the shared code in a helper method, or using Precompiled Functions.

Pete M
  • I created a copy of the same function and triggered it with a different trigger. It obviously adds a junk copy of the same function code to Azure, if Azure does not address it in time. It is not just the copy; there are unnecessary activities related to it as well. – Kannaiyan Oct 20 '17 at 17:14

Recommended practice here is to share business logic between functions by using the fact that a single function app can be composed of multiple functions.

MyFunctionApp
|     host.json
|____ business
|     |____ logic.js
|____ function1
|     |____ index.js
|     |____ function.json
|____ function2
      |____ index.js
      |____ function.json

In "function1/index.js" and "function2/index.js":

var logic = require("../business/logic");

module.exports = logic;

The function.json of function1 and function2 can be configured to different triggers.
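As an illustrative sketch for the Event Hub case in the question (the hub names, app setting names, and consumer group are assumptions, not part of the original answer), "function1/function.json" could look like:

```json
{
    "bindings": [
        {
            "type": "eventHubTrigger",
            "direction": "in",
            "name": "eventHubMessage",
            "eventHubName": "eh-1",
            "consumerGroup": "$Default",
            "connection": "EH1_CONN_STR"
        }
    ]
}
```

"function2/function.json" would have the same shape but point at the second hub (e.g. "eh-2") and its own connection setting.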

In "business/logic.js":

module.exports = function (context, req) {
    // This is where shared code goes. As an example, for an HTTP trigger:
    context.res = {
        body: "<b>Hello World</b>",
        status: 201,
        headers: {
            'content-type': "text/html"
        }
    };     
    context.done();
};
Marie Hoeger
    These are all hacks. Function App with the infrastructure should allow multiple input triggers. AWS has this functionality already. I can trigger via a stream or queue or sns or http while the code remains the same. – Kannaiyan May 08 '18 at 06:44
  • I think this is a complicated question (as this AWS forum post demonstrates: https://forums.aws.amazon.com/message.jspa?messageID=703674). What is your use case for wanting multiple triggers? If you do want to trigger the same code via a stream or queue or http etc., I'd imagine that the code coming from different trigger types would have to have some sort of conditional logic on input type which feels a bit unclean to me. – Marie Hoeger May 08 '18 at 18:48
  • Could you also tell me more about what you mean by "Function App with the infrastructure should allow multiple input triggers"? A Function App allows for multiple input triggers, but each trigger must be defined separately. I do agree that using shared code outside of the immediate index.js is more of an advanced scenario that can be confusing for Portal-only developers, but should be relatively intuitive from local development. – Marie Hoeger May 08 '18 at 18:52

Is it possible to attach two consumer groups to a single function?

I'm assuming you're looking for a trigger and don't want to do your own polling with EventProcessorClient inside your function. You could schedule a function that periodically uses the API to fetch messages from multiple event hubs and process them, but then you would have to implement all the built-in logic (polling, handling multiple partitions, checkpointing, scaling, ...) that you get for free with triggers.

A couple of workarounds:

  1. Capture: If the event hubs are in the same namespace, you can enable Capture on all of them, then create an Event Grid trigger for your function. You'll get a message with the path of the capture file. E.g.:
{
    "topic": "/subscriptions/9fac-4e71-9e6b-c0fa7b159e78/resourcegroups/kash-test-01/providers/Microsoft.EventHub/namespaces/eh-ns",
    "subject": "eh-1",
    "eventType": "Microsoft.EventHub.CaptureFileCreated",
    "id": "b5aa3f62-15a1-497a-b97b-e688d4368db8",
    "data": {
        "fileUrl": "https://xxx.blob.core.windows.net/capture-fs/eh-ns/eh-1/0/2020/10/28/21/39/01.avro",
        "fileType": "AzureBlockBlob",
        "partitionId": "0",
        "sizeInBytes": 8011,
        "eventCount": 5,
        "firstSequenceNumber": 5,
        "lastSequenceNumber": 9,
        "firstEnqueueTime": "2020-10-28T21:40:28.83Z",
        "lastEnqueueTime": "2020-10-28T21:40:28.908Z"
    },
    "dataVersion": "1",
    "metadataVersion": "1",
    "eventTime": "2020-10-28T21:41:02.2472744Z"
}

Obviously this is not real-time: the minimum capture window you can set is 1 minute, and there may be a small delay between the time the captured Avro file is written and your function being invoked.
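A minimal sketch of the Event Grid-triggered function for this workaround (JavaScript; the names are illustrative). It only extracts the capture-file URL from the payload shown above; downloading and decoding the Avro blob is left out:

```javascript
// Handles Microsoft.EventHub.CaptureFileCreated events from Event Grid.
function handleCapture(context, eventGridEvent) {
    if (eventGridEvent.eventType === "Microsoft.EventHub.CaptureFileCreated") {
        const hub = eventGridEvent.subject;          // e.g. "eh-1"
        const fileUrl = eventGridEvent.data.fileUrl; // blob holding the Avro capture
        context.log("capture file for " + hub + ": " + fileUrl);
        // Download fileUrl and decode the Avro records here.
    }
    context.done();
}

module.exports = handleCapture;
```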

  2. At least in Java there is no restriction that each function must live in a separate class, so you can do this:
import com.microsoft.azure.functions.ExecutionContext;
import com.microsoft.azure.functions.annotation.EventHubTrigger;
import com.microsoft.azure.functions.annotation.FunctionName;

public class EhConsumerFunctions {
    // Shared processing logic used by both triggers.
    private void processEvent(String event) {
        // process...
    }

    @FunctionName("eh1_consumer")
    public void eh1_consumer(
        @EventHubTrigger(name = "event", eventHubName = "eh-1", connection = "EH1_CONN_STR") String event,
        final ExecutionContext context) {
        processEvent(event);
    }

    @FunctionName("eh2_consumer")
    public void eh2_consumer(
        @EventHubTrigger(name = "event", eventHubName = "eh-2", connection = "EH2_CONN_STR") String event,
        final ExecutionContext context) {
        processEvent(event);
    }
}

and define EH1_CONN_STR and EH2_CONN_STR in your app settings.
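For local development, the two settings could go in local.settings.json (a sketch; the values are placeholders, not real connection strings):

```json
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "java",
        "EH1_CONN_STR": "Endpoint=sb://eh-ns.servicebus.windows.net/;SharedAccessKeyName=listen;SharedAccessKey=<key1>",
        "EH2_CONN_STR": "Endpoint=sb://eh-ns.servicebus.windows.net/;SharedAccessKeyName=listen;SharedAccessKey=<key2>"
    }
}
```

In Azure, the same names go into the Function App's application settings.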

Kashyap