
In an Azure Function, how do we make the Kafka producer connection a singleton, or use connection pooling? Each time the function is triggered, a new Kafka connection is created.

import json
import logging
import os, time
import azure.functions as func
from confluent_kafka import Producer

def main(event: func.EventGridEvent):

    # Kafka configuration
    conf = {
        'bootstrap.servers': os.getenv('AZURE_EVENT_HUB_SERVER') + ':9093',
        'security.protocol': 'SASL_SSL',
        'sasl.mechanism': 'PLAIN',
        'sasl.username': '$ConnectionString',
        'sasl.password': os.getenv('AZURE_EVENT_HUB_CONN_STRING')
    }


    data = event.get_json()
    topic = "events"

    p = Producer(**conf)

    if topic is not None:
        try:
            # event.get_json() returns a dict, so index it; produce() expects str/bytes
            p.produce(topic, key=str(data.get('id')), value=json.dumps(data))
        except BufferError:
            logging.error('%% Local producer queue is full (%d messages awaiting delivery): try again\n', len(p))
        # Wait until all messages have been delivered
        p.flush()

        logging.info(f'Successfully completed the processing for the event: {event.get_json()}')
    else:
        logging.error('Failed: no topic configured')
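
For reference, a common pattern is to hoist the producer to module scope so it is created once per worker process and reused across invocations while the instance stays warm. Below is a minimal sketch of that approach, assuming the same AZURE_EVENT_HUB_* environment variables as above; the get_producer helper is illustrative, not part of any library API:

import json
import logging
import os

import azure.functions as func
from confluent_kafka import Producer

_producer = None  # cached per worker process; reused across warm invocations

def get_producer() -> Producer:
    # Hypothetical helper: lazily build the Producer once, then reuse it.
    global _producer
    if _producer is None:
        _producer = Producer({
            'bootstrap.servers': os.getenv('AZURE_EVENT_HUB_SERVER') + ':9093',
            'security.protocol': 'SASL_SSL',
            'sasl.mechanism': 'PLAIN',
            'sasl.username': '$ConnectionString',
            'sasl.password': os.getenv('AZURE_EVENT_HUB_CONN_STRING'),
        })
    return _producer

def main(event: func.EventGridEvent):
    data = event.get_json()
    p = get_producer()
    p.produce('events', key=str(data.get('id')), value=json.dumps(data))
    # poll(0) serves delivery callbacks without blocking; calling flush() on
    # every invocation would block until the broker acknowledges the message.
    p.poll(0)
    logging.info('Queued event %s for delivery', data.get('id'))

Two things worth noting: confluent_kafka's Producer (librdkafka) is thread-safe, so a single shared instance is generally sufficient, and the per-invocation flush() in the question blocks until delivery completes, which can cost more than the connection setup itself.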
  • What happens if you create the connection in global scope? i.e. outside of the `main` function? – C.Nivs Dec 08 '21 at 06:34
  • I tried that: defining the producer outside `main` and reusing the returned Kafka connection, but it was too slow when handling multiple requests. So I want to know whether there is a standard way to do connection pooling in Python. – KARTHICK JOTHIMANI Feb 14 '22 at 08:18
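
On the pooling question in the comment above: confluent_kafka does not ship a connection-pool API, and since the Producer is thread-safe a single shared instance is usually enough. If a pool is still wanted, a minimal hand-rolled sketch could look like the following; the pool size of 4, the _make_producer and borrow_producer helpers, and the send wrapper are all assumptions for illustration:

import os
import queue
from contextlib import contextmanager

from confluent_kafka import Producer

POOL_SIZE = 4  # assumed value; tune to the worker's concurrency

def _make_producer() -> Producer:
    # Same configuration as in the question.
    return Producer({
        'bootstrap.servers': os.getenv('AZURE_EVENT_HUB_SERVER') + ':9093',
        'security.protocol': 'SASL_SSL',
        'sasl.mechanism': 'PLAIN',
        'sasl.username': '$ConnectionString',
        'sasl.password': os.getenv('AZURE_EVENT_HUB_CONN_STRING'),
    })

# Filled once at import time, so the producers survive across invocations.
_pool: "queue.Queue[Producer]" = queue.Queue(maxsize=POOL_SIZE)
for _ in range(POOL_SIZE):
    _pool.put(_make_producer())

@contextmanager
def borrow_producer():
    # Block until a producer is free; always hand it back to the pool.
    p = _pool.get()
    try:
        yield p
    finally:
        _pool.put(p)

def send(topic: str, key: str, value: str) -> None:
    with borrow_producer() as p:
        p.produce(topic, key=key, value=value)
        p.poll(0)  # serve delivery callbacks without blocking

The queue both hands out idle producers and applies backpressure: if all producers are borrowed, _pool.get() blocks until one is returned, so concurrent invocations never share or exceed the fixed set of connections.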

0 Answers