
I'm relatively new to confluent-kafka python and I am having a hard time figuring out how to send the logs from the kafka library to the program's logger.

I am using a SerializingProducer and, according to the docs here: https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#serde-producer I should be able to pass a logger as a config entry.

Something like:

import logging

logger = logging.getLogger()
producer_config = {
    "bootstrap.servers": "https....",
    ...,
    "logger": logger
}
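For completeness, the logger itself is configured with a handler and a level before it is passed in. A minimal version of that setup (the format string is illustrative, not the exact one I use):

```python
import logging

# Configure the root logger with an explicit handler and level so that
# any records forwarded to it are actually emitted somewhere visible.
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)

handler = logging.StreamHandler()  # writes to stderr by default
handler.setFormatter(
    logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
)
logger.addHandler(handler)

logger.info("logger is configured")  # sanity check: this line does appear
```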

However, this does not seem to work as intended: the only logs visible are the ones produced directly by the program. Removing the line

"logger": logger

allows the kafka library to correctly show its logs in the console.

My producer looks as follows:

import logging

import confluent_kafka

log = logging.getLogger(__name__)


class MyProducer(confluent_kafka.SerializingProducer):

    def __init__(self, topic, producer_config, source_folder) -> None:
        producer_config["logger"] = log

        super().__init__(producer_config)
        log.info(f"Producer initialized with the following conf:\n{producer_config}")

Any idea why this happens and how to fix it? Other information available:

  • the kafka library logs to stderr
  • passing a Handler directly rather than a Logger does not fix the problem; it duplicates the program's output instead
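On the second bullet, my guess (illustrated below with plain `logging` only, no kafka involved; the logger name is made up) is that the duplication comes from records being handled by the extra handler *and* propagating up to the root logger's handler:

```python
import io
import logging

# Root logger writes to one stream (standing in for my program's console output).
root_stream = io.StringIO()
logging.basicConfig(stream=root_stream, level=logging.INFO, force=True)

# A child logger gets an extra handler attached directly.
child = logging.getLogger("kafka.demo")
extra_stream = io.StringIO()
child.addHandler(logging.StreamHandler(extra_stream))

# One log call, two emissions: once via the extra handler,
# once via propagation to the root handler.
child.info("hello")
```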

Thanks in advance.

fedmag