
I am trying to configure SSL in Kafka (installed locally on my Windows machine), using the confluent-kafka Python client. Most of the solutions I have found are for Java and involve creating a truststore, a keystore, and a JAAS configuration, none of which is clearly explained. It is also not clear what changes I have to make in the properties files (server/producer/consumer).
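
From what I understand, the broker needs a SASL_SSL listener before any client configuration will matter. This is only a sketch of what I think server.properties would need, based on the Kafka documentation; the keystore/truststore paths, passwords, and the user entry are placeholders I have not verified:

# server.properties (guessed additions, not applied yet)
listeners=PLAINTEXT://localhost:9092,SASL_SSL://localhost:9093
advertised.listeners=PLAINTEXT://localhost:9092,SASL_SSL://localhost:9093

# broker certificate and trust material (Java keystores)
ssl.keystore.location=C:/kafka/ssl/kafka.server.keystore.jks
ssl.keystore.password=keystore-password
ssl.key.password=key-password
ssl.truststore.location=C:/kafka/ssl/kafka.server.truststore.jks
ssl.truststore.password=truststore-password

# SASL/PLAIN broker users, declared inline instead of a separate JAAS file
sasl.enabled.mechanisms=PLAIN
listener.name.sasl_ssl.plain.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
    username="user" \
    password="password" \
    user_user="password";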

Here's producer.py

# ./producer.py

import certifi
from confluent_kafka import Producer

if __name__ == '__main__':

    topic = "demo_ssl"
    conf = {
        'bootstrap.servers': "localhost:9093",
        'security.protocol': 'SASL_SSL',

        'ssl.ca.location': certifi.where(),

        'sasl.mechanism': 'PLAIN',
        'sasl.username': 'user',
        'sasl.password':'password',
    }

    # Create Producer instance
    producer = Producer(**conf)
    delivered_records = 0

    # on_delivery handler (triggered by poll() or flush())
    # when a message has been successfully delivered or permanently failed delivery after retries.
    def acked(err, msg):
        """Delivery report handler called on
        successful or failed delivery of a message."""
        global delivered_records
        if err is not None:
            print("Failed to deliver message: {}".format(err))
        else:
            delivered_records += 1
            print("Produced record to topic {} partition [{}] @ offset {}".format(msg.topic(), msg.partition(), msg.offset()))

    
    for n in range(10):
        record_key = "messageKey" + str(n)
        record_value = "messageValue" + str(n)
        print("Producing record: {}\t{}".format(record_key, record_value))
        producer.produce(topic, key=record_key, value=record_value, on_delivery=acked)
        # producer.poll() serves delivery reports (on_delivery) from previous produce() calls.
        producer.poll(0)

    
    producer.flush()
    print("{} messages were producerd to topic {}!".format(delivered_records, topic))

Here's consumer.py

# .\consumer.py

import certifi
from confluent_kafka import Consumer


if __name__ == '__main__':

    topic = 'demo_ssl'
    conf = {
        'bootstrap.servers': 'localhost:9093',
        'security.protocol': 'SASL_SSL',
        'group.id': 'group_ssl',
        'ssl.ca.location': certifi.where(),

        'sasl.mechanism': 'PLAIN',
        'sasl.username': 'user',
        'sasl.password': 'password',
    }

    # Create Consumer instance
    consumer = Consumer(conf)

    # Subscribe to topic
    consumer.subscribe([topic])

    # Process messages
    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None:
                # No message available within timeout.
                # Initial message consumption may take up to 
                # `session.timeout.ms` for the consumer group to
                # rebalance and start consuming
                print("Waiting for message or event/error in poll()")
                continue
            elif msg.error():
                print('error: {}'.format(msg.error()))
            else:
                # Check for Kafka message
                record_key = "Null" if msg.key() is None else msg.key().decode('utf-8')
                record_value = msg.value().decode('utf-8')
                print("Consumed record with key " + record_key + " and value " + record_value)
    except KeyboardInterrupt:
        pass
    finally:
        print("Leave group and commit final offsets")
        consumer.close()

I have not changed any server configuration in server.properties, nor in consumer.properties or producer.properties. When I run the consumer I get this error:

Waiting for message or event/error in poll()
Waiting for message or event/error in poll()
%3|1641896886.032|FAIL|rdkafka#consumer-1| [thrd:sasl_ssl://localhost:9093/bootstrap]: sasl_ssl://localhost:9093/bootstrap: Connect to ipv6#[::1]:9093 failed: Unknown error (after 2048ms in state CONNECT)

and for producer.py:

Producing record: messageKey0   messageValue0
Producing record: messageKey1   messageValue1
Producing record: messageKey2   messageValue2
Producing record: messageKey3   messageValue3
Producing record: messageKey4   messageValue4
Producing record: messageKey5   messageValue5
Producing record: messageKey6   messageValue6
Producing record: messageKey7   messageValue7
Producing record: messageKey8   messageValue8
Producing record: messageKey9   messageValue9
%3|1641897033.087|FAIL|rdkafka#producer-1| [thrd:sasl_ssl://localhost:9093/bootstrap]: sasl_ssl://localhost:9093/bootstrap: Connect to ipv4#127.0.0.1:9093 failed: Unknown error (after 2056ms in state CONNECT)
%3|1641897036.073|FAIL|rdkafka#producer-1| [thrd:sasl_ssl://localhost:9093/bootstrap]: sasl_ssl://localhost:9093/bootstrap: Connect to ipv6#[::1]:9093 failed: Unknown error (after 2041ms in state CONNECT)
%3|1641897039.120|FAIL|rdkafka#producer-1| [thrd:sasl_ssl://localhost:9093/bootstrap]: sasl_ssl://localhost:9093/bootstrap: Connect to ipv6#[::1]:9093 failed: Unknown error (after 2062ms in state CONNECT, 1 identical error(s) suppressed)
%3|1641897042.133|FAIL|rdkafka#producer-1| [thrd:sasl_ssl://localhost:9093/bootstrap]: sasl_ssl://localhost:9093/bootstrap: Connect to ipv4#127.0.0.1:9093 failed: Unknown error (after 2055ms in state CONNECT)
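
The "Unknown error" in the CONNECT state is not very informative, so I assume I can turn on librdkafka's client-side debugging to see what the connection and handshake are actually doing (the `debug` property is a standard librdkafka setting; the contexts chosen here are just my guess at the useful ones):

# debug_consumer.py - same settings as consumer.py, with librdkafka debug output enabled
import certifi
from confluent_kafka import Consumer

conf = {
    'bootstrap.servers': 'localhost:9093',
    'security.protocol': 'SASL_SSL',
    'group.id': 'group_ssl_debug',
    'ssl.ca.location': certifi.where(),
    'sasl.mechanism': 'PLAIN',
    'sasl.username': 'user',
    'sasl.password': 'password',
    # log broker connection attempts and TLS/SASL handshake details to stderr
    'debug': 'broker,security',
}

consumer = Consumer(conf)
consumer.subscribe(['demo_ssl'])
print(consumer.poll(5.0))  # a single poll is enough to see the connection attempt in the debug log
consumer.close()
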
Comments:
  • 1) `sasl.mechanism` should not be `PLAIN` if you are using `SASL_SSL`; you'll want to try `SCRAM-SHA-256` 2) You need to show your actual broker settings. If you can get `kafka-console-producer` to work with JKS certificates, then you can extract PEM files needed for Python from it – OneCricketeer Jan 11 '22 at 16:33
  • @OneCricketeer about the first point: I used the configuration which I found on confluent's example for python https://docs.confluent.io/platform/current/tutorials/examples/clients/docs/python.html – PanicLion Jan 11 '22 at 17:59
  • Those examples are for Confluent Cloud, though, not running your own secured Kafka instances. In other words, you'd need to configure your brokers the same way they have – OneCricketeer Jan 11 '22 at 18:02
  • it is mentioned that it can also be configured for local setup. Actually, this same configuration has been implemented in java and I'm trying to replicate it in python – PanicLion Jan 11 '22 at 18:08
  • Do you get similar errors following blogs like this https://dev.to/adityakanekar/connecting-to-kafka-cluster-using-ssl-with-python-k2e – OneCricketeer Jan 11 '22 at 18:12
  • I didn't follow that blog because the kafka-python library is used there and I didn't find the same configuration for confluent-kafka library – PanicLion Jan 11 '22 at 18:30
  • The general steps are the same. You get a `cacert` + PEM file, then point the client at it. Otherwise, you need to modify your OS files such that the `certifi` module actually works to pick up your OS cacert. https://github.com/confluentinc/confluent-kafka-python#ssl-certificates – OneCricketeer Jan 11 '22 at 18:35
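
Following the last comment: if the broker is already set up with JKS files, the PEM file the Python client needs could apparently be exported from the Java truststore with keytool. This is only a sketch of what I understand that to mean; the truststore file name, the CARoot alias, and the password are placeholders taken from typical Kafka SSL tutorials, not values I have confirmed:

keytool -exportcert -alias CARoot -keystore kafka.server.truststore.jks -rfc -file ca-cert.pem -storepass truststore-password

Then in producer.py and consumer.py I would point 'ssl.ca.location' at ca-cert.pem instead of certifi.where().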

0 Answers