
I want to create a local Kafka setup using docker-compose that closely replicates the secured Kafka setup in Confluent Cloud.

The cluster I have in Confluent Cloud can be connected to using

from confluent_kafka import Consumer

c = Consumer(
    {
        "bootstrap.servers": "broker_url",
        "sasl.mechanism": "PLAIN",
        "security.protocol": "SASL_SSL",
        "sasl.username": "key",
        "sasl.password": "secret",
        "group.id": "consumer-name",
    }
)

But I am unable to create a docker-compose.yml locally that has the same config and can be connected to using the same code.

version: '3'
services:
    zookeeper:
        image: confluentinc/cp-zookeeper:6.2.0
        ports:
            - "2181:2181"
        environment:
            ZOOKEEPER_CLIENT_PORT: 2181

    kafka:
        image: confluentinc/cp-kafka:6.2.0
        depends_on:
            - zookeeper
        ports:
            - '9092:9092'
            - '19092:19092'
        expose:
            - '29092'
        environment:
            KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'
            KAFKA_LISTENERS: INSIDE-DOCKER-NETWORK://0.0.0.0:29092,OTHER-DOCKER-NETWORK://0.0.0.0:19092,HOST://0.0.0.0:9092
            KAFKA_ADVERTISED_LISTENERS: INSIDE-DOCKER-NETWORK://kafka:29092,OTHER-DOCKER-NETWORK://host.docker.internal:19092,HOST://localhost:9092
            KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE-DOCKER-NETWORK:PLAINTEXT,OTHER-DOCKER-NETWORK:PLAINTEXT,HOST:PLAINTEXT
            KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE-DOCKER-NETWORK
            KAFKA_AUTO_CREATE_TOPICS_ENABLE: 'true'
            KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
            # Allow to swiftly purge the topics using retention.ms
            KAFKA_LOG_RETENTION_CHECK_INTERVAL_MS: 100
            # Security Stuff
            KAFKA_LISTENER_NAME_EXTERNAL_PLAIN_SASL_JAAS_CONFIG: |
                org.apache.kafka.common.security.plain.PlainLoginModule required \
                username="broker" \
                password="broker" \
                user_alice="alice-secret";
            KAFKA_SASL_ENABLED_MECHANISMS: PLAIN
            KAFKA_SASL_MECHANISM_INTER_BROKER_PROTOCOL: SASL_SSL

Here is what I have in terms of the local docker-compose file, but it's not working. This is the error I get when I try connecting using the same code:

%3|1628019607.757|FAIL|rdkafka#consumer-1| [thrd:sasl_ssl://localhost:9092/bootstrap]: sasl_ssl://localhost:9092/bootstrap: SSL handshake failed: Disconnected: connecting to a PLAINTEXT broker listener? (after 9ms in state SSL_HANDSHAKE)
OneCricketeer
Nilan Saha
  • did you ever find a solution for this? I'm running into the same issue – Haha Oct 25 '21 at 12:05
  • @Haha Not exactly. The whole security thing with SASL_SSL is extremely complicated to replicate locally because you need certs, as far as I understand. I just set it up without the security stuff and made it configurable in my application, set to None for local testing. – Nilan Saha Oct 25 '21 at 22:12

1 Answer


Here's your hint: Disconnected: connecting to a PLAINTEXT broker listener?

KAFKA_LISTENER_SECURITY_PROTOCOL_MAP only has PLAINTEXT mappings, so there is no SASL_SSL listener that your client can connect to.

As for the SASL_SSL value you did configure: you only have one broker, so KAFKA_SASL_MECHANISM_INTER_BROKER_PROTOCOL doesn't really do anything (and its value should be a SASL mechanism such as PLAIN, not a security protocol). Also note that your JAAS config is scoped to a listener named EXTERNAL, which does not appear in any of your listener definitions.
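A minimal sketch of what such a mapping could look like, using SASL_PLAINTEXT so that no certificates need to be generated (the listener names and credentials here are illustrative, not taken from your config):

```yaml
# Sketch: SASL-authenticated host listener, plaintext inter-broker listener.
kafka:
    environment:
        KAFKA_LISTENERS: INTERNAL://0.0.0.0:29092,HOST://0.0.0.0:9092
        KAFKA_ADVERTISED_LISTENERS: INTERNAL://kafka:29092,HOST://localhost:9092
        # HOST now requires SASL; inter-broker traffic stays PLAINTEXT
        KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INTERNAL:PLAINTEXT,HOST:SASL_PLAINTEXT
        KAFKA_INTER_BROKER_LISTENER_NAME: INTERNAL
        KAFKA_SASL_ENABLED_MECHANISMS: PLAIN
        # JAAS config scoped to the HOST listener; each user_<name>="<password>"
        # entry defines an account that clients may authenticate as
        KAFKA_LISTENER_NAME_HOST_PLAIN_SASL_JAAS_CONFIG: |
            org.apache.kafka.common.security.plain.PlainLoginModule required \
            username="broker" \
            password="broker" \
            user_client="client-secret";
```

For a true SASL_SSL replica of Confluent Cloud, the listener would map to SASL_SSL instead, and the broker would additionally need keystore/truststore certificates, which is where the extra setup complexity comes from.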


In this demo repo you can find brokers that use all possible protocol mappings
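On the client side, assuming a local SASL_PLAINTEXT listener with a PLAIN user (the credentials below are illustrative and must match a user_<name> entry in the broker's JAAS config), the consumer config from the question only needs one field changed:

```python
# Local variant of the Confluent Cloud consumer config from the question.
# Only security.protocol differs: the local listener does SASL without TLS,
# so no certificates are needed.
local_conf = {
    "bootstrap.servers": "localhost:9092",
    "security.protocol": "SASL_PLAINTEXT",  # SASL_SSL in Confluent Cloud
    "sasl.mechanism": "PLAIN",
    "sasl.username": "client",         # illustrative; matches user_client above
    "sasl.password": "client-secret",
    "group.id": "consumer-name",
}
```

This dict is then passed to `confluent_kafka.Consumer(local_conf)` exactly as in the cloud snippet, so the same consumer code works against both environments.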

OneCricketeer
  • I am so sorry, I couldn't figure it out. There seem to be a lot of examples, but every relevant one I try gives that error. – Nilan Saha Aug 03 '21 at 22:35
  • The [security tutorial is here](https://docs.confluent.io/platform/current/security/security_tutorial.html) and the gist is 1) you need certificates for the brokers and clients 2) you need to have `KAFKA_LISTENER_SECURITY_PROTOCOL_MAP` contain something with `:SASL_SSL` – OneCricketeer Aug 03 '21 at 22:41