I'm trying to enable SASL/PLAIN for my Kafka system. It actually works: I've tested it with Schema Registry and a Java producer. The problem is that Kafka Connect cannot establish a connection when SASL is enabled (at least that's what I thought at first). I added the necessary configuration, but it doesn't seem to have any effect. I've edited my connect-avro-distributed.properties file as follows:

sasl.mechanism=PLAIN
security.protocol=SASL_PLAINTEXT
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="admin" \
  password="secret";

producer.sasl.mechanism=PLAIN
producer.security.protocol=SASL_PLAINTEXT
producer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="admin" \
  password="secret";

consumer.sasl.mechanism=PLAIN
consumer.security.protocol=SASL_PLAINTEXT
consumer.sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="admin" \
  password="secret";

But in the log it says:

[2022-01-07 12:21:28,645] INFO ProducerConfig values:
        sasl.mechanism = GSSAPI
        security.protocol = PLAINTEXT

Whereas it should be:

[2022-01-07 12:21:28,645] INFO ProducerConfig values:
        sasl.mechanism = PLAIN
        security.protocol = SASL_PLAINTEXT

Same for the consumer config. What do I need to do? Why does it fall back to the default values? I've restarted the service many times. Thanks in advance.

Edit: There is another connector running on this worker without any problems, and it has the correct SASL configuration.

Edit 2: It looks like Debezium connectors need some additional configuration on the connector side.

  • How are you launching the Kafka Connect worker? Is it definitely using the config file you're editing and not another one? – Robin Moffatt Jan 07 '22 at 13:48
  • Yes, I've tested it; if I change the broker URL in that file, it won't work. I just found something interesting: there are two connectors running on that server, and in the log files the other connector has the correct SASL values. The working one is a JDBC connector and the other is a Debezium connector. Is the Debezium connector somehow overriding these fields? @RobinMoffatt – Bünyamin Şentürk Jan 07 '22 at 14:05
  • Broker URL in which file? And yes, each connector can override client settings using `producer.override` prefix, for example. This is mentioned in the docs - https://kafka.apache.org/documentation/#connect_running ... You can use the `/config` REST API to see the debezium config vs the JDBC config – OneCricketeer Jan 07 '22 at 16:49
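As a side note on the override mechanism mentioned in the comments: per-connector client overrides such as `producer.override.*` only take effect if the worker permits them. A minimal worker-side sketch (the policy property is from the Kafka Connect docs; everything else in your worker file stays as-is):

```properties
# connect-avro-distributed.properties (worker side)
# Permit individual connectors to override producer/consumer client settings
connector.client.config.override.policy=All
```

A connector could then set, for example, `producer.override.sasl.mechanism` in its own configuration.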

1 Answer


Trying it with different connectors made it clear that this was a Debezium-specific problem. Since Debezium uses a database history topic, it needs additional configuration when security is enabled:

"database.history.consumer.security.protocol": "SASL_PLAINTEXT",
"database.history.consumer.sasl.mechanism": "PLAIN",
"database.history.consumer.sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"admin\" password=\"secret\";",
"database.history.producer.security.protocol": "SASL_PLAINTEXT",
"database.history.producer.sasl.mechanism": "PLAIN",
"database.history.producer.sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"admin\" password=\"secret\";"

You need to override the default values for both the producer and consumer configs of the Debezium connector. There are a few more lines to add if you are using SSL. For more information: https://docs.confluent.io/debezium-connect-sqlserver-source/current/sqlserver_source_connector_config.html
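Putting it together, here is a sketch of a full Debezium SQL Server connector config with the history client overrides in place (the connector name, connection details, and topic names are placeholders; only the `database.history.consumer.*` and `database.history.producer.*` lines are the fix from above):

```json
{
  "name": "sqlserver-source",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "database.hostname": "sqlserver",
    "database.port": "1433",
    "database.user": "dbuser",
    "database.password": "dbpassword",
    "database.dbname": "testdb",
    "database.server.name": "server1",
    "database.history.kafka.bootstrap.servers": "kafka:9092",
    "database.history.kafka.topic": "schema-changes.testdb",
    "database.history.consumer.security.protocol": "SASL_PLAINTEXT",
    "database.history.consumer.sasl.mechanism": "PLAIN",
    "database.history.consumer.sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"admin\" password=\"secret\";",
    "database.history.producer.security.protocol": "SASL_PLAINTEXT",
    "database.history.producer.sasl.mechanism": "PLAIN",
    "database.history.producer.sasl.jaas.config": "org.apache.kafka.common.security.plain.PlainLoginModule required username=\"admin\" password=\"secret\";"
  }
}
```

This JSON could then be POSTed to the worker's `/connectors` REST endpoint to (re)create the connector.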