I'm trying to get Filebeat to consume messages from Kafka using the kafka input, but SASL authentication fails and I can't work out why. The documentation for both Kafka and Filebeat is a little thin when it comes to SASL.
My Filebeat configuration is as follows:
filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

filebeat.inputs:
  - type: kafka
    hosts: 'the.kafka.server.com:9092'
    topics: 'my_topic'
    group_id: 'my_group'
    ssl.enabled: yes
    username: "$ConnectionString"
    password: "org.apache.kafka.common.security.plain.PlainLoginModule required username='my_username' password='my_password';"

processors:
  - add_cloud_metadata: ~
  - add_docker_metadata: ~

output.console:
  pretty: true
The output shows:
INFO input/input.go:114 Starting input of type: kafka; ID: 14409252276502564738
INFO kafka/log.go:53 kafka message: Initializing new client
INFO kafka/log.go:53 client/metadata fetching metadata for all topics from broker the.kafka.server.com:9092
INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
INFO cfgfile/reload.go:171 Config reloader started
INFO cfgfile/reload.go:226 Loading of config files completed.
INFO kafka/log.go:53 kafka message: Successful SASL handshake. Available mechanisms: %!(EXTRA []string=[PLAIN OAUTHBEARER])
INFO kafka/log.go:53 Failed to read response while authenticating with SASL to broker the.kafka.server.com:9092: EOF
INFO kafka/log.go:53 Closed connection to broker the.kafka.server.com:9092
INFO kafka/log.go:53 client/metadata got error from broker -1 while fetching metadata: EOF
I'm not sure what's happening here. I've also tried adding compression: none to the input, which didn't help, and I verified with openssl that the server certificate can be validated. What am I doing wrong? The Kafka server in question is cloud hosted, so I can't see the broker configuration; I was only given the "connection string" from the provider's cloud UI.
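For completeness, the certificate check I ran was along these lines (the hostname and port are the same placeholders as in the config above):

```shell
# Connect to the broker over TLS and dump the presented certificate's
# subject and validity window. A "Verify return code: 0 (ok)" line in
# the s_client output is what I took as successful verification.
openssl s_client -connect the.kafka.server.com:9092 \
  -servername the.kafka.server.com </dev/null 2>/dev/null |
  openssl x509 -noout -subject -dates
```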