
I have created a Kafka cluster on a Google Cloud VM. First, I tested my broker by producing a message with the CLI.

Producer:

$ kafka-console-producer.sh --broker-list localhost:9092 --producer.config /opt/bitnami/kafka/conf/producer.properties --topic lus_topic
>abc

The message was successfully received by the consumer:

$ kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic lus_topic --consumer.config /opt/bitnami/kafka/conf/consumer.properties --from-beginning
abc

Then I tried the kafka-python producer, using the CLI consumer to read back from the topic:

Python 3.7.3 (default, Jan 22 2021, 20:04:44) 
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from kafka import KafkaProducer
>>> producer = KafkaProducer(bootstrap_servers='localhost:9092')        
>>> producer.send('lus_topic', b'Hello, World!').get(timeout=30)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/lumo_gftdevgcp_com/.local/lib/python3.7/site-packages/kafka/producer/kafka.py", line 576, in send
    self._wait_on_metadata(topic, self.config['max_block_ms'] / 1000.0)
  File "/home/lumo_gftdevgcp_com/.local/lib/python3.7/site-packages/kafka/producer/kafka.py", line 703, in _wait_on_metadata
    "Failed to update metadata after %.1f secs." % (max_wait,))
kafka.errors.KafkaTimeoutError: KafkaTimeoutError: Failed to update metadata after 60.0 secs.

Can you help me understand why I am getting this timeout error, and how to debug it?

Many thanks

Luke Mao
  • Can you check if the topic is created on port 9092? If it is not 9092, it generally results in `TimeoutError`. – sotmot Apr 13 '21 at 10:05
  • Thanks @sotmot, I think so, as I have tested using the CLI: `kafka-console-producer.sh --broker-list localhost:9092 --producer.config /opt/bitnami/kafka/conf/producer.properties --topic lus_topic` – Luke Mao Apr 13 '21 at 10:19
  • When you ran the CLI tools, in both cases you provided additional configs via `--producer.config` and `--consumer.config`. I'm guessing you have connectivity settings in these files that need to be passed to the Python client too – Mickael Maison Apr 13 '21 at 10:23
  • Thanks @Mickael, I have the following settings in the config: `bootstrap.servers=localhost:9092 compression.type=none security.protocol=SASL_PLAINTEXT sasl.mechanism=PLAIN` – Luke Mao Apr 13 '21 at 10:29
  • I tried with `producer = KafkaProducer(bootstrap_servers='localhost:9092', api_version=(0, 10, 0), security_protocol='SASL_PLAINTEXT', sasl_mechanism='PLAIN')` but am still getting the same error (KafkaTimeoutError) – Luke Mao Apr 13 '21 at 10:34

1 Answer


I fixed the issue by providing the SASL username and password:

>>> producer = KafkaProducer(bootstrap_servers='localhost:9092',security_protocol='SASL_PLAINTEXT', sasl_mechanism='PLAIN', sasl_plain_username='user', sasl_plain_password='GGGGGG')
>>> producer.bootstrap_connected()
True
>>> producer.send('lus_topic', b'Hello, World!')
<kafka.producer.future.FutureRecordMetadata object at 0x7fe3eb8ebbe0>
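
To confirm delivery end to end, you can resolve the future returned by send() and read the message back with a consumer that uses the same connection settings. The following is a minimal sketch, assuming the same broker, topic, and placeholder credentials as above:

from kafka import KafkaProducer, KafkaConsumer

# Same SASL settings that fixed the producer above (username/password are placeholders).
sasl_config = dict(
    bootstrap_servers='localhost:9092',
    security_protocol='SASL_PLAINTEXT',
    sasl_mechanism='PLAIN',
    sasl_plain_username='user',
    sasl_plain_password='GGGGGG',
)

producer = KafkaProducer(**sasl_config)

# Block on the future so a broker or auth problem surfaces here instead of silently later.
metadata = producer.send('lus_topic', b'Hello, World!').get(timeout=30)
print(metadata.topic, metadata.partition, metadata.offset)

# Read the message back over the same SASL connection; stop after 10 s of no new records.
consumer = KafkaConsumer('lus_topic', auto_offset_reset='earliest',
                         consumer_timeout_ms=10000, **sasl_config)
for record in consumer:
    print(record.value)

Calling .get() on the future is also a quick way to surface the KafkaTimeoutError from the question immediately if the client still cannot reach or authenticate with the broker.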

Luke Mao