I am trying to use the kafka-python package (1.3.2) in Python to do a simple data transfer from my producer to my consumer.
Producer:
from kafka import KafkaProducer
producer = KafkaProducer(bootstrap_servers='localhost:9092')
# produce asynchronously
for _ in range(2):
    producer.send('my-topic', b'message')
producer.flush()
producer = KafkaProducer()
Consumer:
from kafka import KafkaConsumer
consumer = KafkaConsumer('my-topic',
                         group_id='my-group',
                         bootstrap_servers=['localhost:9092'],
                         fetch_min_bytes=1)
for message in consumer:
    print("%s:%d:%d: key=%s value=%s" % (message.topic, message.partition,
                                         message.offset, message.key,
                                         message.value))
consumer = KafkaConsumer()
consumer.subscribe(["my-topic"])
I receive the following on my consumer:
my-topic:0:5056: key=None value=b'message'
my-topic:0:5057: key=None value=b'message'
But at the same time I get this error on the producer:
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
File "C:\Users\VNK736\AppData\Local\Programs\Python\Python36-32\lib\site-packages\kafka\producer\kafka.py", line 364, in wrapper
_self.close()
File "C:\Users\VNK736\AppData\Local\Programs\Python\Python36-32\lib\site-packages\kafka\producer\kafka.py", line 420, in close
self._sender.join(timeout)
File "C:\Users\VNK736\AppData\Local\Programs\Python\Python36-32\lib\threading.py", line 1060, in join
self._wait_for_tstate_lock(timeout=max(timeout, 0))
File "C:\Users\VNK736\AppData\Local\Programs\Python\Python36-32\lib\threading.py", line 1072, in _wait_for_tstate_lock
elif lock.acquire(block, timeout):
OverflowError: timeout value is too large
By default the timeout is None, and close() replaces it with 999999999 in kafka/producer/kafka.py.
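From the traceback, the close path appears to do roughly this (my paraphrase based on the traceback and the 999999999 default, not the exact 1.3.2 source):

def close(self, timeout=None):
    if timeout is None:
        # huge sentinel meaning "wait forever"; too large for
        # lock.acquire() on this 32-bit Windows Python build
        timeout = 999999999
    self._sender.join(timeout)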
I am unable to figure out which parameter I can pass to KafkaProducer in my producer code to control this timeout.
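One workaround I am considering is closing the producer explicitly with a finite timeout before the script exits, so the atexit handler never calls close() with the huge default (a sketch; I have not verified that this avoids the overflow on a 32-bit build):

from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers='localhost:9092')
for _ in range(2):
    producer.send('my-topic', b'message')
producer.flush()
# close with an explicit finite timeout (in seconds) so atexit
# does not substitute the 999999999 default
producer.close(timeout=10)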
Has anyone faced this problem, or could anyone point me in the right direction? Thanks in advance.