I am using kafka-python (pip install kafka-python) in a Flask application to send messages to a Kafka cluster (running version 0.11). The application is deployed to AWS Elastic Beanstalk via Docker. However, no messages are reaching Kafka (I verified this with a console consumer).
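For context, the producer is wired into the Flask app roughly like this (a simplified sketch, not the exact application code; the route and payload handling here are illustrative):

from flask import Flask, request
from kafka import KafkaProducer

app = Flask(__name__)
# same broker and compression settings as in the sessions below
producer = KafkaProducer(bootstrap_servers='my_kafka_servers:9092',
                         compression_type='gzip')

@app.route('/events', methods=['POST'])
def publish():
    # send() is asynchronous: it returns a future immediately and the
    # actual network I/O happens on a background thread
    producer.send('my_kafka_topic', value=request.data, key=b'web')
    return 'queued', 202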
I don't know much about Docker beyond how to connect to a running container, so that's what I did: I logged into the Beanstalk instance, connected to the Docker container, and ran the following commands in Python 3.
>>> from kafka import KafkaProducer
>>> p = KafkaProducer(bootstrap_servers='my_kafka_servers:9092', compression_type='gzip')
>>> r = p.send(topic='my_kafka_topic', value=b'message from docker', key=b'docker1')
>>> r.succeeded()
False
>>> p.flush()
>>> r.succeeded()
False
>>> p.close()
>>> r.succeeded()
False
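(I realize now that succeeded() only reports whether the future resolved; it doesn't say why a send failed. Calling get() on the future should surface the underlying error; a sketch, reusing the r from the session above:)

from kafka.errors import KafkaError

try:
    # get() blocks until the broker acks (or the timeout expires) and
    # raises the underlying KafkaError instead of just returning False
    metadata = r.get(timeout=10)
    print(metadata.topic, metadata.partition, metadata.offset)
except KafkaError as e:
    print('send failed:', e)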
All this while, I had a console consumer running, listening on that topic, but I saw no messages come through.
I did the same exercise "outside" the Docker container (i.e., directly on the Beanstalk instance). I first installed kafka-python using pip, then ran the following in Python 3.
>>> from kafka import KafkaProducer
>>> p = KafkaProducer(bootstrap_servers='my_kafka_servers:9092', compression_type='gzip')
>>> r = p.send(topic='my_kafka_topic', value=b'message outside the docker', key=b'instance1')
>>> r.succeeded()
False
# waited a second or two
>>> r.succeeded()
True
This time, I did see the message come through in the console consumer.
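(Worth noting: send() is asynchronous, so succeeded() stays False until the broker acknowledges the message, which is presumably why the second call returned True only after a short wait. Attaching callbacks to the future is one way to watch the outcome without polling; a sketch, using the same producer as above with an illustrative message:)

def on_success(metadata):
    print('delivered to', metadata.topic, 'partition', metadata.partition)

def on_error(exc):
    print('delivery failed:', exc)

r = p.send(topic='my_kafka_topic', value=b'another message', key=b'instance1')
r.add_callback(on_success)
r.add_errback(on_error)
p.flush()  # block until outstanding sends complete so the callbacks fire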
So, my questions are:
- Why is Docker blocking the Kafka producer's sends?
- How can I fix this?
Is this something for which I need to post the Docker configuration? I didn't set it up, so I don't have that information offhand.
EDIT: I found some Docker-specific configuration in the project:
{
  "AWSEBDockerrunVersion": "1",
  "Image": {
    "Name": "<AWS_ACCOUNT_ID>.dkr.ecr.<REGION>.amazonaws.com/<NAME>:<VERSION>",
    "Update": "true"
  },
  "Ports": [
    {
      "ContainerPort": "80"
    }
  ],
  "Logging": "/var/eb_log"
}
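As far as I understand, the Ports section only maps inbound traffic to the container's port 80, so it shouldn't affect outbound connections to Kafka, but I'm not certain. A basic reachability probe I could run inside the container (hostname and port are the same placeholders as above):

import socket

# does the broker hostname resolve inside the container?
addr = socket.gethostbyname('my_kafka_servers')
print('resolved to', addr)

# can we open a TCP connection to the broker port?
sock = socket.create_connection((addr, 9092), timeout=5)
print('TCP connect OK')
sock.close()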