I am trying to consume messages produced by a Spring Boot Kafka producer running on localhost.
For the consumer I have the following Python code (consumer.py):
from kafka import KafkaConsumer

# To consume latest messages and auto-commit offsets
print("consume...")
consumer = KafkaConsumer('upload_media',
                         group_id='groupId',
                         bootstrap_servers=['host.docker.internal:9092'])
for message in consumer:
    # message value and key are raw bytes -- decode if necessary!
    # e.g., for unicode: `message.value.decode('utf-8')`
    print("message")
    print("%s:%d:%d: key=%s value=%s" % (message.topic, message.partition,
                                         message.offset, message.key,
                                         message.value))
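As the comment in the script notes, keys and values arrive as raw bytes, so decoding is the consumer's job. A minimal sketch of that decode step (the JSON payload here is only an assumption for illustration; the question does not show what the Spring Boot producer actually sends):

```python
import json

# Hypothetical payload -- the real message format depends on the producer.
raw_value = b'{"filename": "clip.mp4"}'

decoded = raw_value.decode("utf-8")  # bytes -> str, as the code comment suggests
payload = json.loads(decoded)        # str -> dict, if the producer sends JSON
print(payload["filename"])
```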
And I have the following Dockerfile:
# Set python version
FROM python:3.6-stretch
# Make a directory for app
WORKDIR /consumer
# Install dependencies
COPY requirements.txt .
RUN pip install -r requirements.txt
# RUN pip install --no-cache-dir --user -r /req.txt
# Copy source code
COPY ./app/consumer.py .
# Run the application
#CMD ["python", "-m", "app"]
ENTRYPOINT [ "python3", "-u", "consumer.py" ]
CMD [ "" ]
When I build and run the image, I get:
$ docker run --expose=9092 consumer
consume...
But it never receives any messages.
When I start the script directly with python consumer.py, it works fine.
So something seems to be wrong with the Docker image?
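One thing worth noting: kafka-python retries silently when it cannot reach the broker, so the consumer just hangs at "consume..." with no error. A quick way to tell whether the container can reach the broker at all is a plain TCP check, run inside the container (the hostname and port are the ones from the question; the helper itself is just an illustrative sketch):

```python
import socket

def broker_reachable(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# If this prints False inside the container, the consumer cannot reach the
# broker, and kafka-python will keep retrying without raising an error.
print(broker_reachable("host.docker.internal", 9092))
```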
Edit: I started ZooKeeper and a broker locally as described here: https://kafka.apache.org/quickstart
Edit 2: I don't know why the question was marked as a duplicate. I don't use ksqlDB or Docker Compose, I am not on a VM, and I do not get any errors.