I set up a simple Kafka configuration and run it as a service on two CentOS servers. With the Python scripts below I can send messages to the logs topic and see them in the consumer on one of the servers, but on the other server I can only send them; the consumer never shows anything. I don't get any errors while doing this. What could be the reason?
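In case it matters: each server runs its own broker and both scripts connect to localhost:9092. The broker listener settings are close to the defaults, roughly like this (a sketch, the hostname is a placeholder rather than my exact server.properties):

# server.properties (relevant lines only; values are placeholders)
broker.id=1
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://server1.example.com:9092
zookeeper.connect=zk-host:2181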
kafka_consumer.py:
from kafka import KafkaConsumer
broker_url = "localhost:9092"
topics = ["logs"]
consumer = KafkaConsumer(bootstrap_servers=broker_url,
                         group_id="my-group",
                         auto_offset_reset="earliest",
                         value_deserializer=lambda x: x.decode("utf-8"))
consumer.subscribe(topics)

for message in consumer:
    print(message.value)

# consumer.close()
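One thing I'm not sure about: because the consumer uses group_id="my-group", auto_offset_reset="earliest" only applies while the group has no committed offsets, so already-committed offsets could hide older messages. A throwaway consumer that ignores group offsets entirely should rule that out (a sketch; same broker and topic as above):

from kafka import KafkaConsumer

# group_id=None: no offsets are ever committed, so every run reads the
# topic from the earliest available message.
checker = KafkaConsumer("logs",
                        bootstrap_servers="localhost:9092",
                        group_id=None,
                        enable_auto_commit=False,
                        auto_offset_reset="earliest",
                        consumer_timeout_ms=5000,  # give up after 5 s of silence
                        value_deserializer=lambda x: x.decode("utf-8"))
for message in checker:
    print(message.partition, message.offset, message.value)
checker.close()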
kafka_producer.py:
from kafka import KafkaProducer
broker_url = "localhost:9092"
topic_name = "logs"
producer = KafkaProducer(bootstrap_servers=broker_url,
                         value_serializer=lambda x: x.encode("utf-8"))

for i in range(10):
    log_message = "TEST"
    producer.send(topic_name, log_message)

producer.flush()
# producer.close()
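Also worth noting: send() in kafka-python is asynchronous, so a failed delivery would not raise inside the loop above; errors only surface on the returned future. A check like this (a sketch with the same broker and topic) would make a silent send failure visible:

from kafka import KafkaProducer
from kafka.errors import KafkaError

producer = KafkaProducer(bootstrap_servers="localhost:9092",
                         value_serializer=lambda x: x.encode("utf-8"))
future = producer.send("logs", "TEST")
try:
    # Block until the broker acknowledges the write (or the timeout hits).
    metadata = future.get(timeout=10)
    print("delivered:", metadata.topic, metadata.partition, metadata.offset)
except KafkaError as exc:
    print("send failed:", exc)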
I deleted the kafka-logs folder I had created under /var/log/ and recreated it, but the problem was not resolved.
I also ran the kafka-service-stop.sh and zookeeper-service-stop.sh scripts and restarted the services.
I'm pretty sure the Kafka broker itself is running.
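What I have not verified is whether the two brokers actually form one cluster; since both scripts use localhost:9092, the producer and consumer on each server only ever talk to that server's local broker. A quick metadata probe like this (a sketch) at least confirms the local broker answers and knows about the logs topic:

from kafka import KafkaConsumer

probe = KafkaConsumer(bootstrap_servers="localhost:9092")
print("topics:", probe.topics())  # every topic name the broker reports
print("partitions for 'logs':", probe.partitions_for_topic("logs"))  # None if unknown
probe.close()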