Questions tagged [kafka-python]

kafka-python provides low-level protocol support for Apache Kafka as well as high-level consumer and producer classes. Request batching is supported by the protocol as well as broker-aware request routing. Gzip and Snappy compression is also supported for message sets.

For more details about the Python Kafka client API, see https://kafka-python.readthedocs.io/en/latest/

443 questions
0
votes
1 answer

Kafka topic appears to be only consumed by one KafkaConsumer instance in Airflow

I have a system where I start two separate processes: one that produces messages to a Kafka topic, and one that runs a number of KafkaConsumer instances and consumes the messages. However, it appears that only one of the consumers is…
Frank
  • 619
  • 1
  • 6
  • 26
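
A minimal sketch for the question above, assuming a hypothetical topic "jobs" on a local broker: consumers that share a group_id split the topic's partitions among themselves, so the topic needs at least as many partitions as there are consumer instances; with a single partition, only one consumer ever receives data.

    from kafka import KafkaConsumer
    from kafka.admin import KafkaAdminClient, NewTopic

    # create the topic with enough partitions for the planned consumer count
    admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
    admin.create_topics([NewTopic(name="jobs", num_partitions=4, replication_factor=1)])

    # each consumer process runs this with the same group_id
    consumer = KafkaConsumer(
        "jobs",
        bootstrap_servers="localhost:9092",
        group_id="workers",
        auto_offset_reset="earliest",
    )
    for record in consumer:
        print(record.partition, record.value)
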
0
votes
0 answers

Unable to Send Messages through Kafka-Python Producer

I've dockerized Kafka using the wurstmeister/kafka image. I'm new to Kafka and am integrating it with my Django project. I've created management commands for consumer.py and producer.py. docker-compose.yml services: zookeeper: image:…
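
A minimal sketch for the question above, assuming the Django container reaches the broker at kafka:9092 (the wurstmeister image's advertised listener has to match the hostname the client connects with, which is the usual cause of silent send failures). Calling .get() on the returned future surfaces any error instead of letting it pass unnoticed.

    from kafka import KafkaProducer
    from kafka.errors import KafkaError

    producer = KafkaProducer(
        bootstrap_servers="kafka:9092",            # assumed service name from docker-compose
        value_serializer=lambda v: v.encode("utf-8"),
    )
    future = producer.send("my-topic", value="hello from django")
    try:
        metadata = future.get(timeout=10)          # blocks until acked, raises on failure
        print(metadata.topic, metadata.partition, metadata.offset)
    except KafkaError as exc:
        print("send failed:", exc)
    producer.flush()
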
0
votes
2 answers

Reading the last message from Kafka

I am trying to read the last message from a Kafka topic, but I cannot make it work. I tried different methods, which you can find below with their errors or problems. Topic description: $ kafka-topics.sh --bootstrap-server localhost:9092 --topic…
AVarf
  • 4,481
  • 9
  • 47
  • 74
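
A minimal sketch for the question above, assuming a hypothetical single-partition topic "events": assign the partition manually, seek to the end, step back one offset, and poll that single record.

    from kafka import KafkaConsumer, TopicPartition

    consumer = KafkaConsumer(bootstrap_servers="localhost:9092")
    tp = TopicPartition("events", 0)
    consumer.assign([tp])

    consumer.seek_to_end(tp)
    last_offset = consumer.position(tp)    # offset of the next message to be written
    if last_offset > 0:
        consumer.seek(tp, last_offset - 1)
        records = consumer.poll(timeout_ms=5000)
        for record in records.get(tp, []):
            print(record.offset, record.value)
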
0
votes
1 answer

How to reduce Kafka lag, and how to create multiple consumers for the same topic?

def getKafkaConsumer(topicName, kafkaIP, kafkaPort, kafkaConsumerGroup): try: bootstrap = [str(kafkaIP + ':' + kafkaPort)] consumer = KafkaConsumer(topicName, bootstrap_servers = bootstrap, group_id = kafkaConsumerGroup, session_timeout_ms = 30000,…
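
A minimal sketch for the question above, assuming a hypothetical topic "readings" with several partitions: starting one consumer process per partition, all with the same group_id, lets them share the topic and work through the backlog in parallel, which is the usual first step toward reducing lag.

    from multiprocessing import Process
    from kafka import KafkaConsumer

    def consume(worker_id):
        consumer = KafkaConsumer(
            "readings",
            bootstrap_servers="localhost:9092",
            group_id="reading-workers",     # same group => partitions are split
        )
        for record in consumer:
            print(worker_id, record.partition, record.offset)

    if __name__ == "__main__":
        workers = [Process(target=consume, args=(i,)) for i in range(3)]
        for w in workers:
            w.start()
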
0
votes
1 answer

How to use KafkaConsumer with Django4

I have a Django 4 project and am using KafkaConsumer from kafka-python. I want to update Django models after receiving a Kafka message. The goal here is to have a Kafka worker running and consuming messages; it should also be able to have access to…
Tim
  • 1
  • 1
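
A minimal sketch for the question above: a Django management command (placed at a hypothetical path like myapp/management/commands/consume_events.py) gives the consumer loop full ORM access, because manage.py sets up Django before handle() runs; start it with python manage.py consume_events. The Event model and topic name are assumptions.

    import json
    from django.core.management.base import BaseCommand
    from kafka import KafkaConsumer
    from myapp.models import Event   # hypothetical model

    class Command(BaseCommand):
        help = "Consume Kafka messages and update Django models"

        def handle(self, *args, **options):
            consumer = KafkaConsumer(
                "events",
                bootstrap_servers="localhost:9092",
                group_id="django-worker",
                value_deserializer=lambda v: json.loads(v.decode("utf-8")),
            )
            for record in consumer:
                Event.objects.create(payload=record.value)   # regular ORM calls work here
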
0
votes
1 answer

In Apache Kafka, can I limit the number of partitions assigned to a certain consumer?

I'm using Kafka to distribute load between my AI agents, but my servers have different configurations and different rates for processing the input data; a few of them remain mostly idle while others lag behind. It's because right now Kafka's…
aSaffary
  • 793
  • 9
  • 22
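
A minimal sketch for the question above: with assign() instead of subscribe(), each consumer is handed an explicit set of partitions, so a slower server can be given fewer partitions than a faster one. The trade-off is that manual assignment opts out of automatic group rebalancing. The topic name is an assumption.

    from kafka import KafkaConsumer, TopicPartition

    consumer = KafkaConsumer(bootstrap_servers="localhost:9092")
    # this (slower) worker handles only partition 0; faster workers are assigned the rest
    consumer.assign([TopicPartition("inference-input", 0)])
    for record in consumer:
        print(record.partition, record.offset)
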
0
votes
0 answers

I created Kafka in Google Cloud, but I can't connect Kafka with Databricks

I created Kafka in Google Cloud, but I can't connect Kafka with Databricks
0
votes
1 answer

Deserialisation to Protobuf object in Python

I have passed a Protobuf object to a Kafka producer and am receiving a byte array on the consumer side. Now I want to deserialize that response back into a Protobuf object, but I am unable to do that. How can I do it? Here is my consumer: from…
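
A minimal sketch for the question above, assuming a generated protobuf class SensorReading in a hypothetical sensor_pb2 module: ParseFromString() rebuilds the message object from the raw bytes the consumer hands back.

    from kafka import KafkaConsumer
    from sensor_pb2 import SensorReading   # hypothetical generated module

    consumer = KafkaConsumer("sensor-topic", bootstrap_servers="localhost:9092")
    for record in consumer:
        reading = SensorReading()
        reading.ParseFromString(record.value)   # record.value is the raw byte array
        print(reading)
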
0
votes
1 answer

Which time value gets assigned to kafka record message.timestamp (kafka-python library)?

In the official documentation of the kafka-python library there is literally nothing about what value is assigned to a consumed record's .timestamp attribute.
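
A sketch of what is observable from the client, for the question above: producer.send() accepts a timestamp_ms argument and stamps the record with the send time when it is omitted, while the consumed record exposes both .timestamp and .timestamp_type (0 for CreateTime, 1 for LogAppendTime, the latter meaning the broker overwrote the value at append time). The topic name is an assumption.

    import time
    from kafka import KafkaProducer, KafkaConsumer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    producer.send("times", value=b"x", timestamp_ms=int(time.time() * 1000))
    producer.flush()

    consumer = KafkaConsumer("times", bootstrap_servers="localhost:9092",
                             auto_offset_reset="earliest")
    for record in consumer:
        print(record.timestamp, record.timestamp_type)
        break
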
0
votes
0 answers

Headers are coming back empty in the kafka-python consumer

We are publishing header information using kafka-python like below: def publish_message1(self, interface_time, topic_name, value, success_call_back, error_call_back, key=None): if self.kafka_producer: value_bytes = bytes(value,…
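
A minimal sketch for the question above: kafka-python expects headers as a list of (str, bytes) tuples on the produce side (and a broker/API version of at least 0.11), and they come back on record.headers in the consumer. The topic name and header key are assumptions.

    from kafka import KafkaProducer, KafkaConsumer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    producer.send(
        "audit",
        value=b"payload",
        headers=[("interface_time", b"2024-01-01T00:00:00Z")],   # (str, bytes) tuples
    )
    producer.flush()

    consumer = KafkaConsumer("audit", bootstrap_servers="localhost:9092",
                             auto_offset_reset="earliest")
    for record in consumer:
        print(record.headers)   # [('interface_time', b'2024-01-01T00:00:00Z')]
        break
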
0
votes
1 answer

Python Pandas - Simulate Streaming to Kafka

I am trying to practice some Kafka producing / consuming and am trying to set up a simulated 'stream' of data. I have tried looping through with time.sleep(0.0000001) but it is too slow to catch the entries. Here is what I am trying to do: offsets =…
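
A minimal sketch for the question above: iterate the DataFrame row by row, send each row as JSON, and use time.sleep() to control the simulated rate rather than trying to match an extremely small interval. The DataFrame contents and topic name are assumptions.

    import json
    import time
    import pandas as pd
    from kafka import KafkaProducer

    df = pd.DataFrame({"sensor": ["a", "b"], "value": [1.0, 2.5]})
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    for row in df.to_dict(orient="records"):
        producer.send("simulated-stream", value=row)
        time.sleep(0.1)        # adjust to the rate the consumer can keep up with
    producer.flush()
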
0
votes
2 answers

Adjust logging level for Kafka-python admin client

While creating the KafkaAdminClient client = KafkaAdminClient(bootstrap_servers=bootstrap_servers, security_protocol=security_protocol, sasl_mechanism=SASL_MECHANISM, …
Tushar
  • 528
  • 4
  • 20
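
A minimal sketch for the question above: kafka-python logs through the standard logging module under the "kafka" logger hierarchy, so its verbosity can be adjusted independently of the application's own loggers.

    import logging
    from kafka.admin import KafkaAdminClient

    logging.basicConfig(level=logging.INFO)
    logging.getLogger("kafka").setLevel(logging.WARNING)   # quiet kafka-python's own output

    admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
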
0
votes
1 answer

kafka-python basic producing and consuming not working

I am new to Kafka and I'm trying to run a basic example. My Kafka is running with this config: https://developer.confluent.io/quickstart/kafka-docker/; Python 3.7; kafka-python installed as follows: pip install kafka-python (2.0.2). I followed this doc; then…
Rugnar
  • 2,894
  • 3
  • 25
  • 29
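
A minimal sketch of a round trip for the question above, assuming the Confluent quickstart broker is reachable on localhost:9092; consumer_timeout_ms makes the loop exit instead of blocking forever when nothing arrives, and auto_offset_reset="earliest" picks up messages produced before the consumer joined.

    from kafka import KafkaProducer, KafkaConsumer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    producer.send("quickstart", b"hello")
    producer.flush()

    consumer = KafkaConsumer(
        "quickstart",
        bootstrap_servers="localhost:9092",
        auto_offset_reset="earliest",
        consumer_timeout_ms=10000,      # stop iterating after 10 s of silence
    )
    for record in consumer:
        print(record.value)
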
0
votes
0 answers

confluent-kafka python: List topics of consumer group

With the aim of getting a consumer-group-to-topic mapping, similar to what the Kafka CLI provides, I was trying to use the describe_config API offered by AdminClient: ./kafka-consumer-groups.sh --bootstrap-server localhost:9092 --describe --group testing-group…
Tushar
  • 528
  • 4
  • 20
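
The question targets confluent-kafka, but with kafka-python the same mapping can be read from the group's committed offsets; a sketch assuming the testing-group name from the excerpt above:

    from kafka.admin import KafkaAdminClient

    admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
    # maps the group to {TopicPartition: OffsetAndMetadata} for every partition
    # it has committed offsets for
    offsets = admin.list_consumer_group_offsets("testing-group")
    print({tp.topic for tp in offsets})
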
0
votes
1 answer

Kafka-python Get replication factor for a topic

Using this library, I was able to get the partition count per topic, but I was not able to get the replication factor. The closest question I could find was this one, for reference. Any ideas on how to do it?
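
A sketch for the question above, assuming KafkaAdminClient.describe_topics() (present in recent kafka-python releases) returns per-partition metadata that includes the replica list; the replication factor is then the length of that list. The topic name and the exact shape of the returned metadata are assumptions.

    from kafka.admin import KafkaAdminClient

    admin = KafkaAdminClient(bootstrap_servers="localhost:9092")
    (topic_meta,) = admin.describe_topics(["my-topic"])
    # every partition carries the same number of replicas, so partition 0 is enough
    replication_factor = len(topic_meta["partitions"][0]["replicas"])
    print(replication_factor)
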