Questions tagged [confluent-kafka-python]

Confluent Kafka Python is a performant implementation of Kafka producers, consumers and the admin client in Python, based on librdkafka. You can use it to connect to Kafka brokers from Python.

219 questions
0 votes, 0 answers

confluent-kafka Timed out OffsetCommitRequest

I'm using confluent-kafka 1.8.2 and have hit a problem: after N messages the consumer crashes with the error GroupCoordinator/5: Timed out OffsetCommitRequest in flight (after 60165ms, timeout #0). Please help me understand what is wrong and how to fix…
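
A hedged sketch, not a confirmed fix: this timeout usually means the offset-commit request to the group coordinator got no response within librdkafka's socket timeout. The snippet below registers an on_commit callback so commit failures show up in logs, and raises socket.timeout.ms as an assumed tuning; the broker address, group id and the timeout value are placeholders.

    from confluent_kafka import Consumer

    def log_commit_result(err, partitions):
        # Called after each (auto) commit; surfaces failures instead of
        # letting them appear only as a later crash.
        if err is not None:
            print(f"Offset commit failed: {err}")

    consumer = Consumer({
        "bootstrap.servers": "broker:9092",   # placeholder
        "group.id": "my-group",               # placeholder
        "enable.auto.commit": True,
        "on_commit": log_commit_result,
        # Assumption: give in-flight requests more headroom than the ~60s default.
        "socket.timeout.ms": 120000,
    })
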
0 votes, 0 answers

How to configure TLS/SSL in locally installed Kafka?

I am trying to configure SSL in Kafka (installed locally on my Windows machine). I am using the confluent-kafka Python client. Most of the solutions are for Java and involve creating a truststore, keystore and JAAS configuration, which is not clearly…
PanicLion • 187 • 1 • 9
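
Unlike the Java clients, the librdkafka-based confluent-kafka client takes PEM files directly, so no truststore, keystore or JAAS file is needed. A minimal sketch, assuming the broker's SSL listener is on port 9093; the PEM paths and password are placeholders:

    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9093",            # SSL listener (placeholder)
        "group.id": "ssl-test",
        "security.protocol": "SSL",
        # PEM files exported from the broker's certificates (placeholder paths).
        "ssl.ca.location": "C:/kafka/ssl/ca-cert.pem",
        "ssl.certificate.location": "C:/kafka/ssl/client-cert.pem",
        "ssl.key.location": "C:/kafka/ssl/client-key.pem",
        "ssl.key.password": "changeit",                   # placeholder
    })
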
0 votes, 0 answers

What will happen if the commit() method is called on a consumer running in auto-commit mode?

I'm calling the consumer.commit() method on each message of a consumer running with auto.commit=True. When I tried it locally it worked without any data loss or duplication. What effects can the commit() method have on the…
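
Calling commit() while enable.auto.commit=True is allowed; it just commits the same consumed offsets that the auto-commit timer would commit anyway. The usual pattern when committing manually is to switch auto-commit off. A minimal sketch with placeholder broker, group and topic names:

    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "broker:9092",   # placeholder
        "group.id": "manual-commit-group",    # placeholder
        "enable.auto.commit": False,          # commit explicitly instead of mixing both modes
    })
    consumer.subscribe(["my_topic"])          # placeholder

    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        # ... process msg ...
        # Synchronously commit this message's offset + 1.
        consumer.commit(message=msg, asynchronous=False)
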
0 votes, 0 answers

Python Azure Functions kafka connection pool

In an Azure Function, how do we make the Kafka producer connection a singleton, or use connection pooling? Each time the function is triggered a new Kafka connection is created. import json import logging import os, time import azure.functions as…
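
One common approach, sketched under the assumption that the function app reuses a warm worker between invocations: keep the Producer at module level and create it lazily, so the connection is opened once rather than per trigger. The broker address, topic and function names are hypothetical.

    import json

    from confluent_kafka import Producer

    _producer = None  # reused across invocations while the worker stays warm

    def get_producer():
        global _producer
        if _producer is None:
            _producer = Producer({"bootstrap.servers": "broker:9092"})  # placeholder
        return _producer

    def send_event(event: dict):
        p = get_producer()
        p.produce("events", value=json.dumps(event).encode("utf-8"))    # placeholder topic
        p.poll(0)  # serve delivery callbacks without blocking
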
0 votes, 1 answer

Kafka consumer: how to check if all the messages in the topic partition are completely consumed?

Are there any APIs or attributes that can be used or compared to determine whether all messages in one topic partition have been consumed? We are working on a test that will use another consumer in the same consumer group to check if the topic partition still…
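
One way to check, sketched here with placeholder names: compare the group's committed offset against the partition's high watermark from get_watermark_offsets(); once the committed offset has reached the high watermark, everything produced so far has been consumed and committed.

    from confluent_kafka import TopicPartition, OFFSET_INVALID

    def partition_fully_consumed(consumer, topic, partition, timeout=10.0):
        tp = TopicPartition(topic, partition)
        # (low, high): high is one past the offset of the last message in the partition.
        low, high = consumer.get_watermark_offsets(tp, timeout=timeout)
        committed = consumer.committed([tp], timeout=timeout)[0]
        if committed.offset == OFFSET_INVALID:   # nothing committed yet
            return high == low                   # only true if the partition is empty
        return committed.offset >= high
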
0 votes, 0 answers

Routing a whole message from one queue to another in Kafka

I am building a dead letter queue. What I am interested in implementing is the ability to forward a message in its entirety to another queue (when there is, for example, an error). In other words, I am interested in preserving the metadata of the…
Zeruno • 1,391 • 2 • 20 • 39
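
A sketch of the forwarding step, assuming a confluent-kafka Producer and a dead-letter topic name of your choosing: the consumed Message's key, value and headers are passed through unchanged, with a few extra headers recording where the record came from.

    def forward_to_dlq(producer, msg, dlq_topic="my_topic.dlq"):   # placeholder topic
        # Preserve the original headers and record the message's provenance.
        headers = list(msg.headers() or [])
        headers += [
            ("dlq.source.topic", msg.topic().encode("utf-8")),
            ("dlq.source.partition", str(msg.partition()).encode("utf-8")),
            ("dlq.source.offset", str(msg.offset()).encode("utf-8")),
        ]
        producer.produce(dlq_topic, key=msg.key(), value=msg.value(), headers=headers)
        producer.flush()
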
0 votes, 1 answer

How can I get schema before consuming?

I use the Python confluent-kafka 1.5.0 client with Schema Registry to consume Avro messages from Kafka. I am only a consumer, with no access to the admin client, producer, broker or anything else. I know the topic names, and from the message fields I get the subject and namespace…
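
With read access to the Schema Registry only, the latest value schema can be fetched before consuming under the default TopicNameStrategy subject "<topic>-value". A sketch; the registry URL and topic name are placeholders:

    from confluent_kafka.schema_registry import SchemaRegistryClient

    sr = SchemaRegistryClient({"url": "http://schema-registry:8081"})  # placeholder
    registered = sr.get_latest_version("my_topic-value")               # TopicNameStrategy subject
    print(registered.schema_id, registered.version)
    print(registered.schema.schema_str)  # the Avro schema as a JSON string
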
0 votes, 1 answer

confluent kafka producer avro schema error ClientError: Schema parse failed: Unknown named schema

I'm working on the producer side of Kafka to push messages to a topic, using the confluent-kafka Avro producer. Linked issue on GitHub. Below are my schema .avsc files. Keys.avsc { "namespace": "io.codebrews.schema.test", "type": "record", …
Avi • 1,424 • 1 • 11 • 32
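
"Unknown named schema" typically appears when one .avsc references a record type that is defined in a separate file, so the parser cannot resolve the name. One hedged workaround is to make the value schema self-contained, with the nested record defined inline, and pass it to AvroSerializer by keyword; the field names below are invented for illustration.

    from confluent_kafka.schema_registry import SchemaRegistryClient
    from confluent_kafka.schema_registry.avro import AvroSerializer

    # Self-contained schema: the referenced record is defined inline instead of
    # living in a second .avsc file (hypothetical fields).
    value_schema_str = """
    {
      "namespace": "io.codebrews.schema.test",
      "type": "record",
      "name": "Test",
      "fields": [
        {"name": "id", "type": "string"},
        {"name": "payload", "type": {
          "type": "record",
          "name": "Payload",
          "fields": [{"name": "body", "type": "string"}]
        }}
      ]
    }
    """

    sr = SchemaRegistryClient({"url": "http://schema-registry:8081"})  # placeholder
    serializer = AvroSerializer(schema_registry_client=sr, schema_str=value_schema_str)
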
0 votes, 1 answer

Kafka offset manual commit not updating offset value

My Python confluent-kafka code to read from the Kafka broker looks like this: self.consumer = Consumer( { "auto.offset.reset": "earliest", "enable.auto.commit": False, } ) while True: …
Zaks • 668 • 1 • 8 • 30
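
A common cause is committing asynchronously (the default) and never seeing the error when the commit is dropped. A minimal sketch with placeholder broker, group and topic names, using a synchronous commit that either advances the offset or raises:

    from confluent_kafka import Consumer, KafkaException

    consumer = Consumer({
        "bootstrap.servers": "broker:9092",   # placeholder
        "group.id": "my-group",               # placeholder
        "auto.offset.reset": "earliest",
        "enable.auto.commit": False,
    })
    consumer.subscribe(["my_topic"])          # placeholder

    msg = consumer.poll(1.0)
    if msg is not None and msg.error() is None:
        # ... process msg ...
        try:
            # Blocks until the broker acknowledges the commit of msg.offset() + 1.
            consumer.commit(message=msg, asynchronous=False)
        except KafkaException as exc:
            print(f"Commit failed: {exc}")
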
0 votes, 1 answer

Disable Certificate validation in SchemaRegistryClient Confluent Kafka

I want to read a topic from Kafka (Confluent) where the data is in Avro format. For certain unavoidable reasons, I would like to disable certificate validation. I am using security.protocol=SASL_SSL and sasl.mechanisms=OAUTHBEARER. I can connect…
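
For the broker connection itself, librdkafka exposes enable.ssl.certificate.verification; whether the SchemaRegistryClient offers a comparable switch depends on the client version, so only the consumer side is sketched here. The broker address and group id are placeholders, and the OAUTHBEARER token setup from the question is omitted.

    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "broker:9093",        # placeholder
        "group.id": "avro-reader",                 # placeholder
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "OAUTHBEARER",
        # Skips broker certificate validation (librdkafka >= 1.0);
        # only use where the risk is understood.
        "enable.ssl.certificate.verification": False,
    })
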
0 votes, 1 answer

No such configuration property: "schema.compatibility.level" when trying to initialise a kafka producer

I am using confluent-kafka. My code is producer = SimpleAvroProducer(producer_id="producer_1", topic_name="events_topic", broker_host= brkr_host, broker_port=…
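
The error means the string was handed to the librdkafka producer configuration, which has no such property: compatibility level is a per-subject Schema Registry setting, not a producer option. A hedged sketch, assuming the installed schema-registry client exposes set_compatibility and that the subject follows TopicNameStrategy; the URL and subject name are placeholders.

    from confluent_kafka.schema_registry import SchemaRegistryClient

    sr = SchemaRegistryClient({"url": "http://schema-registry:8081"})   # placeholder
    # Compatibility lives in the registry, not in the producer's config dict.
    sr.set_compatibility(subject_name="events_topic-value", level="BACKWARD")
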
0 votes, 1 answer

Set partition-key-expression config in Kafka Python

I'm using Confluent's kafka Python package. I would like to add a configuration property to the Producer equivalent to Spring's (Java) partition-key-expression (see this ref for more info). The way I'm now instantiating the producer is the…
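
confluent-kafka has no SpEL-style partition-key-expression; the equivalent is to compute the key in Python and pass it as key= to produce(), and the default partitioner then hashes it to pick the partition. A sketch with hypothetical field, topic and broker names:

    import json

    from confluent_kafka import Producer

    producer = Producer({"bootstrap.servers": "broker:9092"})   # placeholder

    def send(event: dict):
        # The "expression" is plain Python: derive the partition key from the payload.
        key = str(event["customer_id"])                          # hypothetical field
        producer.produce("events", key=key, value=json.dumps(event).encode("utf-8"))
        producer.poll(0)
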
0 votes, 0 answers

Apache Kafka Kerberos Authentication with Python

I need to develop a Python program which would act as a Kafka Consumer and do some processing based on that. I used kafka-python and it did the job fine with local testing. However, my production environment (RHEL7) requires Kerberos authentication.…
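
confluent-kafka supports Kerberos through librdkafka's GSSAPI SASL mechanism (on RHEL the system cyrus-sasl-gssapi package must be installed). A sketch; broker address, principal and keytab path are placeholders:

    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "broker:9093",                              # placeholder
        "group.id": "kerberos-consumer",                                 # placeholder
        "security.protocol": "SASL_SSL",                                 # or SASL_PLAINTEXT without TLS
        "sasl.mechanisms": "GSSAPI",
        "sasl.kerberos.service.name": "kafka",
        "sasl.kerberos.principal": "myapp@EXAMPLE.COM",                  # placeholder
        "sasl.kerberos.keytab": "/etc/security/keytabs/myapp.keytab",    # placeholder
    })
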
0 votes, 1 answer

Read only new messages in a kafka topic

I'm creating a consumer with confluent-kafka in Python. I want to create it in such a way that if the consumer is restarted, it starts from the last available message in the topic (per partition); it doesn't matter if it leaves behind messages without…
Rodrigo A • 657 • 7 • 23
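
One pattern for this, sketched with placeholder names: an on_assign callback that moves every assigned partition to OFFSET_END before consumption starts, so anything produced while the consumer was down is skipped.

    from confluent_kafka import Consumer, OFFSET_END

    def skip_to_latest(consumer, partitions):
        # On (re)assignment, jump every partition to the end of its log.
        for p in partitions:
            p.offset = OFFSET_END
        consumer.assign(partitions)

    consumer = Consumer({
        "bootstrap.servers": "broker:9092",   # placeholder
        "group.id": "tail-only",              # placeholder
        "auto.offset.reset": "latest",        # also applies when no committed offset exists
    })
    consumer.subscribe(["my_topic"], on_assign=skip_to_latest)   # placeholder topic
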
0 votes, 1 answer

How can we use json.dumps directly in a class constructor, instead of calling it via a separate function?

How can we use json.dumps directly in the constructor, instead of calling it via a separate function? def json_serialize(obj, *args): return json.dumps(obj) class KafkaProducer(object): def __init__(self, config): config = { …
Akash Pagar • 637 • 8 • 22
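
With SerializingProducer the serializer is just part of the configuration, so json.dumps can be wrapped in a lambda right in the constructor; serializer callables receive the object plus a SerializationContext and must return bytes. A sketch with a placeholder broker and topic:

    import json

    from confluent_kafka import SerializingProducer

    producer = SerializingProducer({
        "bootstrap.servers": "broker:9092",   # placeholder
        # Serializers get (obj, SerializationContext) and return bytes.
        "value.serializer": lambda obj, ctx: json.dumps(obj).encode("utf-8"),
    })
    producer.produce("events", value={"status": "ok"})   # placeholder topic
    producer.flush()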