Questions tagged [confluent-kafka-python]

Confluent Kafka Python is a performant implementation of Kafka producers, consumers, and the admin client in Python, based on librdkafka. You can use it to connect to Kafka brokers from Python.

219 questions
3 votes, 2 answers

Converting from python-kafka to confluent-kafka - how to create parity with SASL_SSL, OAUTHBEARER and tokens

I have a working python-kafka setup; this is the code: class TokenProvider(object): def __init__(self,client_id,client_secret): self.client_id = client_id self.client_secret = client_secret def token(self): token_url…
Tampa • 75,446
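
For comparison, the rough confluent-kafka equivalent of a kafka-python TokenProvider is the oauth_cb configuration property, which must return a (token, expiry-time-in-seconds-since-epoch) tuple. A minimal sketch, where fetch_token(), the client credentials, and the broker address are all placeholders, not from the question:

    import time
    from confluent_kafka import Producer

    def fetch_token(client_id, client_secret):
        # Hypothetical helper: call your OAuth token endpoint here and
        # return (access_token, expires_in_seconds).
        raise NotImplementedError

    def oauth_cb(oauth_config):
        # Receives the value of sasl.oauthbearer.config (if set) and must
        # return (token, absolute expiry time in seconds since the epoch).
        token, expires_in = fetch_token("my-client-id", "my-client-secret")
        return token, time.time() + expires_in

    producer = Producer({
        "bootstrap.servers": "broker:9093",
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "OAUTHBEARER",
        "oauth_cb": oauth_cb,
    })
    producer.poll(0)  # trigger the callback so a token is fetched up front
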
3 votes, 1 answer

Python confluent kafka raise exception on broker connection disconnect

I am using python 3.7 and confluent-kafka. Following is the pseudo code which I am using to poll the kafka server and read the message. while True: MSG = CONSUMER.poll(0.1) if MSG is None: …
rohitp • 61
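
Connection-level errors are normally only retried and logged by librdkafka, so one way to surface a disconnect is to register an error_cb and turn transport errors into an application-level failure. A minimal sketch, with the broker address, group and topic as placeholders:

    from confluent_kafka import Consumer, KafkaError, KafkaException

    broker_errors = []

    def on_error(err):
        # Called for global (non-message) errors; _TRANSPORT and
        # _ALL_BROKERS_DOWN indicate connection problems.
        if err.code() in (KafkaError._TRANSPORT, KafkaError._ALL_BROKERS_DOWN):
            broker_errors.append(err)

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "my-group",
        "error_cb": on_error,
    })
    consumer.subscribe(["my-topic"])

    while True:
        msg = consumer.poll(0.1)
        if broker_errors:                    # fail fast instead of looping forever
            raise KafkaException(broker_errors[-1])
        if msg is None:
            continue
        if msg.error():
            raise KafkaException(msg.error())
        print(msg.value())
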
3 votes, 1 answer

How to programmatically update subject schema and compatibility in Confluent Schema Registry

I have a schema already registered in schema registry, which I was able to do using register() like this, from schema_registry.client import SchemaRegistryClient, schema subject_name = "new-schema" schema_url = "https://{{ schemaRegistry }}:8081"…
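
Note that confluent-kafka itself also ships a Schema Registry client (distinct from the python-schema-registry-client package used in the question) that can register a new schema version and change a subject's compatibility level. A minimal sketch, with the registry URL, subject name and schema as placeholders:

    from confluent_kafka.schema_registry import SchemaRegistryClient, Schema

    client = SchemaRegistryClient({"url": "https://schema-registry:8081"})

    avro_schema = Schema(
        '{"type": "record", "name": "Example",'
        ' "fields": [{"name": "id", "type": "string"}]}',
        schema_type="AVRO",
    )

    # Registering under an existing subject creates a new schema version,
    # provided it passes the subject's compatibility check.
    schema_id = client.register_schema("new-schema", avro_schema)

    # Change the compatibility level for just this subject.
    client.set_compatibility("new-schema", level="FULL")
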
3 votes, 1 answer

Sometimes a new consumer group does not work

I've seen this in production once (I don't remember how we solved it) and now I can repeat it in the integration tests, which always start with a brand new Kafka installation. Here's how it goes: Step 1: A consumer of a group that doesn't exist yet…
Antonis Christofides • 6,990
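
One common cause of this symptom (not necessarily the asker's) is that a brand new group has no committed offsets, so the consumer falls back to auto.offset.reset, which defaults to "latest" and silently skips anything produced before the first assignment completes. A minimal config sketch, with broker, group and topic as placeholders:

    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "brand-new-group",
        # A new group has no committed offsets; without this it defaults to
        # "latest" and only sees messages produced after assignment.
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["my-topic"])
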
2 votes, 0 answers

Reset accumulated lag on all kafka partitions

Good afternoon. Given: a Kafka topic with more than 20 partitions, heavy message traffic, and one consumer group. Each time we connect to the topic we need to reset all the accumulated lag (accepting data loss) on all partitions and read it…
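
One way to do this (a sketch, not from the question) is to override the offsets in the on_assign callback so that every (re)assignment starts at the end of each partition, ignoring whatever lag has built up:

    from confluent_kafka import Consumer, OFFSET_END

    def skip_to_end(consumer, partitions):
        # On every assignment, jump past the accumulated lag and start
        # reading from the current end of each partition.
        for tp in partitions:
            tp.offset = OFFSET_END
        consumer.assign(partitions)

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",   # placeholder
        "group.id": "my-group",                  # placeholder
        # Committed positions are irrelevant here, every run skips to the end.
        "enable.auto.commit": False,
    })
    consumer.subscribe(["my-topic"], on_assign=skip_to_end)

    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        print(msg.value())
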
2 votes, 0 answers

Unable to install confluent Kafka (1.7.0) through poetry: not supporting PEP 517 builds

I am trying to pin confluent-kafka to version 1.7.0 in my poetry lock file, so in my toml file I have: confluent_kafka = "1.7.0" But when I run poetry update I get this error message: at…
2 votes, 0 answers

Python - kafka - consume messages between two offsets

I'm an intern trying to write a script that runs as an hourly cron job, collecting messages from a Kafka topic that arrived between two points in time. For example, process messages that arrived between 09:00 AM - 10:00 AM, 10:00 AM - 11:00 AM and…
Rafa S • 45
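
The usual building block here is Consumer.offsets_for_times(), which translates a timestamp (in milliseconds, passed in the offset field) into the earliest offset at or after that time. A minimal sketch with both bounds, assuming placeholder broker, group and topic names and a hypothetical handle() function; edge cases such as partitions with no messages in the window (offsets_for_times returns -1) are left out:

    from datetime import datetime, timezone
    from confluent_kafka import Consumer, TopicPartition

    TOPIC = "my-topic"   # placeholder
    start_ms = int(datetime(2023, 1, 10, 9, 0, tzinfo=timezone.utc).timestamp() * 1000)
    end_ms = int(datetime(2023, 1, 10, 10, 0, tzinfo=timezone.utc).timestamp() * 1000)

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "hourly-export",
        "enable.auto.commit": False,
    })

    # Translate both wall-clock bounds into per-partition offsets.
    partition_ids = consumer.list_topics(TOPIC, timeout=10).topics[TOPIC].partitions
    start_tps = consumer.offsets_for_times(
        [TopicPartition(TOPIC, p, start_ms) for p in partition_ids], timeout=10)
    end_offsets = {tp.partition: tp.offset for tp in consumer.offsets_for_times(
        [TopicPartition(TOPIC, p, end_ms) for p in partition_ids], timeout=10)}

    consumer.assign(start_tps)           # start reading at the first bound
    finished = set()
    while len(finished) < len(start_tps):
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        end = end_offsets[msg.partition()]
        if end >= 0 and msg.offset() >= end:
            finished.add(msg.partition())   # this partition reached the second bound
            continue
        handle(msg)                          # hypothetical processing function

    consumer.close()
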
2 votes, 0 answers

confluent_kafka python consumer.poll() function exits the entire code

I am following this Kafka tutorial, in which I changed the video frames to image inputs. In my case, consumer.poll() does not return anything and the code exits. The Docker terminal displays this: kafka1 | [2023-01-10…
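
A frequent cause of "poll() returns nothing" is treating a None return (a timeout) or an error event as fatal. A minimal loop that keeps polling, with the topic name and image handler as placeholders:

    from confluent_kafka import Consumer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "frame-consumer",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["frames"])

    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            # No message arrived within the timeout; keep polling.
            continue
        if msg.error():
            print("Consumer error:", msg.error())
            continue
        handle_image(msg.value())  # hypothetical handler for the image payload
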
2 votes, 0 answers

Can't initialize transactions

I have local instances of Kafka and Zookeeper running in Docker on my local machine (macOS Monterey), which have worked quite well for my needs so far. I recently wanted to implement something with Kafka transactions and ran into the problem that I…
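
For reference, the basic transactional flow in confluent-kafka looks like the sketch below (broker address, transactional.id and topic are placeholders). On a single-broker Docker setup the broker usually also needs transaction.state.log.replication.factor=1 and transaction.state.log.min.isr=1, otherwise init_transactions() cannot create the transaction state topic and times out:

    from confluent_kafka import Producer

    producer = Producer({
        "bootstrap.servers": "localhost:9092",
        "transactional.id": "my-transactional-app",
    })

    producer.init_transactions(10)   # blocks until the coordinator is ready
    producer.begin_transaction()
    producer.produce("my-topic", key="k", value="v")
    producer.commit_transaction()
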
2 votes, 2 answers

confluent_kafka: how to reliably seek before reading data (avoiding Erroneous state)

I'm trying to switch Python code from aiokafka to confluent_kafka and having problems with reading historical data. The system has only one producer for a given topic, and several independent consumers (each with a separate group ID). When each…
Russell Owen • 393
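
The "Erroneous state" error typically means seek() was called before the partition was actively being fetched. One reliable alternative (a sketch, with broker, group and topic as placeholders) is to set the desired offsets on the assignment itself inside on_assign, so no separate seek() is needed:

    from confluent_kafka import Consumer, OFFSET_BEGINNING

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "history-reader",
    })

    def rewind(consumer, partitions):
        # seek() only works once a partition is actively being fetched,
        # which is what causes "Erroneous state" when called too early.
        # Setting offsets on the assignment avoids the race entirely.
        for tp in partitions:
            tp.offset = OFFSET_BEGINNING   # or an offset from offsets_for_times()
        consumer.assign(partitions)

    consumer.subscribe(["my-topic"], on_assign=rewind)
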
2 votes, 2 answers

How to delete and then create topics correctly in a script for Kafka?

I am working on a script to refresh my topics on an AWS managed Kafka cluster. I need to wipe out the existing data whenever I run the script and I did it by deleting and creating the same topics again. I expect the script to print out successful…
dhu • 718
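
With the AdminClient, both delete_topics() and create_topics() return futures per topic. Deletion completes asynchronously on the brokers, so recreating the same names immediately can still fail; a sketch that waits on the futures and pauses before recreating (broker address, topic names and sizing are placeholders):

    import time
    from confluent_kafka.admin import AdminClient, NewTopic

    admin = AdminClient({"bootstrap.servers": "localhost:9092"})
    topics = ["events", "metrics"]   # placeholder topic names

    # Wait for the controller to acknowledge each deletion.
    for topic, fut in admin.delete_topics(topics, operation_timeout=30).items():
        try:
            fut.result()
            print(f"Deleted {topic}")
        except Exception as exc:
            print(f"Delete failed for {topic}: {exc}")

    # Deletion is still asynchronous on the brokers; pause (or retry creation
    # on TOPIC_ALREADY_EXISTS) before recreating the same names.
    time.sleep(5)

    new_topics = [NewTopic(t, num_partitions=3, replication_factor=2) for t in topics]
    for topic, fut in admin.create_topics(new_topics, operation_timeout=30).items():
        try:
            fut.result()
            print(f"Created {topic}")
        except Exception as exc:
            print(f"Create failed for {topic}: {exc}")
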
2 votes, 1 answer

Confluent-Kafka-Python: Get lag per topic partition

With confluent-kafka-python, I was wondering whether, via the Admin API or any of the other APIs, I can fetch lag per topic partition. I found this link helpful, but wanted to check whether a direct API is available instead?
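
There is no single "lag" call, but lag can be derived from the committed offsets and the high watermarks. A sketch computing it per partition, with the broker, topic and group names as placeholders:

    from confluent_kafka import Consumer, TopicPartition

    TOPIC = "my-topic"   # placeholder
    GROUP = "my-group"   # the consumer group whose lag we want

    consumer = Consumer({"bootstrap.servers": "localhost:9092", "group.id": GROUP})

    partition_ids = consumer.list_topics(TOPIC, timeout=10).topics[TOPIC].partitions
    committed = consumer.committed(
        [TopicPartition(TOPIC, p) for p in partition_ids], timeout=10)

    for tp in committed:
        low, high = consumer.get_watermark_offsets(tp, timeout=10)
        # tp.offset is negative (OFFSET_INVALID) if the group never committed here;
        # fall back to the low watermark in that case.
        current = tp.offset if tp.offset >= 0 else low
        print(f"partition {tp.partition}: lag = {high - current}")

    consumer.close()
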
2 votes, 1 answer

Confluent kafka python API - how to get number of partitions in a topic

I would like to get the number of partitions within a topic, but the API is difficult to understand at best. I found the following, but the topic information doesn't contain the number of partitions. import confluent_kafka from…
hi im Bacon • 374
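
The partition count is in the cluster metadata returned by list_topics(); the .partitions attribute of the TopicMetadata is a dict keyed by partition id. A sketch, with the broker address and topic name as placeholders:

    from confluent_kafka.admin import AdminClient

    admin = AdminClient({"bootstrap.servers": "localhost:9092"})

    # list_topics() returns ClusterMetadata; .topics maps topic name to
    # TopicMetadata, whose .partitions dict is keyed by partition id.
    metadata = admin.list_topics(topic="my-topic", timeout=10)
    num_partitions = len(metadata.topics["my-topic"].partitions)
    print(num_partitions)
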
2 votes, 2 answers

How to consume messages in last N days using confluent-kafka-python?

This question is similar to Python KafkaConsumer start consuming messages from a timestamp except I want to know how to do it in the official Python Kafka client by Confluent. I looked into the Consumer.offsets_for_times function but I'm confused by…
wxh • 619
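
offsets_for_times() expects the cutoff timestamp (in milliseconds) in each TopicPartition's offset field and returns the earliest offset at or after that time, which can then be fed straight back into assign(). A sketch starting seven days back, with broker, group and topic names as placeholders:

    import time
    from confluent_kafka import Consumer

    TOPIC = "my-topic"                                   # placeholder
    since_ms = int((time.time() - 7 * 86400) * 1000)     # 7 days ago, in ms

    def start_seven_days_back(consumer, partitions):
        # Put the timestamp in the offset field, resolve it to real offsets,
        # and assign the result.
        for tp in partitions:
            tp.offset = since_ms
        consumer.assign(consumer.offsets_for_times(partitions, timeout=10))

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "replay-last-week",
    })
    consumer.subscribe([TOPIC], on_assign=start_seven_days_back)
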
2 votes, 2 answers

Kafka/questDB JDBC Sink Connector: tables not created using "topics.regex"

I am using "confluentinc/kafka-connect-jdbc:10.2.6" as my JDBC connector to transfer Kafka topics into my questDB. When I provide explicit topic names it works as expected, but when I use topic names based on a regex it does not work; the…
CarloP • 99