Questions tagged [confluent-kafka-python]

Confluent Kafka Python is a performant implementation of Kafka producers, consumers, and the admin client in Python, based on librdkafka.

Confluent Kafka Python is a performant implementation of a Kafka client library based on librdkafka. You can use it to connect to Kafka brokers in Python.

219 questions
0
votes
1 answer

confluent-kafka Python package in Snowpark

I am looking for a way to connect to Confluent Platform to consume messages into Snowflake directly. I do not believe the Confluent Snowflake Sink Connector will work, as we are using Confluent Platform, not cloud. I tried importing the…
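A minimal sketch of the approach outside Snowpark, assuming the self-managed Confluent Platform brokers are reachable and that snowflake-connector-python is installed; the broker address, credentials, topic, and the RAW_EVENTS target table are placeholders rather than anything from the question.

# Hypothetical sketch: consume from Confluent Platform with confluent-kafka
# and insert rows into Snowflake via snowflake-connector-python.
from confluent_kafka import Consumer
import snowflake.connector

consumer = Consumer({
    'bootstrap.servers': 'broker1:9092',   # placeholder broker
    'group.id': 'snowflake-loader',
    'auto.offset.reset': 'earliest',
})
consumer.subscribe(['payments'])           # placeholder topic

conn = snowflake.connector.connect(
    user='USER', password='PASSWORD', account='ACCOUNT',   # placeholders
    warehouse='WH', database='DB', schema='PUBLIC',
)
cur = conn.cursor()
try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        # RAW_EVENTS(payload STRING) is an assumed target table.
        cur.execute("INSERT INTO RAW_EVENTS (payload) VALUES (%s)",
                    (msg.value().decode('utf-8'),))
finally:
    consumer.close()
    conn.close()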
0
votes
0 answers

Extracting values from TopicPartition - confluent kafka

Trying to extract the tuple values from TopicPartition, like topic, partition, offset, etc. Could not locate a helper method. Added a code snippet below: from confluent_kafka import DeserializingConsumer conf = { 'bootstrap.servers':…
madmatrix
  • 205
  • 1
  • 4
  • 12
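For reference, TopicPartition exposes its values as plain attributes (topic, partition, offset) rather than through helper methods. A minimal sketch, with the broker address and topic name as placeholders:

from confluent_kafka import Consumer, TopicPartition

consumer = Consumer({'bootstrap.servers': 'localhost:9092',  # placeholder
                     'group.id': 'tp-demo'})

# committed() returns TopicPartition objects; their values are plain attributes.
tps = consumer.committed([TopicPartition('my-topic', 0)])    # placeholder topic
for tp in tps:
    print(tp.topic, tp.partition, tp.offset, tp.error)
consumer.close()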
0
votes
1 answer

Cannot get offset by timestamp in redpanda cluster

I deployed a redpanda cluster and would like to query the offset by timestamp. I first tried the confluent-kafka Python library: import confluent_kafka as ck import uuid c = ck.Consumer({ 'bootstrap.servers': 'redpanda-bootstrap.example.com:9094', …
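A minimal sketch of an offsets_for_times lookup, reusing the bootstrap address from the question; the topic, partition, and timestamp are placeholders. The timestamp is passed in the offset field of the TopicPartition, in milliseconds since the epoch.

import confluent_kafka as ck

consumer = ck.Consumer({
    'bootstrap.servers': 'redpanda-bootstrap.example.com:9094',  # from the question
    'group.id': 'ts-lookup',
})
target_ms = 1700000000000                                        # placeholder timestamp (ms)
# Encode the target timestamp in the offset field, then ask the broker
# which offset corresponds to it.
query = [ck.TopicPartition('my-topic', 0, target_ms)]            # placeholder topic/partition
result = consumer.offsets_for_times(query, timeout=10)
for tp in result:
    print(tp.topic, tp.partition, tp.offset)   # offset is -1 if no message at/after that time
consumer.close()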
0
votes
1 answer

confluent kafka python - certificate verification

I used a simple producer on Windows, but when I tried to run it on Ubuntu I got: SSL handshake failed: error:0A000086:SSL routines::certificate verify failed: broker certificate could not be verified, verify that ssl.ca.location is correctly…
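That error usually means librdkafka cannot find or validate the CA chain that signed the broker certificate. A minimal sketch assuming an SSL listener; the broker address, topic, and CA path (here Ubuntu's default system bundle) are placeholders:

from confluent_kafka import Producer

producer = Producer({
    'bootstrap.servers': 'broker.example.com:9093',   # placeholder
    'security.protocol': 'SSL',
    # Path to the CA certificate that signed the broker certificate (placeholder path).
    'ssl.ca.location': '/etc/ssl/certs/ca-certificates.crt',
})
producer.produce('test-topic', b'hello')              # placeholder topic
producer.flush()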
0
votes
0 answers

What are possible reasons that I only sometimes get messages from a Kafka topic?

I'm using the confluent_kafka package in Python to do event streaming locally. There are multiple stages in my program. The first stage sends its output as a message to a topic designated for the next stage. Each subsequent stage does the same (retrieve…
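Two common causes are producing without serving delivery callbacks (or flushing) and consumers that start at the default 'latest' offset. A minimal sketch of one stage, with the broker address and topic names as placeholders:

from confluent_kafka import Consumer, Producer

consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',  # placeholder
    'group.id': 'stage-2',
    'auto.offset.reset': 'earliest',        # pick up messages produced before this stage started
})
consumer.subscribe(['stage-1-out'])         # placeholder topic names
producer = Producer({'bootstrap.servers': 'localhost:9092'})

def on_delivery(err, msg):
    if err is not None:
        print('delivery failed:', err)      # surfaces silently dropped messages

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        result = msg.value().upper()        # stand-in for the real processing
        producer.produce('stage-2-out', result, on_delivery=on_delivery)
        producer.poll(0)                    # serve delivery callbacks
finally:
    producer.flush()
    consumer.close()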
0
votes
1 answer

How to correctly deserialize the Avro data generated by Debezium

I'm using debezium to capture data change from Mysql, the connect configuration is: { "name": "avro-mysql-cdc-payments-connector", "config": { "key.converter": "io.confluent.connect.avro.AvroConverter", …
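Since the connector uses io.confluent.connect.avro.AvroConverter, the messages are in the Confluent Schema Registry wire format, so a registry-aware deserializer is the usual route. A minimal sketch, with the registry URL, broker address, and topic name as placeholders:

from confluent_kafka import DeserializingConsumer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer

sr_client = SchemaRegistryClient({'url': 'http://localhost:8081'})   # placeholder

consumer = DeserializingConsumer({
    'bootstrap.servers': 'localhost:9092',                           # placeholder
    'group.id': 'cdc-reader',
    'auto.offset.reset': 'earliest',
    # Without an explicit schema string, the deserializer fetches the writer
    # schema that the connector registered.
    'key.deserializer': AvroDeserializer(sr_client),
    'value.deserializer': AvroDeserializer(sr_client),
})
consumer.subscribe(['avro-mysql-cdc-payments'])                      # placeholder topic

while True:
    msg = consumer.poll(1.0)
    if msg is None:
        continue
    print(msg.key(), msg.value())   # plain dicts after deserialization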
0
votes
0 answers

Getting 0 records from the partition in a Kafka consumer message while reading partitions in parallel using the multiprocessing package in Python

from multiprocessing import Pool part_process_list = [{"topic":"","partition":1,"s_offset":"","e_at":""},{"topic":"","partition":2,"s_offset":"","e_at":""},{"topic":"","partition":3,"s_offset":"","e_at":""}] with Pool(5) as executor: …
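Consumer handles cannot be shared across processes, so each worker should build its own Consumer and assign() its partition. A minimal sketch under that assumption; the broker address, topic, and starting offsets are placeholders:

from multiprocessing import Pool
from confluent_kafka import Consumer, TopicPartition

def read_partition(spec):
    # Each worker process creates its own Consumer; handles are not picklable.
    consumer = Consumer({
        'bootstrap.servers': 'localhost:9092',   # placeholder
        'group.id': 'parallel-readers',
        'auto.offset.reset': 'earliest',
    })
    consumer.assign([TopicPartition(spec['topic'], spec['partition'], spec['s_offset'])])
    count = 0
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            break                                 # assume caught up, for this sketch
        if not msg.error():
            count += 1
    consumer.close()
    return spec['partition'], count

if __name__ == '__main__':
    part_process_list = [
        {'topic': 'my-topic', 'partition': p, 's_offset': 0} for p in (1, 2, 3)
    ]                                             # placeholder topic and offsets
    with Pool(3) as pool:
        print(pool.map(read_partition, part_process_list))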
0
votes
1 answer

Consume only the unprocessed messages from a kafka topic

I'm a beginner in Kafka and am trying to consume the latest unconsumed or unprocessed messages on a topic, and below is the function I came up with. It works fine but has a logical problem: it returns the last consumed message again and…
Rafa S
  • 45
  • 5
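With a stable group.id and offsets committed only after processing, the consumer resumes at the first uncommitted message on restart. A minimal sketch, with the broker address and topic as placeholders and process() standing in for a hypothetical processing function:

from confluent_kafka import Consumer

consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',   # placeholder
    'group.id': 'my-app',                    # keep this stable across runs
    'auto.offset.reset': 'earliest',         # only used when no committed offset exists
    'enable.auto.commit': False,
})
consumer.subscribe(['my-topic'])             # placeholder topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        process(msg.value())                 # hypothetical processing function
        # Commit only after successful processing, so this message is not re-read.
        consumer.commit(message=msg, asynchronous=False)
finally:
    consumer.close()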
0
votes
0 answers

Can't connect to Kafka with confluent-kafka python library

I am using the KafkaProducer library to publish messages as follows: from kafka import KafkaProducer ssl_produce = KafkaProducer(bootstrap_servers='xxxx:xxx', security_protocol='SASL_SSL', …
user3579222
  • 1,103
  • 11
  • 28
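The excerpt uses kafka-python's KafkaProducer; the equivalent configuration for the confluent-kafka Producer looks roughly like the sketch below, with the SASL mechanism, credentials, and CA path as placeholders that depend on the cluster:

from confluent_kafka import Producer

producer = Producer({
    'bootstrap.servers': 'xxxx:xxx',          # placeholder, as in the question
    'security.protocol': 'SASL_SSL',
    'sasl.mechanisms': 'PLAIN',               # or SCRAM-SHA-256/512, per the cluster
    'sasl.username': 'USER',                  # placeholders
    'sasl.password': 'PASSWORD',
    'ssl.ca.location': '/path/to/ca.pem',     # placeholder CA bundle
})
producer.produce('my-topic', b'hello')        # placeholder topic
producer.flush()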
0
votes
0 answers

"errorMessage": "Unable to import module 'lambda_function': No module named 'confluent_kafka.cimpl'",

Stuck with this message: "errorMessage": "Unable to import module 'lambda_function': No module named 'confluent_kafka.cimpl'", and I'm sure that if I unzip my confluentlayers.zip, the confluent_kafka folder is inside it. Been following multiple…
0
votes
0 answers

Kafka Consumer ID disappears

I am facing this issue quite often: the Consumer ID disappears for a topic (see the attached image). I am assuming that means there are no consumers available to consume messages. What could be the reason, any idea? I am using Confluent Kafka 5.5…
st_bones
  • 119
  • 1
  • 3
  • 12
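A member drops out of the group if poll() is not called within max.poll.interval.ms or its session times out, which would make the consumer ID disappear from the topic view. A minimal sketch that keeps polling and logs rebalances, with the broker address and topic as placeholders:

from confluent_kafka import Consumer

def on_assign(consumer, partitions):
    print('assigned:', partitions)

def on_revoke(consumer, partitions):
    print('revoked:', partitions)            # logged when this member leaves the group

consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',   # placeholder
    'group.id': 'my-group',
    'session.timeout.ms': 30000,
    'max.poll.interval.ms': 300000,          # poll() must be called within this window
})
consumer.subscribe(['my-topic'], on_assign=on_assign, on_revoke=on_revoke)

while True:
    msg = consumer.poll(1.0)                 # keep polling even while processing is slow
    if msg is None or msg.error():
        continue
    # ... process msg ...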
0
votes
0 answers

Kafka EOS read-process-write missing some messages

I am currently experimenting with Kafka Exactly-Once Semantics in Python with the Confluent Kafka library. I have 3 programs: one sending incremental integers to a topic called INPUT_TOPIC with 2 partitions. The second one implements…
TheProphet
  • 21
  • 2
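A minimal sketch of the read-process-write transaction loop, assuming a read_committed consumer and a transactional producer; the broker address, output topic, and transactional.id are placeholders:

from confluent_kafka import Consumer, Producer

consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',        # placeholder
    'group.id': 'eos-app',
    'isolation.level': 'read_committed',          # don't read aborted transactions
    'enable.auto.commit': False,
})
consumer.subscribe(['INPUT_TOPIC'])

producer = Producer({
    'bootstrap.servers': 'localhost:9092',
    'transactional.id': 'eos-app-1',              # placeholder; unique per producer instance
})
producer.init_transactions()

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    producer.begin_transaction()
    producer.produce('OUTPUT_TOPIC', msg.value())  # placeholder output topic
    # Commit the consumed offsets inside the same transaction.
    producer.send_offsets_to_transaction(
        consumer.position(consumer.assignment()),
        consumer.consumer_group_metadata())
    producer.commit_transaction()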
0
votes
1 answer

Python: ssl.ca.location error to read messages from Kafka

Need help: the code below works fine when I run it on a Linux (VM) system, but the same code throws an error when run on Kubernetes. I am using the same certificate file on both the Linux and Kubernetes systems. I validated that the file is present in both locations,…
AnumNuma
  • 31
  • 3
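Inside a container the CA bundle is often missing or at a different path than on the VM. A minimal sketch using certifi's bundle, assuming the broker certificate chains to a public CA; for a private CA, mount the certificate into the pod and point ssl.ca.location at that path instead. The broker address and topic are placeholders:

import certifi
from confluent_kafka import Consumer

consumer = Consumer({
    'bootstrap.servers': 'broker.example.com:9093',   # placeholder
    'group.id': 'k8s-reader',
    'security.protocol': 'SSL',
    # certifi ships a CA bundle that exists regardless of the base image.
    'ssl.ca.location': certifi.where(),
})
consumer.subscribe(['my-topic'])                      # placeholder topic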
0
votes
1 answer

How can I consume data from specific partitions in Kafka using Confluent Python?

I tried this code, but it is consuming from other partitions as well; my requirement is to consume only from partitions 1 and 2: ConsumerKafka.subscribe(topic_list) ConsumerKafka.assign([TopicPartition(topic_name, 1, 2)])
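In TopicPartition(topic_name, 1, 2) the third argument is an offset, not a second partition, and calling subscribe() alongside assign() can let group rebalancing override the manual assignment. A minimal sketch that assigns one TopicPartition per wanted partition, with the broker address and topic as placeholders:

from confluent_kafka import Consumer, TopicPartition

ConsumerKafka = Consumer({
    'bootstrap.servers': 'localhost:9092',   # placeholder
    'group.id': 'partition-reader',
})
# Do not call subscribe(); assign() alone pins the consumer to these partitions.
ConsumerKafka.assign([
    TopicPartition('my-topic', 1),           # placeholder topic name
    TopicPartition('my-topic', 2),
])

while True:
    msg = ConsumerKafka.poll(1.0)
    if msg is None or msg.error():
        continue
    print(msg.partition(), msg.value())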
0
votes
1 answer

Deserialize avro message without schema

I have to decode Avro messages without any schema. There are some fields where the logical type is defined as date, and there are garbage values in the field; because of that, all the records are being rejected. Is there a way to consume…
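Confluent-framed Avro cannot be decoded without some schema, but the 5-byte header carries the writer schema ID, so the schema can be fetched from the registry and the date logicalType stripped, letting the field come back as a raw int instead of rejecting the record. A hedged sketch, assuming fastavro is available, a placeholder registry URL, and record-style schemas; union handling here is simplified:

import io
import json
import struct
import fastavro
from confluent_kafka.schema_registry import SchemaRegistryClient

sr_client = SchemaRegistryClient({'url': 'http://localhost:8081'})   # placeholder

def decode(payload: bytes):
    # Confluent wire format: magic byte 0, then a 4-byte big-endian schema id.
    magic, schema_id = struct.unpack('>bI', payload[:5])
    schema = json.loads(sr_client.get_schema(schema_id).schema_str)
    # Drop logicalType annotations so date fields are read as raw ints
    # instead of failing on out-of-range values.
    for field in schema.get('fields', []):
        types = field['type'] if isinstance(field['type'], list) else [field['type']]
        for t in types:
            if isinstance(t, dict):
                t.pop('logicalType', None)
    parsed = fastavro.parse_schema(schema)
    return fastavro.schemaless_reader(io.BytesIO(payload[5:]), parsed)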