Questions tagged [kafka-python]

kafka-python provides low-level protocol support for Apache Kafka as well as high-level consumer and producer classes. Request batching is supported by the protocol as well as broker-aware request routing. Gzip and Snappy compression is also supported for message sets.

For more details about the Python Kafka client API, please refer to https://kafka-python.readthedocs.io/en/latest/

443 questions
0
votes
2 answers

Parallelism at Kafka Topics or Partitions Level

In order to separate my data based on a key, should I use multiple topics or multiple partitions within the same topic? I'm asking on the basis of overheads, computation, data storage, and the load caused on the server.
Prannoy Mittal
  • 1,525
  • 5
  • 21
  • 32
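A common answer to the question above is to prefer one topic with multiple partitions when the data is homogeneous: records that share a key always land in the same partition, so keyed separation comes for free. The sketch below illustrates the idea with a simplified stand-in for the default key-hash partitioner (kafka-python actually uses murmur2; Python's built-in `hash` is used here only for illustration, and the key names are made up):

```python
# Sketch: how keyed records separate across partitions of ONE topic.
# Records with the same key always map to the same partition, so you
# rarely need one topic per key.
def partition_for(key: bytes, num_partitions: int) -> int:
    """Simplified stand-in for the default key-hash partitioner
    (kafka-python really uses murmur2, not Python's hash())."""
    return hash(key) % num_partitions

events = [(b"sensor-a", 1), (b"sensor-b", 2), (b"sensor-a", 3)]
# Every b"sensor-a" event lands on the same partition within a run.
placement = {key: partition_for(key, 3) for key, _ in events}
```

Multiple topics mainly pay off when the datasets have different retention, consumers, or schemas; otherwise partitions are the cheaper unit of parallelism.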
0
votes
1 answer

Reading oldest available message in Kafka using KafkaConsumer instance of kafka-python kafka client

I am trying to read messages with a Kafka consumer using the following command: bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test --from-beginning With this we can read old messages from about 4 days back, as we have set the retention time on the Kafka server…
Joy
  • 4,197
  • 14
  • 61
  • 131
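The kafka-python equivalent of the console consumer's `--from-beginning` flag is `auto_offset_reset="earliest"` together with a consumer group that has no committed offsets. A minimal sketch, assuming a broker at `localhost:9092` (the group id is made up; actually polling requires a running broker, so the client import is deferred):

```python
# Replaying from the oldest retained message with kafka-python.
# auto_offset_reset="earliest" only applies when the group has no
# committed offsets, hence the fresh group_id and disabled auto-commit.
EARLIEST_CONFIG = {
    "bootstrap_servers": "localhost:9092",  # assumed broker address
    "auto_offset_reset": "earliest",        # start at the oldest retained offset
    "group_id": "replay-demo",              # fresh group => no committed offsets
    "enable_auto_commit": False,
}

def create_replay_consumer(topic: str):
    """Build a consumer that reads from the beginning (needs a live broker)."""
    from kafka import KafkaConsumer  # deferred: requires kafka-python installed
    return KafkaConsumer(topic, **EARLIEST_CONFIG)
```

Iterating over the returned consumer then yields everything still within the broker's retention window.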
-1
votes
1 answer

Kafka-python KafkaProducer __init__ takes 1 positional argument but 2 were given

I am encountering the following error: Traceback (most recent call last): connection = KafkaProducer(kafka_settings['topic'], bootstrap_servers=kafka_settings['bootstrap_servers']) TypeError: __init__() takes 1 positional argument but 2 were…
morhc
  • 194
  • 11
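The traceback in this question arises because `KafkaProducer` accepts only keyword arguments; the topic name is not a constructor parameter but the first argument of `send()`. A corrected sketch (the `kafka_settings` keys follow the asker's traceback; a running broker is required to actually execute it, so the client import is deferred):

```python
# Fix for: TypeError: __init__() takes 1 positional argument but 2 were given.
# The topic must move from the KafkaProducer constructor to send().
def publish(kafka_settings: dict, payload: bytes) -> None:
    """Send one message; assumes kafka-python is installed and a broker is up."""
    from kafka import KafkaProducer  # deferred: needs kafka-python + broker
    producer = KafkaProducer(
        bootstrap_servers=kafka_settings["bootstrap_servers"]  # keyword-only config
    )
    producer.send(kafka_settings["topic"], payload)  # topic belongs to send()
    producer.flush()  # block until the message is actually delivered
```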
-1
votes
1 answer

KafkaConnectionError: 111 ECONNREFUSED

I want to produce to and consume from a Kafka topic through a simple Python script. As explained in: https://towardsdatascience.com/getting-started-with-apache-kafka-in-python-604b3250aa05 I have created publish_message and connect_kafka_producer using…
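`ECONNREFUSED` means nothing is accepting TCP connections at the configured host:port — typically the broker is not running, the port is wrong, or (in Docker setups) the advertised listener points elsewhere. Before debugging kafka-python itself, a plain socket probe can confirm whether the broker is reachable at all (the `localhost:9092` default follows the linked tutorial):

```python
# Quick pre-check that separates "broker not listening" from
# kafka-python configuration problems.
import socket

def broker_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers ConnectionRefusedError and timeouts
        return False

# Typical usage before constructing a producer:
# broker_reachable("localhost", 9092)
```

If this returns False, fix the broker address or its listener configuration before touching the producer code.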
-1
votes
2 answers

How can I put data into a Kafka producer?

producer.send() accepts 2 parameters: one is the Kafka topic and the second is the generated output. How can we make kafkaproducer.py using the scripts below? Kindly help me to merge both Python scripts into a single file so that we can use this script to push the data…
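Merging a data-generating script with a producer script usually amounts to calling `producer.send(topic, value)` once per generated record inside a single file. A minimal one-file sketch — the record generator, topic name, and JSON serialization are assumptions standing in for the asker's two scripts, and the deferred import means actually running `publish_all` needs kafka-python and a live broker:

```python
# One-file sketch: generate records, then push each one with
# producer.send(topic, value) -- the two parameters from the question.
import json

def generate_records():
    """Hypothetical stand-in for the asker's output-generating script."""
    for i in range(3):
        yield {"id": i, "value": i * i}

def publish_all(topic: str = "demo-topic"):
    from kafka import KafkaProducer  # deferred: needs kafka-python + broker
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",              # assumed broker
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    for record in generate_records():
        producer.send(topic, record)  # topic + generated output, per record
    producer.flush()
```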
-1
votes
2 answers

How to push unique messages to different partitions of a topic

I have created a topic in Kafka with a partition count of 3; now I want to push a unique message to each of these three partitions. Is there any way to do it? I checked that producer.send pushes duplicate messages to all partitions. For testing I am using…
Avinash
  • 2,093
  • 4
  • 28
  • 41
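Each `producer.send()` call targets exactly one partition, so duplicates across partitions usually come from the calling code sending the same message repeatedly. To place distinct messages on distinct partitions explicitly, kafka-python's `send()` accepts a `partition` argument. The runnable part below is the pure assignment logic (round-robin, an assumption for illustration); the actual send is shown in a comment because it needs a broker:

```python
# producer.send() never fans one record out to all partitions; each send
# targets a single partition. Assign partitions explicitly like this:
from itertools import cycle

def assign_partitions(messages, num_partitions=3):
    """Round-robin each message onto its own partition (pure logic)."""
    rr = cycle(range(num_partitions))
    return [(next(rr), msg) for msg in messages]

assignments = assign_partitions([b"m0", b"m1", b"m2"])

# With kafka-python this becomes (broker required):
#   for partition, msg in assignments:
#       producer.send("my-topic", value=msg, partition=partition)
```

Alternatively, giving each message a distinct `key` lets the default partitioner spread them without naming partitions explicitly.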
-1
votes
1 answer

PySpark: processing stream data and saving the processed data to a file

I am trying to replicate a device that is streaming its location coordinates, then process the data and save it to a text file. I am using Kafka and Spark Streaming (on PySpark); this is my architecture: 1- Kafka producer emits data to a topic…
MrRobot
  • 483
  • 1
  • 7
  • 18
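The last step of the architecture above — saving processed records to a text file — can be isolated as a plain function; in Spark Streaming it would typically be invoked once per micro-batch, e.g. from `DStream.foreachRDD`. The record format (latitude/longitude pairs) and the file name are assumptions for illustration:

```python
# Sketch of the file-sink step of the pipeline. In PySpark this function
# would run per micro-batch, e.g.:
#   stream.foreachRDD(lambda rdd: save_batch(rdd.collect(), "coords.txt"))
def save_batch(records, path):
    """Append one processed (lat, lon) record per line to a text file."""
    with open(path, "a", encoding="utf-8") as f:
        for lat, lon in records:
            f.write(f"{lat},{lon}\n")

save_batch([(48.85, 2.35), (51.5, -0.12)], "coords.txt")
```

Appending (mode `"a"`) rather than overwriting matters here, since each micro-batch must add to the same output file.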
-2
votes
1 answer

I need help removing hard-coded values from a kafka-python consumer

For example: topic_name = "auto-remediation", address = "sape-zookeeper-1". I need to remove the hard-coded values and pass variables here: consumer = KafkaConsumer(b'auto-remediation', bootstrap_servers='sape-zookeeper-1:9092')
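One common way to remove such hard-coded literals is to read them from environment variables with the original values as defaults. A sketch — the variable names `KAFKA_TOPIC` and `KAFKA_BOOTSTRAP` are assumptions, and building the consumer still requires kafka-python and a reachable broker, hence the deferred import:

```python
# Replace the hard-coded literals with environment-driven configuration;
# the defaults echo the asker's original values.
import os

topic_name = os.environ.get("KAFKA_TOPIC", "auto-remediation")
bootstrap = os.environ.get("KAFKA_BOOTSTRAP", "sape-zookeeper-1:9092")

def build_consumer():
    """Construct the consumer from the variables above (needs a broker)."""
    from kafka import KafkaConsumer  # deferred: requires kafka-python
    return KafkaConsumer(topic_name, bootstrap_servers=bootstrap)
```

The same values could equally come from `argparse` flags or a config file; the point is that only this one module knows where they live.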