
Sample code:

consumer = KafkaConsumer(config["kafka"]["input"],
                         bootstrap_servers=config["kafka"]["brokers"].split(','),
                         value_deserializer=lambda m: json.loads(m.decode('ascii')),
                         enable_auto_commit=config["kafka"]["auto_commit"],
                         auto_commit_interval_ms=config["kafka"]["commit_interval"],
                         group_id=config["kafka"]["group"],
                         consumer_timeout_ms=config["kafka"]["timeout"]
                        )

I tried max_poll_records, fetch_max_bytes, and even the consumer.poll() method, but nothing worked.

1 Answer


The confluent-kafka Python library lets you limit the number of messages returned per call.

Use:

consumer.consume(num_messages=config["kafka"]["number_of_messages_to_consume"],
                 timeout=config["kafka"]["timeout"])

to limit the consumption of messages.

The kafka-python library does not support this.

tomerpacific