How can I force a fixed number of messages to be read from Kafka per batch? My service reads a random amount instead of the specified one.
In my Kafka settings I set the option max-poll-records: "500":
spring:
  main:
    allow-bean-definition-overriding: true
  kafka:
    listener:
      type: batch
    consumer:
      enable-auto-commit: true
      auto-offset-reset: latest
      group-id: my-app
      max-poll-records: "500"
      fetch-min-size: "1000MB"
      bootstrap-servers: "localhost:9092"
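For reference, here are the raw Apache Kafka consumer keys that, as I understand it, the Spring Boot properties above map to. This is only a minimal sketch using plain java.util.Properties (no Spring, no broker), with the same values as my YAML:

```java
import java.util.Properties;

public class ConsumerProps {
    public static Properties build() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "my-app");
        props.put("enable.auto.commit", "true");
        props.put("auto.offset.reset", "latest");
        // Upper bound on records returned by a single poll(), not a guarantee:
        props.put("max.poll.records", "500");
        // fetch-min-size: "1000MB" becomes fetch.min.bytes, expressed in bytes:
        props.put("fetch.min.bytes", String.valueOf(1000L * 1024 * 1024));
        return props;
    }

    public static void main(String[] args) {
        System.out.println(build().getProperty("max.poll.records"));
    }
}
```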
which sets how many records should be read in a single poll (500 messages),
and I set a second parameter, setIdleBetweenPolls, in my Kafka configuration class:
@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactoryBatch(
        ConsumerFactory<String, String> consumerFactory) {
    ConcurrentKafkaListenerContainerFactory<String, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    var properties = new HashMap<String, Object>();
    properties.putAll(consumerFactory.getConfigurationProperties());
    factory.setConsumerFactory(new DefaultKafkaConsumerFactory<>(properties));
    factory.setBatchListener(true);
    // setIdleBetweenPolls takes a long (milliseconds), not a String
    factory.getContainerProperties().setIdleBetweenPolls(5000L);
    return factory;
}
This is the polling interval, 5 seconds.
In other words, every 5 seconds the service should read 500 messages from Kafka, then another 500, then another 500, and so on.
Main problem: when I send 20, 50 or 100 messages to Kafka, there is no problem: the service reads all of them in one batch. But if I send 500 messages, or for example 10,000, then the service reads a seemingly random number per batch, not necessarily 500. It can read 500 messages at a time, or maybe fewer (for example 200 and 300, or 150 and 300 and 50), and so on.
P.S.: I have dug through a lot of information on the Internet and I still don't understand how to fix this problem, or whether it is even possible. Please share your opinion and a possible solution.
Thank you all in advance!