I am using Reactive Kafka (reactor-kafka) for consuming events. Problem: I pushed 7 events to the topic, but the consumer consumed only 5 of them. (This happens only when the service is deployed on a server; it works fine in the local environment.) It has happened many times, and we have not been able to figure out the cause. I am new to reactive programming, so please also suggest better code practices.
private final List<KafkaReceiver<String, String>> kafkaReceiverList = new ArrayList<>();
private final List<Disposable> kafkaReceivers = new ArrayList<>();

@PostConstruct
public void createReceivers() {
    for (int i = 0; i < 5; i++) {
        // receiverOptions holds the consumer properties and topic subscription
        kafkaReceiverList.add(KafkaReceiver.create(receiverOptions));
    }
}
@EventListener(ApplicationStartedEvent.class)
public void startConsumers() {
    for (KafkaReceiver<String, String> receiver : kafkaReceiverList) {
        kafkaReceivers.add(receiver
                .receive()
                .log()
                .bufferTimeout(500, Duration.ofMillis(10)) // batch up to 500 records or until the timeout elapses
                .flatMap(this::processRecord) // input - List<ReceiverRecord<String, String>>
                .flatMap(this::commitRecord)  // input - List<ReceiverRecord<String, String>>
                .subscribe());
    }
}
public Flux<Void> commitRecord(List<ReceiverRecord<String, String>> records) {
    log.info(InfoMessageConstants.COMMIT_RECORD, records);
    records.forEach(record -> record.receiverOffset().commit().subscribe());
    return Flux.empty();
}
@PreDestroy
public void shutdown() {
    kafkaReceivers.forEach(disposable -> {
        try {
            disposable.dispose();
        } catch (Exception ex) {
            log.error("Error closing consumer: ", ex);
        }
    });
}
Why create a list of receivers?
To create consumers per partition, and to control the number of consumers and the number of partitions separately (a sketch of what this could look like is below).
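A minimal sketch of what "one receiver per partition" could look like with reactor-kafka, assuming manual partition assignment is used; the topic name "orders-topic" and the consumerProps map are placeholders, not values from the real service:

// Needs reactor.kafka.receiver.ReceiverOptions and org.apache.kafka.common.TopicPartition
private List<KafkaReceiver<String, String>> buildPartitionReceivers(Map<String, Object> consumerProps) {
    List<KafkaReceiver<String, String>> receivers = new ArrayList<>();
    for (int partition = 0; partition < 5; partition++) {
        // Pin each receiver to exactly one partition instead of relying on group rebalancing
        ReceiverOptions<String, String> options = ReceiverOptions.<String, String>create(consumerProps)
                .assignment(Collections.singleton(new TopicPartition("orders-topic", partition)));
        receivers.add(KafkaReceiver.create(options));
    }
    return receivers;
}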
Is it reproducible in the local environment?
No.
I am looking for the reason why some events are lost when I start the service with this consumer.
Steps to reproduce on a server:
1. Stop the consumer/service.
2. Push events to the topic (e.g. with the producer sketch below).
3. Start the consumer.
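For step 2, a minimal standalone sketch that pushes test events with reactor-kafka's KafkaSender; the bootstrap server, topic name, and payloads are assumptions and would need to match the real environment:

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import reactor.core.publisher.Flux;
import reactor.kafka.sender.KafkaSender;
import reactor.kafka.sender.SenderOptions;
import reactor.kafka.sender.SenderRecord;

public class TestEventProducer {
    public static void main(String[] args) {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

        KafkaSender<String, String> sender = KafkaSender.create(SenderOptions.create(props));

        // Send 7 events, mirroring the count in the question, while the consumer is stopped
        sender.send(Flux.range(1, 7)
                        .map(i -> SenderRecord.create(
                                new ProducerRecord<>("orders-topic", "key-" + i, "event-" + i), i)))
                .doOnError(e -> System.err.println("Send failed: " + e))
                .blockLast();

        sender.close();
    }
}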