I have a Reactive Spring Boot application consuming messages from RabbitMQ and persisting them in a (MongoDB) repository:
@RabbitListener(...)
public void processMessage(Message message) {
    repository.persist(message).subscribe();
}
Assuming multiple messages arrive in a short period of time, this code could exhaust the configured connection pool to the database. If I received the messages as a Flux instead, I could concatMap() them into the db, or insert them in buckets of n documents.
That's why I tried to implement a bridge from the given RabbitMQ listener to a self-managed Flux:
@Component
public class QueueListenerController {

    private final MyMongoRepository repository;
    private final FluxProcessor<Message, Message> fluxProcessor;
    private final FluxSink<Message> fluxSink;

    public QueueListenerController(MyMongoRepository repository) {
        this.repository = repository;
        this.fluxProcessor = DirectProcessor.<Message>create().serialize();
        this.fluxSink = fluxProcessor.sink();
    }

    @PostConstruct
    private void postConstruct() {
        fluxProcessor.concatMap(repository::persist)
                     .subscribe();
    }

    @RabbitListener(bindings = @QueueBinding(
            value = @Queue(value = "my-queue", durable = "true", autoDelete = "false"),
            exchange = @Exchange(value = "amq.direct", durable = "true", autoDelete = "false")
    ))
    public void processMessage(Message message) {
        fluxSink.next(message);
    }
}
This works locally, and for a certain period of time, but after a while (I estimate 12-24 hours) it stops storing messages in the database, so I'm fairly sure I'm doing something wrong.
What would be the correct way of transforming incoming RabbitMQ messages into a Flux of messages?