
Our system receives messages telling it to fetch data from a remote service and then store that data in the database. Currently, it opens a separate database connection for each request to save the fetched data. We want to convert this into a process with multiple producers (fetching data from the remote service) and a single consumer persisting the data in the database. That way it will hold at most one connection for persisting data.

We are using Spring Boot with Reactor. We want a publisher that publishes all the data fetched from the remote service, which we can subscribe to and push into the database in batches of, say, 200 records.

For example, I am planning to use the following code to consume messages from an ActiveMQ queue:

    public Publisher<Message<RestoreMessage>> restoreMessagesSource() {
        return IntegrationFlows
            // listen on the "RestoreMessageQueue" ActiveMQ destination
            .from(Jms.messageDrivenChannelAdapter(this.connectionFactory)
                .destination(RestoreMessage.class.getSimpleName() + "Queue"))
            .channel(MessageChannels.queue())
            .log(LoggingHandler.Level.DEBUG)
            .log()
            // expose the flow as a Reactive Streams Publisher
            .toReactivePublisher();
    }

In this code, messages from the ActiveMQ queue are put into a reactive Publisher. This publisher is then subscribed to, and that is how we consume the messages from the queue.
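
Roughly, the subscription side looks like this (a simplified sketch; `handle(...)` is just a placeholder for our actual processing):

    Flux.from(restoreMessagesSource())   // bridge the Publisher into a Flux
        .map(Message::getPayload)        // unwrap the RestoreMessage payload
        .subscribe(this::handle);        // handle(RestoreMessage) is a placeholder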

In a similar fashion, we want the responses of all the remote API calls to be pushed to a single publisher, which we can then process in one subscriber in one place.
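
Something like the following sketch is what I have in mind (assuming Reactor 3.4+ for the `Sinks` API; `RemoteRecord`, `DatabaseWriter` and `saveBatch` are hypothetical placeholders for our real types, and the 200-record / 5-second batching values are just examples): every producer emits into one sink, and a single subscriber persists the batches, so only one database connection is held.

    import java.time.Duration;
    import java.util.List;

    import reactor.core.publisher.Sinks;

    public class RemoteResponseAggregator {

        // single sink that every producer pushes its fetched records into
        private final Sinks.Many<RemoteRecord> sink =
                Sinks.many().multicast().onBackpressureBuffer();

        // called by each producer after it has fetched data from the remote service
        public void publish(RemoteRecord record) {
            // Sinks do not serialize concurrent emissions, so retry briefly
            // when two producer threads collide
            while (sink.tryEmitNext(record) == Sinks.EmitResult.FAIL_NON_SERIALIZED) {
                Thread.onSpinWait();
            }
        }

        // the single consumer: batches of up to 200 records, one DB connection
        public void startConsumer(DatabaseWriter writer) {
            sink.asFlux()
                .bufferTimeout(200, Duration.ofSeconds(5)) // flush partial batches too
                .subscribe(writer::saveBatch);
        }

        // hypothetical stand-ins for the real domain model and persistence layer
        public interface DatabaseWriter { void saveBatch(List<RemoteRecord> batch); }
        public static class RemoteRecord { }
    }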

Munish Dhiman
  • May we see what you have so far and what doesn't fit your requirement - the place you would like to change. I'm confused a bit that you say `reactor` and show `spring-integration` tag. Thanks – Artem Bilan Jun 02 '19 at 22:52
  • @ArtemBilan, Thank you for your response. I have updated my question. I hope it is more clear now. – Munish Dhiman Jun 03 '19 at 04:30

1 Answer


Sounds like you are going to have several Publisher<Message<?>> instances and you want to consume them all in a single subscriber. For this you can use Flux.merge():

/**
 * Merge data from {@link Publisher} sequences contained in an array / vararg
 * into an interleaved merged sequence. Unlike {@link #concat(Publisher) concat},
 * sources are subscribed to eagerly.
 * <p>
 * <img class="marble" src="doc-files/marbles/mergeFixedSources.svg" alt="">
 * <p>
 * Note that merge is tailored to work with asynchronous sources or finite sources. When dealing with
 * an infinite source that doesn't already publish on a dedicated Scheduler, you must isolate that source
 * in its own Scheduler, as merge would otherwise attempt to drain it before subscribing to
 * another source.
 *
 * @param sources the array of {@link Publisher} sources to merge
 * @param <I> The source type of the data sequence
 *
 * @return a merged {@link Flux}
 */
@SafeVarargs
public static <I> Flux<I> merge(Publisher<? extends I>... sources) {

So, you are going to sink all your sources into one Flux and subscribe to that single Flux.

Pay attention to the note: .toReactivePublisher() indeed produces an infinite source, although, thanks to the Jms.messageDrivenChannelAdapter(), emission happens on its own thread from an executor in the listener container. So, either try it as is, or wrap each source in a Flux with a particular publishOn().
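
For example, something along these lines should work (a rough sketch: the two source parameters and the 200-record / 5-second batching values are just placeholders, and Schedulers.boundedElastic() requires Reactor 3.3 or later):

    import java.time.Duration;
    import java.util.List;

    import org.reactivestreams.Publisher;
    import reactor.core.publisher.Flux;
    import reactor.core.scheduler.Schedulers;

    public <T> Flux<List<T>> mergedBatches(Publisher<T> source1, Publisher<T> source2) {
        return Flux.merge(
                    // isolate each infinite source on its own Scheduler, as the note advises
                    Flux.from(source1).publishOn(Schedulers.boundedElastic()),
                    Flux.from(source2).publishOn(Schedulers.boundedElastic()))
                // batch up to 200 records, flushing a partial batch after 5 seconds
                .bufferTimeout(200, Duration.ofSeconds(5));
    }

Subscribing to the returned Flux in one place gives you that single consumer which persists each batch over the single database connection.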

Artem Bilan