
I am trying to create a stream in Spring Cloud Data Flow with

  1. One source, i.e. order-source. The Order message will be published to the RabbitMQ topic/queue.

  2. Two parallel processors, i.e. product-processor and shipment-processor. Both processors subscribe to the RabbitMQ topic/queue, receive the Order message, process it individually, update the Order, and publish the updated Order message back to the RabbitMQ topic/queue.

  3. One sink, i.e. payment-sink. The sink subscribes to the RabbitMQ topic/queue, receives the Order message, and completes the payment process based on it.

I tried the following command and deployed the stream:

stream create --name order-to-payment --definition 'order-source | product-processor | shipment-processor | payment-sink'

But graphically in Spring Cloud Data Flow it looks as follows:

Stream Diagram

But I am looking for something like below:

Parallel processors

Is it possible to achieve this? Spring Cloud Data Flow does not allow me to connect two processors to one source, nor to connect two processors to one sink, graphically.

Thanks, David.


You can have a DAG with Data Flow using named destinations; please check it here: http://docs.spring.io/spring-cloud-dataflow/docs/1.2.2.RELEASE/reference/htmlsingle/#spring-cloud-dataflow-stream-advanced

You will need to send your messages via named destinations and compose the flow from them. Let me know if you have trouble going through the docs and I can post an example here.
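As a sketch, the topology you described could be composed out of three streams joined by a named destination (the destination name `:orders` here is a placeholder; pick any name you like):

```
dataflow:> stream create --name custom-order-source --definition "order-source > :orders"
dataflow:> stream create --name product-to-payment --definition ":orders > product-processor | payment-sink"
dataflow:> stream create --name shipment-to-payment --definition ":orders > shipment-processor | payment-sink"
```

The first stream publishes the Order messages to the `:orders` destination, and the other two streams each consume from that destination, run their own processor, and feed a payment-sink instance. Note that this deploys payment-sink in each branch; if you need a single shared sink, have both processors publish to a second named destination and create one more stream that reads from it into payment-sink.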

Vinicius Carvalho
  • 3,994
  • 4
  • 23
  • 29
  • Tried with named destinations. 1. stream create --definition "order-source > :customSource" --name custom-order-source 2. stream create --definition ":customSource > shipment-processor | payment-sink" --name custom-shipment-to-payment 3. stream create --definition ":customSource > product-processor | payment-sink" --name custom-product-to-payment. Seems like it is working fine. Is this the correct way? I am trying an existing example with Spring Cloud Data Flow. The example without Spring cloud data flow is from https://dzone.com/articles/event-driven-microservices-using-spring-cloud-stre – David Jul 17 '17 at 00:10
  • Is this the correct way? I am trying to understand the concept behind it. Are named destinations just a way to break the issue I was facing into different streams, which was not possible with the single stream I was trying? Please correct me if the same can be done a different way using named destinations. – David Jul 17 '17 at 00:34
  • 2
    Yes, from the docs : " named destinations may be used as a way to combine the output from multiple streams or for multiple consumers to share the output from a single stream." So yes, we use named destination to allow the type of parallelism you wanted. Bear in mind that graphically dragging multiple outputs from a source would not allow us to know what were you trying to do: Are you partitioning? Splitting? Multiplexing ... So combining streams with named destinations gives you that flexibility :) – Vinicius Carvalho Jul 17 '17 at 01:31
  • @ViniciusCarvalho and David, could you please give me an example or explain how the payment service knows that shipment and product processing are done? Will the payment service receive 2 messages or just 1 message when those 2 processors are done? – Chi Dov Oct 06 '17 at 19:03
  • It depends on your logic; this is a typical scatter-gather pattern. Payment needs to know when to trigger a done event, either by setting expectations on how many messages it should receive from each source, or by setting a temporal window. – Vinicius Carvalho Oct 10 '17 at 12:47
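To illustrate the first option from the comment above (counting expected messages), here is a minimal, hypothetical sketch in plain Java of correlation logic a payment sink could use. The class name `OrderCorrelator` and the method names are made up for illustration; the Spring Cloud Stream wiring (binding the handler to the input channel) is omitted:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical scatter-gather correlator: the payment step fires only
// after BOTH product-processor and shipment-processor have reported
// a message for the same order.
public class OrderCorrelator {

    private static final int EXPECTED_MESSAGES = 2; // one per processor branch

    private final Map<String, Integer> received = new ConcurrentHashMap<>();

    // Called for every processed Order message arriving at the sink.
    // Returns true exactly once per order: when the expected number of
    // messages has arrived and the payment can be triggered.
    public boolean onOrderMessage(String orderId) {
        int count = received.merge(orderId, 1, Integer::sum);
        if (count == EXPECTED_MESSAGES) {
            received.remove(orderId); // clean up the completed order
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        OrderCorrelator correlator = new OrderCorrelator();
        // First message (say, from product-processor): not complete yet.
        System.out.println(correlator.onOrderMessage("order-42"));
        // Second message (from shipment-processor): payment can proceed.
        System.out.println(correlator.onOrderMessage("order-42"));
    }
}
```

A production version would also need the temporal-window fallback Vinicius mentions (e.g. expiring entries that never receive their second message), which this sketch omits.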