
I have a source that sends messages over the default output channel to a processor in the stream. Now I want to also send failure messages over a different channel.

I figured I should create a bindable interface that extends from Source and adds the extra channel using @Output. How do I ensure that SCDF actually creates a Kafka topic for this channel? In other words, what would the stream definition look like?

E.g. something along the lines of

source | processor | sink
source > error-sink

With source | processor using the regular output channel/Kafka topic, and source > error-sink using a different channel/topic.
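For concreteness, the kind of interface I have in mind might look like the sketch below; the interface name and the channel name `errorOutput` are made up:

```java
import org.springframework.cloud.stream.annotation.Output;
import org.springframework.cloud.stream.messaging.Source;
import org.springframework.messaging.MessageChannel;

// Hypothetical bindable interface: the regular Source output plus an
// extra @Output channel for failure messages. It would be activated in
// the app via @EnableBinding(SourceWithErrors.class).
public interface SourceWithErrors extends Source {

    String ERROR_OUTPUT = "errorOutput";

    @Output(ERROR_OUTPUT)
    MessageChannel errorOutput();
}
```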

Remon Sinnema

1 Answer


If the requirement is to keep track of the error messages for downstream processing, you could use the OOTB DLQ mechanics of Spring Cloud Stream, which are supported for both Rabbit and Kafka. You can enable DLQ in Spring Cloud Data Flow (SCDF) either as a global setting or on a per-stream basis.

If you'd still like to define your custom channels to handle the messages differently, you'd have to create a custom interface similar to this sample.

While deploying the stream in SCDF, you can then override the destinations between producer and consumer via the `spring.cloud.stream.kafka.bindings.<channelName>.producer` and `spring.cloud.stream.kafka.bindings.<channelName>.consumer` binding properties, respectively.

EDIT:

Though there's the above approach, I learned about a much simpler solution from Spring Cloud Stream lead (@marius-bogoevici).

There is already a default error channel available for use, backed by Spring Integration.

With this, in your app, you could send custom messages to the default error channel by injecting it via `@Autowired @Qualifier("errorChannel")`. In fact, this support is also available in all the OOTB applications.
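A sketch of what that injection might look like in a component of your app; the class name, method, and payload are assumptions for illustration:

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Component;

// Hypothetical component that publishes custom failure messages to
// Spring Integration's default "errorChannel".
@Component
public class ErrorPublisher {

    private final MessageChannel errorChannel;

    @Autowired
    public ErrorPublisher(@Qualifier("errorChannel") MessageChannel errorChannel) {
        this.errorChannel = errorChannel;
    }

    public void reportFailure(String reason) {
        errorChannel.send(MessageBuilder.withPayload(reason).build());
    }
}
```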

You could then override the destination of this error channel via `spring.cloud.stream.bindings.error.destination=errorchannel-test`. In SCDF, you'd pass this at the time of stream deployment via `--properties`.

For example:

stream create foo --definition "mysource | log"

stream deploy foo --properties "app.mysource.spring.cloud.stream.bindings.error.destination=errorchannel-test"

Sabby Anandan
  • Thanks, Sabby. I do want to process the error messages downstream, but not by the same processor as the normal messages. Also, the error messages are custom messages. So it looks like DLQ is not a good fit, right? The sample is helpful for connecting Java code to the channels. The part I'm struggling with, however, is how to let SCDF connect the channels to Kafka topics. Do I simply create another stream from source to error processor? – Remon Sinnema Mar 07 '17 at 19:52
  • It turns out that you don't need to do anything in the stream definition to make this happen. In the stream apps, just use `@EnableBinding` with an interface that adds another channel, as Sabby suggested. For the source, it has to be a `MessageChannel`. For the processor/sink, it has to be a `SubscribableChannel`. Spring Cloud Stream will take care of mapping the channels to Kafka topics; all you need to do is make sure the source and processor/sink use the same name for the channel. There is no need to override the destination via properties. – Remon Sinnema Mar 09 '17 at 09:22
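The setup described in that comment might be sketched like this; the channel name `errors` is an assumption, and the key point is that both sides declare the same name so they bind to the same Kafka topic:

```java
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.Output;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.SubscribableChannel;

// Source side: the extra output channel is declared as a MessageChannel.
interface ErrorsOut {
    @Output("errors")
    MessageChannel errors();
}

// Processor/sink side: the same channel name, declared as a SubscribableChannel.
interface ErrorsIn {
    @Input("errors")
    SubscribableChannel errors();
}
```

Each interface would then be referenced from `@EnableBinding` in the respective app, and Spring Cloud Stream maps the channels to Kafka topics without any destination overrides.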