
I am writing a Kafka Streams application using the Spring Cloud Stream Kafka Streams binder.

When the application publishes a message to the output topic, an error may occur, such as a serialization error or a network error.

In this code -

@Bean
public Function<KStream<Object, String>, KStream<Object, String>> process() {
    // Upper-case each value and split it on non-word characters,
    // emitting one output record per word
    return input -> input.flatMapValues(v -> Arrays.asList(v.toUpperCase().split("\\W+")));
}

If an error occurs while producing the message back to the output topic, how can it be handled? Is there any mechanism in the Kafka Streams binder other than RetryTemplate?
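For reference, the per-record transformation inside process() can be exercised in isolation as plain Java (the class and method names below are illustrative, not part of the application):

```java
import java.util.Arrays;
import java.util.List;

public class TransformDemo {
    // Same logic as the flatMapValues lambda: uppercase the value,
    // then split it on runs of non-word characters
    static List<String> transform(String v) {
        return Arrays.asList(v.toUpperCase().split("\\W+"));
    }

    public static void main(String[] args) {
        System.out.println(transform("hello kafka streams")); // [HELLO, KAFKA, STREAMS]
    }
}
```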

rph
shreyas.k
  • @sobychacko can you please help? – shreyas.k Jun 08 '20 at 03:45
  • Hi, there may be a few options to get around that in the binder. You can try using `StreamsBuilderFactoryBeanCustomizer` and provide various hooks to handle producer exceptions. See this blog: https://spring.io/blog/2019/12/06/stream-processing-with-spring-cloud-stream-and-apache-kafka-streams-part-5-application-customizations – sobychacko Jun 08 '20 at 13:44
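For send-time failures specifically (serialization or broker errors while writing to the output topic), Kafka Streams itself exposes the ProductionExceptionHandler hook, which the binder can pick up via its configuration. A minimal sketch, assuming a Kafka Streams version with the two-argument handle signature (the class name here is illustrative):

```java
import java.util.Map;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.streams.errors.ProductionExceptionHandler;

public class ContinueOnSendErrorHandler implements ProductionExceptionHandler {

    @Override
    public ProductionExceptionHandlerResponse handle(ProducerRecord<byte[], byte[]> record,
                                                     Exception exception) {
        // Log and skip the failed record instead of shutting down the stream thread;
        // return FAIL instead if the record must not be lost
        return ProductionExceptionHandlerResponse.CONTINUE;
    }

    @Override
    public void configure(Map<String, ?> configs) {
        // no configuration needed for this sketch
    }
}
```

The handler would then be registered through the Streams configuration, e.g. by setting `default.production.exception.handler` to the class name in the binder's configuration properties.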

1 Answer


See the Spring for Apache Kafka documentation.

When a deserializer fails to deserialize a message, Spring has no way to handle the problem, because it occurs before the poll() returns. To solve this problem, version 2.2 introduced the ErrorHandlingDeserializer2. This deserializer delegates to a real deserializer (key or value). If the delegate fails to deserialize the record content, the ErrorHandlingDeserializer2 returns a null value and a DeserializationException in a header that contains the cause and the raw bytes. When you use a record-level MessageListener, if the ConsumerRecord contains a DeserializationException header for either the key or value, the container’s ErrorHandler is called with the failed ConsumerRecord. The record is not passed to the listener.
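As a sketch of how the delegating deserializer described above might be wired up, assuming a standard Spring Boot consumer configuration (prefixes may differ when configuring through the Kafka Streams binder):

```properties
# Install the error-handling wrapper as the value deserializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2
# Delegate actual deserialization to the real deserializer
spring.kafka.consumer.properties.spring.deserializer.value.delegate.class=org.apache.kafka.common.serialization.StringDeserializer
```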

Gary Russell