
I am trying to configure Spring to send bad messages to a dead letter queue while using batch mode, but nothing ends up in the DLQ topic.

I use Spring Boot 2.5.3 and Spring Cloud 2020.0.3, which resolves spring-cloud-stream-binder-kafka-parent to version 3.1.3.

Here is application.properties:

spring.cloud.stream.bindings.input-in-0.consumer.batch-mode=true
spring.cloud.stream.bindings.input-in-0.content-type=text/plain
spring.cloud.stream.bindings.input-in-0.destination=topic4
spring.cloud.stream.bindings.input-in-0.group=batch4
spring.cloud.stream.bindings.input-in-0.consumer.concurrency=5 

Here is application and batch listener in functional programming model:

@SpringBootApplication
public class DemoKafkaBatchErrorsApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoKafkaBatchErrorsApplication.class, args);
    }

    @Bean
    public Consumer<List<byte[]>> input() {
        return messages -> {
            for (int i = 0; i < messages.size(); i++) {
                // Deliberately fail on the first record to exercise the error handler
                throw new BatchListenerFailedException("Demo: failed to process = ", i);
            }
        };
    }

    @Bean
    public RecoveringBatchErrorHandler batchErrorHandler(KafkaTemplate<String, byte[]> template) {
        DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(template);
        return new RecoveringBatchErrorHandler(recoverer, new FixedBackOff(2L, 10));
    }
}

Sending to topic:

./kafka-console-producer.sh --broker-list broker:9092 --topic topic4 < input.json

Reading from DLQ:

./kafka-console-consumer.sh --bootstrap-server broker:9092 --topic topic4_ERR --from-beginning --max-messages 100

After running the app, nothing appears in the DLQ topic, but the console shows many stack traces like:

Caused by: org.springframework.kafka.listener.BatchListenerFailedException: Demo: failed to process =  @-0
    at com.example.demokafkabatcherrors.DemoKafkaBatchErrorsApplication.lambda$input$0(DemoKafkaBatchErrorsApplication.java:29) ~[classes/:na]
    at org.springframework.cloud.function.context.catalog.SimpleFunctionRegistry$FunctionInvocationWrapper.invokeConsumer(SimpleFunctionRegistry.java:854) ~[spring-cloud-function-context-3.1.3.jar:3.1.3]
    at org.springframework.cloud.function.context.catalog.SimpleFunctionRegistry$FunctionInvocationWrapper.doApply(SimpleFunctionRegistry.java:643) ~[spring-cloud-function-context-3.1.3.jar:3.1.3]
    at org.springframework.cloud.function.context.catalog.SimpleFunctionRegistry$FunctionInvocationWrapper.apply(SimpleFunctionRegistry.java:489) ~[spring-cloud-function-context-3.1.3.jar:3.1.3]
    at org.springframework.cloud.stream.function.PartitionAwareFunctionWrapper.apply(PartitionAwareFunctionWrapper.java:77) ~[spring-cloud-stream-3.1.3.jar:3.1.3]
    at org.springframework.cloud.stream.function.FunctionConfiguration$FunctionWrapper.apply(FunctionConfiguration.java:727) ~[spring-cloud-stream-3.1.3.jar:3.1.3]
    at org.springframework.cloud.stream.function.FunctionConfiguration$FunctionToDestinationBinder$1.handleMessageInternal(FunctionConfiguration.java:560) ~[spring-cloud-stream-3.1.3.jar:3.1.3]
    at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:56) ~[spring-integration-core-5.5.2.jar:5.5.2]
    ... 27 common frames omitted

What am I doing wrong?

UPD: Following Gary's answer, I made these changes:

    @Bean
    public ListenerContainerCustomizer<AbstractMessageListenerContainer<?, ?>> customizer(BatchErrorHandler handler) {
        return ((container, destinationName, group) -> container.setBatchErrorHandler(handler));
    }

    @Bean
    public BatchErrorHandler batchErrorHandler(KafkaOperations<String, byte[]> kafkaOperations) {
        DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(kafkaOperations,
                (cr, e) -> new TopicPartition(cr.topic() + "_ERR", 0));
        return new RecoveringBatchErrorHandler(recoverer, new FixedBackOff(2L, 3));
    }

and everything works like a charm.

Alex

1 Answer

When using spring-cloud-stream, the listener container is not created by Boot's container factory; it is created by the binder, so an error handler @Bean is not automatically wired in.

You have to configure a ListenerContainerCustomizer @Bean instead.
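A minimal sketch of that wiring, mirroring the approach in the question's update (the `_ERR` topic suffix, the fixed partition 0, and the back-off values are illustrative choices, not requirements):

```java
// Sketch, assuming Spring Cloud Stream Kafka binder 3.1.x / spring-kafka 2.7.x.
@Bean
public ListenerContainerCustomizer<AbstractMessageListenerContainer<?, ?>> customizer(
        BatchErrorHandler handler) {
    // The binder invokes this customizer for every listener container it creates,
    // which is the hook for attaching the batch error handler.
    return (container, destinationName, group) -> container.setBatchErrorHandler(handler);
}

@Bean
public BatchErrorHandler batchErrorHandler(KafkaOperations<String, byte[]> kafkaOperations) {
    // Route the failed record to <original topic>_ERR, partition 0 (illustrative)
    DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(kafkaOperations,
            (cr, e) -> new TopicPartition(cr.topic() + "_ERR", 0));
    // Retry briefly before publishing to the DLQ
    return new RecoveringBatchErrorHandler(recoverer, new FixedBackOff(2L, 3));
}
```

Because `BatchListenerFailedException` carries the index of the failed record, the handler can commit the records before it, send the failing one to the DLQ, and retry the remainder.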

Example here: Can I apply graceful shutdown when using Spring Cloud Stream Kafka 3.0.3.RELEASE?

Gary Russell