
I have a Kafka consumer. If the consumer fails to process a message, I need to send that message to a dead-letter topic. I am using Spring Cloud Stream with the Kafka binder, and I enabled the DLQ in my configuration like this:

spring:
  cloud:
    stream:
      function:
        definition: myConsumer
      kafka:
        binder:
          consumer-properties:
            auto.offset.reset: earliest
        bindings:
          myConsumer-in-0:
            consumer:
              enableDlq: true
              dlqName: api-consumer-count-request-DLT # this topic is in a different cluster, with its own broker list
      bindings:
        myConsumer-in-0:
          binder: myBinder
          destination: api-consumer
          group: count-execution
      binders:
        myBinder:
          type: kafka
          environment:
            spring:
              cloud:
                stream:
                  kafka:
                    binder:
                      brokers: {regular broker list for consumer}

But my regular consumer topic is in a different cluster from the DLQ topic. Is it possible to achieve this? If yes, can you guide me through the configuration?

Pradeep Charan

1 Answer


No; this is not currently possible; the binder will only publish to the same Kafka cluster.
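Since the binder's DLQ support only writes to the binder's own cluster, one workaround (a sketch, not taken from the linked answer; the bean names, the `dlq-broker1:9092` broker list, and the `process` method are illustrative placeholders) is to leave `enableDlq` off and publish failed records yourself through a `KafkaTemplate` whose producer factory is configured with the DLQ cluster's brokers:

```java
import java.util.Map;
import java.util.function.Consumer;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.ByteArraySerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.messaging.Message;

@Configuration
public class DlqConfig {

    // Producer factory aimed at the DLQ cluster, NOT the cluster the
    // consumer binding reads from. "dlq-broker1:9092" is a placeholder
    // for the DLQ cluster's broker list.
    @Bean
    public KafkaTemplate<byte[], byte[]> dlqTemplate() {
        Map<String, Object> props = Map.of(
                ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "dlq-broker1:9092",
                ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class,
                ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);
        return new KafkaTemplate<>(new DefaultKafkaProducerFactory<>(props));
    }

    // The functional consumer: on failure, forward the raw payload to the
    // DLQ topic on the other cluster instead of relying on enableDlq.
    @Bean
    public Consumer<Message<byte[]>> myConsumer(KafkaTemplate<byte[], byte[]> dlqTemplate) {
        return message -> {
            try {
                process(message.getPayload()); // your business logic (placeholder)
            } catch (Exception e) {
                dlqTemplate.send("api-consumer-count-request-DLT", message.getPayload());
            }
        };
    }

    private void process(byte[] payload) {
        // placeholder for the real processing logic
    }
}
```

Because the record is republished manually, headers such as the original topic, partition, offset, and the exception message are not added automatically the way the binder's DLQ support does; copy any you need onto the outgoing message yourself.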

Gary Russell
  • See https://stackoverflow.com/questions/75066568/spring-cloud-stream-binder-kafka-dead-letter-topic-in-different-cluster/75071829#75071829 – Gary Russell Jan 10 '23 at 14:59