
I am using the Spring Cloud Stream Kafka binder to consume messages from Kafka. I am able to make my sample work with a single Kafka binder, as below:

spring:
  cloud:
    stream:
      kafka:
        binder:
          consumer-properties: {enable.auto.commit: true}
          auto-create-topics: false
          brokers: <broker url>
      bindings:
        consumer:
          destination: some-topic
          group: testconsumergroup
          consumer:
            concurrency: 1
            valueSerde: JsonSerde
        producer:
          destination: some-other-topic
          producer:
            valueSerde: JsonSerde

Note that both bindings point to the same Kafka broker here. However, I have a situation where I need to publish to a topic on one Kafka cluster and also consume from a topic on a different Kafka cluster. How should I change my configuration to be able to bind to different Kafka clusters?

I tried something like this

spring:
  cloud:
    stream:
      binders:
        defaultbinder:
          type: kafka
          environment:
            spring.cloud.stream.kafka.streams.binder.brokers: <cluster1-brokers>
        kafka1:
          type: kafka
          environment:
            spring.cloud.stream.kafka.streams.binder.brokers: <cluster2-brokers>
      bindings:
        consumer:
          binder: kafka1
          destination: some-topic
          group: testconsumergroup
          consumer:
            concurrency: 1
            valueSerde: JsonSerde
        producer:
          binder: defaultbinder
          destination: some-topic
          producer:
            valueSerde: JsonSerde      
      kafka:
        binder:
          consumer-properties: {enable.auto.commit: true}
          auto-create-topics: false
          brokers: <cluster1-brokers>

and

spring:
  cloud:
    stream:
      binders:
        defaultbinder:
          type: kafka
          environment:
            spring.cloud.stream.kafka.streams.binder.brokers: <cluster1-brokers>
        kafka1:
          type: kafka
          environment:
            spring.cloud.stream.kafka.streams.binder.brokers: <cluster2-brokers>
      kafka:
        bindings:
          consumer:
            binder: kafka1
            destination: some-topic
            group: testconsumergroup
            consumer:
              concurrency: 1
              valueSerde: JsonSerde
          producer:
            binder: defaultbinder
            destination: some-topic
            producer:
              valueSerde: JsonSerde      
        binder:
          consumer-properties: {enable.auto.commit: true}
          auto-create-topics: false
          brokers: <cluster1-brokers>

Neither of them seems to work. The first configuration appears to be invalid, and for the second configuration I get the error below:

Caused by: java.lang.IllegalStateException: A default binder has been requested, but there is more than one binder available for 'org.springframework.cloud.stream.messaging.DirectWithAttributesChannel' : kafka1,defaultbinder, and no default binder has been set.

I am using the dependency 'org.springframework.cloud:spring-cloud-starter-stream-kafka:3.0.1.RELEASE' and Spring Boot 2.2.6

Please let me know how to configure bindings to multiple Kafka clusters using Spring Cloud Stream.
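For reference, my understanding from the multi-binder documentation is that each binder gets its own `environment` (using the message-channel binder property `spring.cloud.stream.kafka.binder.brokers`, not the Kafka Streams one), and that when more than one binder exists a default must be named explicitly via `spring.cloud.stream.default-binder`. I am not certain this is exactly right, but roughly (broker addresses are placeholders):

```yaml
spring:
  cloud:
    stream:
      default-binder: kafka1
      binders:
        kafka1:
          type: kafka
          environment:
            spring.cloud.stream.kafka.binder.brokers: <cluster1-brokers>
        kafka2:
          type: kafka
          environment:
            spring.cloud.stream.kafka.binder.brokers: <cluster2-brokers>
      bindings:
        consumer:
          binder: kafka2
          destination: some-topic
          group: testconsumergroup
        producer:
          binder: kafka1
          destination: some-other-topic
```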

Update

I tried the configuration below:

spring:
  cloud:
    stream:
      binders:
        kafka2:
          type: kafka
          environment:
            spring.cloud.stream.kafka.binder.brokers: <cluster2-brokers>
        kafka1:
          type: kafka
          environment:
            spring.cloud.stream.kafka.binder.brokers: <cluster1-brokers>
      bindings:
        consumer:
          destination: <some-topic>
          binder: kafka1
          group: testconsumergroup
          content-type: application/json
          nativeEncoding: true
          consumer:
            concurrency: 1
            valueSerde: JsonSerde
        producer:
          destination: some-topic
          binder: kafka2
          contentType: application/json
          nativeEncoding: true
          producer:
            valueSerde: JsonSerde

The MessageStreams interface and the binding configuration are as follows:

public interface MessageStreams {
  String PRODUCER = "producer";
  String CONSUMER = "consumer";

  @Output(PRODUCER)
  MessageChannel producerChannel();

  @Input(CONSUMER)
  SubscribableChannel consumerChannel();
}

@EnableBinding(MessageStreams.class)
public class EventHubStreamsConfiguration {
}
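For context, my understanding is that the keys under `spring.cloud.stream.bindings` must match the channel names declared via `@Input`/`@Output`, which is why the binding keys above are named `producer` and `consumer`:

```yaml
spring:
  cloud:
    stream:
      bindings:
        producer:   # matches @Output(MessageStreams.PRODUCER)
          destination: some-topic
        consumer:   # matches @Input(MessageStreams.CONSUMER)
          destination: some-topic
```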

My producer class looks like this:

@Component
@Slf4j
public class EventPublisher {
  private final MessageStreams messageStreams;

  public EventPublisher(MessageStreams messageStreams) {
    this.messageStreams = messageStreams;
  }

  public boolean publish(CustomMessage event) {
    MessageChannel messageChannel = getChannel();
    Message<CustomMessage> message = MessageBuilder.withPayload(event).build();
    return messageChannel.send(message);
  }

  protected MessageChannel getChannel() {
    return messageStreams.producerChannel();
  }
}

And the consumer class looks like this:

@Component
@Slf4j
public class EventHandler {
  private final MessageStreams messageStreams;

  public EventHandler(MessageStreams messageStreams) {
    this.messageStreams = messageStreams;
  }

  @StreamListener(MessageStreams.CONSUMER)
  public void handleEvent(Message<CustomMessage> message) throws Exception {
    // process the event
  }

  @ServiceActivator(inputChannel = "some-topic.testconsumergroup.errors")
  protected void handleError(ErrorMessage errorMessage) throws Exception {
    // handle the error
  }
}

I get the error below when trying to publish and consume messages from my test:

Dispatcher has no subscribers for channel 'application.producer'.; nested exception is org.springframework.integration.MessageDispatchingException: Dispatcher has no subscribers, failedMessage=GenericMessage [payload=byte[104], headers={contentType=application/json, timestamp=1593517340422}]

Am I missing anything? With a single cluster I am able to publish and consume messages; the issue only occurs with multiple cluster bindings.

    Tried this option? https://stackoverflow.com/a/60218943/1927543 – Karthikeyan Jun 28 '20 at 12:30
  • Did you take a look at this sample? https://github.com/spring-cloud/spring-cloud-stream-samples/tree/master/multi-binder-samples/multi-binder-two-kafka-clusters – sobychacko Jun 29 '20 at 17:05
