
I have written a Spring Cloud Stream message producer that successfully posts the payload to a topic on my local bootstrap server. I now want to post the payload to a topic that I have created on Confluent Cloud. What configuration changes do I need to make? Below is the configuration for localhost, which is working.


```yaml
spring:
  cloud:
    stream:
      bindings:
        output:
          destination: ordersTopic
          content-type: application/json
      kafka:
        binder:
          zkNodes: localhost
          brokers: localhost
      default-binder: kafka
```
OneCricketeer
zilcuanu

1 Answer


You no longer need zkNodes; modern Kafka clients talk directly to the brokers and do not use ZooKeeper. Which version of spring-cloud-stream are you using? That property (zkNodes) no longer exists in recent versions.

You need to set the brokers property to match the host/port configured as the advertised listeners in the broker's server.properties file.
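As a minimal sketch, assuming the broker advertises the default listener `PLAINTEXT://localhost:9092` in its server.properties, the binder would point at that same host:port:

```yaml
spring:
  cloud:
    stream:
      kafka:
        binder:
          # must match the host:port from advertised.listeners in server.properties
          brokers: localhost:9092
```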

Gary Russell
  • What about the API access key and secret key? Is there a document for connecting to Confluent.io? I am using version 3.0.9.RELEASE of spring-cloud-stream – zilcuanu Dec 22 '20 at 20:44
  • That's outside of Spring's scope; sorry; we just delegate to the kafka-clients; refer to the Kafka and Confluent documentation. – Gary Russell Dec 22 '20 at 20:46
  • I found some docs [here](https://docs.confluent.io/5.5.0/cloud/using/config-client.html); it looks like you add them as properties. You can set arbitrary Kafka properties via the `...kafka.binder.configuration` property; e.g. `...kafka.binder.configuration.foo.bar: baz`. There is also a `...jaas` property, in case you need that. – Gary Russell Dec 22 '20 at 20:56
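Putting those pieces together, a Confluent Cloud configuration might look like the sketch below. The bootstrap endpoint, API key, and API secret are placeholders (follow the pattern in Confluent's client-configuration docs); substitute your own cluster's values:

```yaml
spring:
  cloud:
    stream:
      bindings:
        output:
          destination: ordersTopic
          content-type: application/json
      kafka:
        binder:
          # placeholder Confluent Cloud bootstrap endpoint; copy yours from the cluster settings
          brokers: pkc-xxxxx.us-east-1.aws.confluent.cloud:9092
          # arbitrary kafka-clients properties go under binder.configuration
          configuration:
            security.protocol: SASL_SSL
            sasl.mechanism: PLAIN
            # Confluent Cloud API key/secret supplied as SASL/PLAIN credentials
            sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username="<API_KEY>" password="<API_SECRET>";
      default-binder: kafka
```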