
We have created a Kafka Streams application which consumes from a source topic and writes to a destination topic.

Now we have a requirement for another set of source and destination topics, and the core logic written in the above service would be the same.

So our thought was to implement another Kafka Streams topology in the same microservice so that the core logic can be reused efficiently. But we are not sure if we can do that with Kafka Streams, and if we can, how the application.yml changes.
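For reference, the Spring Cloud Stream functional model lets one service expose several function beans, each with its own in/out binding pair. A minimal sketch of how the yml might grow, assuming a second bean named `process2` and new environment variables `NEW_SOURCE_TOPIC`/`NEW_DEST_TOPIC` (all illustrative names, not from the original config):

```yaml
spring:
  cloud:
    function:
      # process2 is a hypothetical second function bean reusing the core logic
      definition: process;process2
    stream:
      bindings:
        process2-in-0:
          destination: ${NEW_SOURCE_TOPIC}   # assumed env var for the new source topic
        process2-out-0:
          destination: ${NEW_DEST_TOPIC}     # assumed env var for the new destination topic
```

Each function listed in `definition` gets its own `<name>-in-0`/`<name>-out-0` bindings alongside the existing `process` ones.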

Current application.yml:

spring:
  cloud:
    stream:
      function:
        definition: process
      bindings:
        process-in-0:
          destination: ${X_TOPIC}
          consumer:
            max-attempts: 1
        process-out-0:
          destination: ${Y_TOPIC}
      kafka:
        streams:
          binder:
            application-id: AA1
            configuration:
              processing.guarantee: exactly_once
              commit.interval.ms: 1000
              security.protocol: SSL
              ssl:
                truststore.location: ${TRUSTSTORE_LOCATION}
                keystore.location: ${KEYSTORE_LOCATION}
                keystore.password: ${KEYSTORE_PASSWORD:RT}
                truststore.password: ${TRUSTSTORE_PASSWORD:TY}
                truststore.type: JKS
                keystore.type: JKS
            auto-create-topics: false
        binder:
          brokers: ${BROKERS}
Coderz
  • Nothing in the JVM is preventing two KStream topologies. But I'm not sure of the answer to this via cloud-streams, since the binder config is not a list of items – OneCricketeer Feb 11 '21 at 20:32
  • Thanks OneCricketeer. That's what I am not sure of – Coderz Feb 12 '21 at 00:13
  • Does this section help? https://cloud.spring.io/spring-cloud-stream-binder-kafka/spring-cloud-stream-binder-kafka.html#_multiple_input_bindings Specifically, you can have `process-in-1` and `process-out-1` bindings – OneCricketeer Feb 13 '21 at 00:06
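If the second topic pair is handled by a separate function bean, the Kafka Streams binder also supports a per-function application id, so the two topologies keep distinct consumer groups and state directories. A sketch of that property, assuming a second function named `process2` and the id `AA2` (both illustrative, not from the question):

```yaml
spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            functions:
              process:
                applicationId: AA1   # existing topology keeps its current id
              process2:              # hypothetical second function
                applicationId: AA2   # assumed distinct id for the new topology
```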

0 Answers