We have created a Kafka Streams application that consumes from a source topic and writes to a destination topic.
Now we have a requirement for another pair of source and destination topics, where the core logic written in the above service stays the same.
So our thought was to implement another Kafka Streams topology in the same microservice so that the core logic can be re-used efficiently. However, we are not sure whether this is possible with Kafka Streams, and if it is, how the application.yml would change. A rough sketch of what we were imagining is included after the current yml below.
Current application.yml-
cloud:
  stream:
    function:
      definition: process
    bindings:
      process-in-0:
        destination: ${X_TOPIC}
        consumer:
          max-attempts: 1
      process-out-0:
        destination: ${Y_TOPIC}
    kafka:
      streams:
        binder:
          application-id: AA1
          configuration:
            processing.guarantee: exactly_once
            commit.interval.ms: 1000
            security.protocol: SSL
            ssl:
              truststore.location: ${TRUSTSTORE_LOCATION}
              keystore.location: ${KEYSTORE_LOCATION}
              keystore.password: ${KEYSTORE_PASSWORD:RT}
              truststore.password: ${TRUSTSTORE_PASSWORD:TY}
              truststore.type: JKS
              keystore.type: JKS
          auto-create-topics: false
      binder:
        brokers: ${BROKERS}
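
For reference, this is roughly what we were imagining the yml would look like if a second function can be declared alongside the first one. The function name process2, the topic variables X2_TOPIC/Y2_TOPIC, the application id AA2, and the per-function application-id block are just our guesses and not something we have tried yet:

cloud:
  stream:
    function:
      # two function beans, assuming they can be listed with a semicolon
      definition: process;process2
    bindings:
      process-in-0:
        destination: ${X_TOPIC}
      process-out-0:
        destination: ${Y_TOPIC}
      # hypothetical second set of bindings for the new topics
      process2-in-0:
        destination: ${X2_TOPIC}
      process2-out-0:
        destination: ${Y2_TOPIC}
    kafka:
      streams:
        binder:
          # guessing each function would need its own application id
          functions:
            process:
              application-id: AA1
            process2:
              application-id: AA2

Is something along these lines supported by the Kafka Streams binder, or is there a different recommended way to reuse the same core logic for a second topology?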