We have a scenario where we want to consume data from Kafka topics on cluster#1, but create the KTable's internal topics (repartition and changelog) on cluster#2.
Channel binding configuration -
spring.cloud.stream.bindings.member.destination: member
spring.cloud.stream.bindings.member.consumer.useNativeDecoding: true
spring.cloud.stream.bindings.member.consumer.headerMode: raw
spring.cloud.stream.kafka.streams.bindings.member.consumer.keySerde: org.apache.kafka.common.serialization.Serdes$StringSerde
spring.cloud.stream.kafka.streams.bindings.member.consumer.valueSerde: io.confluent.kafka.streams.serdes.avro.GenericAvroSerde
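For completeness, the GenericAvroSerde also needs a Schema Registry URL, which we pass through the Kafka Streams binder configuration (the host below is a placeholder for our actual registry on cluster#1):

```properties
# placeholder URL - points at the Schema Registry used by cluster#1
spring.cloud.stream.kafka.streams.binder.configuration.schema.registry.url: http://cluster1-schema-registry:8081
```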
Creating the KTable -
protected KTable<String, GenericRecord> createKTable(String field, KStream<String, GenericRecord> stream, String stateStore) {
    return stream
            // re-key every record to the given field name; the key change forces a repartition topic
            .map((key, genericRecord) -> KeyValue.pair(field, genericRecord))
            .groupByKey()
            // keep only the latest value per key, materialized in the named state store (backed by a changelog topic)
            .reduce((oldVal, newVal) -> newVal, Materialized.as(stateStore));
}
So the member topic is on cluster#1, but we want the KTable topics below to be created on a different cluster. We are not sure how to configure two different Kafka binders for this case -
application-member-store-repartition
application-member-store-changelog
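For context, the standard multi-binder syntax in Spring Cloud Stream lets us declare two named binders with different broker lists and point a binding at one of them, roughly as sketched below (binder names and broker addresses are placeholders). However, this routes the whole binding, including its internal topics, to a single cluster, and it is not obvious how to split the source topic and the repartition/changelog topics across the two binders:

```properties
# hypothetical binder names; broker addresses are placeholders
spring.cloud.stream.binders.cluster1.type: kstream
spring.cloud.stream.binders.cluster1.environment.spring.cloud.stream.kafka.streams.binder.brokers: cluster1-broker:9092
spring.cloud.stream.binders.cluster2.type: kstream
spring.cloud.stream.binders.cluster2.environment.spring.cloud.stream.kafka.streams.binder.brokers: cluster2-broker:9092
# the member binding consumes from cluster#1
spring.cloud.stream.bindings.member.binder: cluster1
```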