I need the headers in every topic for object mapping purposes.
Currently, when I add a header to a record in a processor, the header is present in the target output topic, but not in the changelog topics used by state stores.
How can the…
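For context, adding a header in a processor typically looks like the sketch below (Kafka Streams Processor API; the header name and value are placeholders, not from the question). Headers set this way travel with the record to the sink topic but, as described above, are not written to state-store changelogs.

```java
import org.apache.kafka.streams.processor.api.Processor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;

// Hypothetical processor that stamps a header before forwarding downstream.
public class HeaderStampingProcessor implements Processor<String, String, String, String> {

    private ProcessorContext<String, String> context;

    @Override
    public void init(ProcessorContext<String, String> context) {
        this.context = context;
    }

    @Override
    public void process(Record<String, String> record) {
        // "object-type" is an illustrative header name
        record.headers().add("object-type", record.value().getBytes());
        context.forward(record);
    }
}
```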
I'm using Spring Cloud Stream to modify a Kafka topic and write the resulting data to a table with ".toTable()". In the application.yaml I set the input and output bindings.
This works fine on the Kafka cluster but not with my current test…
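A minimal functional binding of this shape might look like the following sketch (the bean name, transformation, and types are placeholders; the binding names would then be `process-in-0` / `process-out-0` in application.yaml):

```java
import java.util.function.Function;

import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.springframework.context.annotation.Bean;

public class TopicToTableConfig {

    // Consumes the input topic as a KStream, transforms it, and materializes
    // the result as a KTable via toTable().
    @Bean
    public Function<KStream<String, String>, KTable<String, String>> process() {
        return stream -> stream
                .mapValues(String::toUpperCase) // placeholder transformation
                .toTable();
    }
}
```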
I have a single Spring Boot application which needs to consume data from topics belonging to two different Kafka clusters having separate SSL configs (keystore and truststore files & locations).
Is it possible to configure this in…
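One common approach is Spring Cloud Stream's multi-binder configuration, sketched below. All broker addresses, file paths, and binding names are placeholders:

```yaml
# Sketch: two named Kafka binders, each with its own SSL settings.
spring:
  cloud:
    stream:
      binders:
        kafka1:
          type: kafka
          environment:
            spring:
              cloud:
                stream:
                  kafka:
                    binder:
                      brokers: cluster1:9093
                      configuration:
                        security.protocol: SSL
                        ssl.truststore.location: /certs/cluster1/truststore.jks
                        ssl.keystore.location: /certs/cluster1/keystore.jks
        kafka2:
          type: kafka
          environment:
            spring:
              cloud:
                stream:
                  kafka:
                    binder:
                      brokers: cluster2:9093
                      configuration:
                        security.protocol: SSL
                        ssl.truststore.location: /certs/cluster2/truststore.jks
                        ssl.keystore.location: /certs/cluster2/keystore.jks
      bindings:
        consumeA-in-0:
          binder: kafka1
        consumeB-in-0:
          binder: kafka2
```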
I’m having some issues with my Kafka Streams implementation in production. I’ve implemented a function that takes a KTable and a KStream, and yields another KTable with aggregated results based on the join of these two inputs. The idea is to iterate…
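A generic shape for such a join-then-aggregate topology is sketched below; the join logic, aggregation, and store name are placeholders, not the questioner's actual code:

```java
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;

public class JoinAggregateSketch {

    // Joins a KStream with a KTable, then folds the joined records into a
    // new KTable keyed by the original key.
    public KTable<String, Long> joinAndAggregate(KStream<String, String> stream,
                                                 KTable<String, String> table) {
        return stream
                .join(table, (streamValue, tableValue) -> streamValue + ":" + tableValue)
                .groupByKey()
                .aggregate(
                        () -> 0L,                                 // initializer
                        (key, value, aggregate) -> aggregate + 1, // e.g. count per key
                        Materialized.as("aggregated-store"));     // placeholder store name
    }
}
```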
The KafkaBinderConfigurationProperties class has dedicated properties for the producer and consumer. However, there are no admin properties.
By comparison, KafkaProperties has dedicated properties for producer, consumer, and admin.
I…
I am connecting Kafka with a Spring Boot application using the Spring Cloud Stream functional-style programming model.
I need to stop the application if it is not able to connect to the Kafka binder.
Following are the properties I am using
Spring…
When I migrated spring-cloud-starter-stream-kafka to version 3.2.5, I had many duplicated annotations, which I'm correcting.
old consumer:
@StreamListener(value = KafkaStreams.INPUT_CATALOG, condition = "headers['catalog-code'] ==…
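In the functional model there is no `condition` attribute, so the usual replacement is to move the header check into the handler body. A hedged sketch (the expected header value and handler logic are placeholders; only the `catalog-code` header name comes from the snippet above):

```java
import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.messaging.Message;

public class CatalogConsumerConfig {

    // Functional-style replacement for a @StreamListener with a header
    // condition: the filter runs inside the consumer itself.
    @Bean
    public Consumer<Message<String>> catalogConsumer() {
        return message -> {
            Object code = message.getHeaders().get("catalog-code");
            if ("SOME_CODE".equals(code)) { // placeholder condition
                // handle the message
            }
        };
    }
}
```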
I want to enforce a certain behaviour, which is why I've set transactional.id.expiration.ms: 1000 via Strimzi and verified that it is picked up by the broker too.
My expectation is that the producer ids expire after 1 second too, but this doesn't…
I have a spring-cloud-stream project that uses the Kafka binder.
The application consumes messages in batch mode. I need to filter consumed records by a specific header. For this I use a BatchInterceptor:
@Bean
public…
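The bean declaration above is cut off; a hedged sketch of such a filtering interceptor is shown below. The header name, expected value, and exact `ConsumerRecords` rebuilding may differ by spring-kafka/kafka-clients version, and are assumptions here:

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.header.Header;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.listener.BatchInterceptor;

public class FilteringInterceptorConfig {

    // Drops records from each polled batch unless they carry the expected
    // header value ("event-type" / "ORDER" are placeholders).
    @Bean
    public BatchInterceptor<String, String> headerFilterInterceptor() {
        return (records, consumer) -> {
            Map<TopicPartition, List<ConsumerRecord<String, String>>> kept = new HashMap<>();
            for (ConsumerRecord<String, String> record : records) {
                Header header = record.headers().lastHeader("event-type");
                if (header != null
                        && "ORDER".equals(new String(header.value(), StandardCharsets.UTF_8))) {
                    kept.computeIfAbsent(
                            new TopicPartition(record.topic(), record.partition()),
                            tp -> new ArrayList<>()).add(record);
                }
            }
            return new ConsumerRecords<>(kept);
        };
    }
}
```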
I am using "spring-cloud-stream-binder-kafka" for my consumer, with an Avro topic.
It's a new consumer with a new consumer group. After running the application I am getting this log: "Found no committed offset for partition 'topic-name-x'". I read that it…
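If the goal is simply to control where a brand-new consumer group starts reading, the binder exposes a per-binding `startOffset` consumer property; a sketch (the binding name is a placeholder):

```yaml
spring:
  cloud:
    stream:
      kafka:
        bindings:
          consume-in-0:
            consumer:
              startOffset: earliest   # or 'latest'
```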
I need to access the retry attempt number in a Spring Cloud Stream Kafka transactional retry so that, for a particular exception, I can post the outcome to a different topic based on the retry attempt number.
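When the binder's retry is backed by spring-retry, the current attempt is reachable through `RetrySynchronizationManager`; a hedged sketch (the routing logic and threshold are placeholders):

```java
import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.retry.RetryContext;
import org.springframework.retry.support.RetrySynchronizationManager;

public class RetryAwareConsumerConfig {

    // getRetryCount() is 0 on the first attempt; a null context means the
    // handler is not running inside a spring-retry-managed retry.
    @Bean
    public Consumer<String> consume() {
        return payload -> {
            RetryContext ctx = RetrySynchronizationManager.getContext();
            int attempt = (ctx != null) ? ctx.getRetryCount() : 0;
            if (attempt >= 2) {
                // e.g. publish the outcome to a different topic (placeholder)
            }
            // ... normal processing that may throw ...
        };
    }
}
```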
The default setup with kafka-streams and spring-boot-actuator includes adding the state of the stream threads into the health check endpoint. This is great, but if my application is using a state store, and that store is still in the process of…
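One way to cover the restoration window is a custom actuator `HealthIndicator` that only reports up once the `KafkaStreams` instance reaches RUNNING (restoration happens before that, during REBALANCING, in most versions). How the `KafkaStreams` instance is obtained is application-specific; this is a sketch, not the actuator's built-in behaviour:

```java
import org.apache.kafka.streams.KafkaStreams;
import org.springframework.boot.actuate.health.Health;
import org.springframework.boot.actuate.health.HealthIndicator;

// Reports out-of-service until state-store restoration has completed.
public class StreamsReadyHealthIndicator implements HealthIndicator {

    private final KafkaStreams streams;

    public StreamsReadyHealthIndicator(KafkaStreams streams) {
        this.streams = streams;
    }

    @Override
    public Health health() {
        KafkaStreams.State state = streams.state();
        return state == KafkaStreams.State.RUNNING
                ? Health.up().withDetail("state", state).build()
                : Health.outOfService().withDetail("state", state).build();
    }
}
```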
I am trying to create a Spring Boot application using Spring Cloud Stream Kafka Streams which reads input from Kafka cluster 1 and sends it to Kafka cluster 2 in a single Kafka Streams application.
I am getting the following exception during…