We intend to use spring-cloud-bus with Kafka as the binder in order to notify the services after configuration changes. Everything works fine, but we get the following exception during service shutdown:
WARN 48116 --- [extShutdownHook]…
I'm trying to join data from two topics, person and address, where one person can have multiple addresses. The data published into the topics looks like the following:
//person with id as key
{"id": "123", "name": "Tom Tester"}
//addresses with id as…
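A minimal sketch of the kind of one-to-many join described, using the functional style of the Kafka Streams binder and consuming the person topic as a KTable; the Person, Address, and PersonWithAddresses types, the getPersonId() accessor, and the Jackson-based JsonSerde are assumptions for illustration, not part of the original question.
@Bean
public BiFunction<KStream<String, Address>, KTable<String, Person>,
        KStream<String, PersonWithAddresses>> joinPersonsWithAddresses() {
    return (addresses, persons) -> {
        // Re-key each address by the person it belongs to and collect all addresses of a
        // person into one list, so the one-to-many side becomes a single record per person.
        KTable<String, ArrayList<Address>> addressesByPerson = addresses
                .selectKey((addressId, address) -> address.getPersonId())
                .groupByKey(Grouped.with(Serdes.String(), new JsonSerde<>(Address.class)))
                .aggregate(ArrayList::new,
                        (personId, address, list) -> { list.add(address); return list; },
                        Materialized.with(Serdes.String(),
                                new JsonSerde<>(new TypeReference<ArrayList<Address>>() { })));

        // Join the person table with the aggregated addresses and emit the result as a stream.
        return persons
                .join(addressesByPerson,
                        (person, addressList) -> new PersonWithAddresses(person, addressList))
                .toStream();
    };
}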
I have configured 3 Kafka brokers running on different ports. I am using Spring Cloud Stream Kafka with
brokers: localhost:9092,localhost:9093,localhost:9094.
I am creating a data pipeline that gets a continuous stream of data. I am storing the stream…
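For reference, a minimal sketch of where such a broker list typically goes for the Kafka binder; the property path spring.cloud.stream.kafka.binder.brokers is the documented one, the values are the ones quoted above.
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: localhost:9092,localhost:9093,localhost:9094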
I want to get the offset and partition information after I produce a message to a Kafka topic.
I read through the Spring Cloud Stream Kafka binder documentation and found that this can be achieved by fetching the RECORD_METADATA Kafka header.
From Spring…
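A minimal sketch of the recordMetadataChannel approach from the binder documentation, assuming a producer binding named output and that spring.cloud.stream.kafka.bindings.output.producer.recordMetadataChannel=metaChannel is set; the channel and method names are illustrative.
@Bean
public MessageChannel metaChannel() {
    // the binder publishes successful send results to this channel
    return new DirectChannel();
}

@ServiceActivator(inputChannel = "metaChannel")
public void listenForRecordMetadata(Message<?> sent) {
    RecordMetadata meta = sent.getHeaders()
            .get(KafkaHeaders.RECORD_METADATA, RecordMetadata.class);
    System.out.println("partition=" + meta.partition() + ", offset=" + meta.offset());
}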
As someone new to Spring but with a stream-processing background, I'm pretty confused about how I should be testing processors written in Spring Cloud Stream. The testing docs (written for 2.2.0 but seemingly the most recent, so still valid for…
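For what it's worth, a minimal sketch of the TestChannelBinder approach from the spring-cloud-stream test support, assuming a hypothetical Function<String, String> bean named uppercase and a recent spring-cloud-stream version whose InputDestination/OutputDestination accept destination names.
@SpringBootTest
@Import(TestChannelBinderConfiguration.class)
class UppercaseProcessorTests {

    @Autowired
    private InputDestination input;

    @Autowired
    private OutputDestination output;

    @Test
    void uppercasesThePayload() {
        // send into the processor's input binding and read from its output binding
        input.send(new GenericMessage<>("hello"), "uppercase-in-0");
        Message<byte[]> result = output.receive(1000, "uppercase-out-0");
        assertThat(new String(result.getPayload())).isEqualTo("HELLO");
    }
}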
I have followed the documentation below, and I have a producer and consumer working perfectly fine with a Kinesis stream. I would like to understand how to handle an error in the producer (Source) in case any exception happens.
I have tried below…
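A hedged sketch of the core spring-cloud-stream producer error channel (binder-agnostic rather than Kinesis-specific), assuming errorChannelEnabled=true on the producer binding and an output destination named myStream; async send failures then arrive as ErrorMessages.
@ServiceActivator(inputChannel = "myStream.errors")
public void handleSendFailure(ErrorMessage errorMessage) {
    // the failed outbound message and the cause are both available here
    System.err.println("Failed to send: " + errorMessage.getOriginalMessage());
    System.err.println("Cause: " + errorMessage.getPayload());
}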
I was trying to run a Spring Boot Kafka Streams example from the https://cloud.spring.io/spring-cloud-stream-binder-kafka/spring-cloud-stream-binder-kafka.html#_usage_2 site.
I am able to build it successfully, but while running it I am getting an error as shown…
I am trying to connect to a Kafka cluster through the SASL_SSL protocol with a JAAS config as follows:
spring:
  cloud:
    stream:
      bindings:
        binding-1:
          binder: kafka-1-with-ssl
          destination:
          …
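A hedged sketch of one way to carry the SASL_SSL settings in the per-binder environment that the binding above refers to; the broker host, mechanism, and credentials are placeholders, and some setups also need truststore properties.
spring:
  cloud:
    stream:
      binders:
        kafka-1-with-ssl:
          type: kafka
          environment:
            spring:
              cloud:
                stream:
                  kafka:
                    binder:
                      brokers: <broker-host>:9093
                      configuration:
                        security.protocol: SASL_SSL
                        sasl.mechanism: PLAIN
                        sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username="<user>" password="<password>";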
The documentation is pretty straightforward: it suggests exposing a bean of type KafkaBindingRebalanceListener, and the onPartitionsAssigned method would be called internally. I'm trying to do the same, and somehow while the Spring framework creates its…
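A minimal sketch of the bean the documentation refers to; the seek-to-beginning body is illustrative only.
@Bean
public KafkaBindingRebalanceListener kafkaBindingRebalanceListener() {
    return new KafkaBindingRebalanceListener() {
        @Override
        public void onPartitionsAssigned(String bindingName, Consumer<?, ?> consumer,
                Collection<TopicPartition> partitions, boolean initial) {
            // invoked by the binder after partitions are assigned to this consumer instance
            if (initial) {
                consumer.seekToBeginning(partitions);
            }
        }
    };
}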
My Spring Boot 2.3.1 app with SCS Horsham.SR6 was using the Kafka Streams binder. I needed to add a Kafka producer that would be used in another part of the application, so I added the kafka binder. The problem is that the producer is not working,…
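A hedged sketch of one common way to produce outside the streams topology when both binders are present, using StreamBridge (available in this SCS generation); the binding name producer-out-0 is illustrative and would be pinned to the message-channel binder, e.g. spring.cloud.stream.bindings.producer-out-0.binder=kafka.
@Component
public class EventPublisher {

    private final StreamBridge streamBridge;

    public EventPublisher(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    public void publish(String payload) {
        // goes through the message-channel kafka binder, independently of the kstream binder
        streamBridge.send("producer-out-0", payload);
    }
}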
Basically I am consuming messages from Spring Cloud Stream Kafka and inserting them into MongoDB.
My code works fine if my Mongo cluster is up.
I have 2 problems in case my Mongo instance is down:
auto commit of cloud stream is disabled…
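A minimal sketch of acknowledging only after the insert succeeds, assuming autoCommitOffset=false on the consumer binding so the Acknowledgment header is populated; the Person type and the mongoTemplate usage are illustrative.
@Bean
public Consumer<Message<Person>> persistToMongo(MongoTemplate mongoTemplate) {
    return message -> {
        Acknowledgment ack = message.getHeaders()
                .get(KafkaHeaders.ACKNOWLEDGMENT, Acknowledgment.class);
        mongoTemplate.save(message.getPayload());
        if (ack != null) {
            // commit the offset only after the document was written successfully
            ack.acknowledge();
        }
    };
}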
The issue is pretty much the same as https://github.com/spring-cloud/spring-cloud-stream-binder-kafka/issues/455, but for the functional approach:
@Bean
public Consumer<Message<?>> fooConsumer() {
return message -> {
…
I'm trying to perform a flatTransform in a Spring Cloud Stream Kafka Streams app, but I'm not sure exactly where to put the @KafkaStreamsStateStore annotation. At the moment I'm getting the error: Invalid topology: StateStore activeInstruments is not added…
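Under the functional model, a hedged sketch of the alternative of registering the store as a StoreBuilder bean (which the Kafka Streams binder detects and adds to the topology) and referencing it by name in flatTransform; the store contents and the pass-through logic are illustrative.
@Bean
public StoreBuilder<KeyValueStore<String, String>> activeInstrumentsStore() {
    // StoreBuilder beans are picked up by the binder and added to the topology
    return Stores.keyValueStoreBuilder(
            Stores.persistentKeyValueStore("activeInstruments"),
            Serdes.String(), Serdes.String());
}

@Bean
public Function<KStream<String, String>, KStream<String, String>> process() {
    return input -> input.flatTransform(
            () -> new Transformer<String, String, Iterable<KeyValue<String, String>>>() {
                private KeyValueStore<String, String> store;

                @Override
                public void init(ProcessorContext context) {
                    this.store = (KeyValueStore<String, String>) context.getStateStore("activeInstruments");
                }

                @Override
                public Iterable<KeyValue<String, String>> transform(String key, String value) {
                    store.put(key, value); // illustrative use of the state store
                    return Collections.singletonList(KeyValue.pair(key, value));
                }

                @Override
                public void close() {
                }
            },
            "activeInstruments");
}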
I am working on Kafka Streams using Spring Cloud Stream. In the message-processing application, there is a chance that it will produce an error, so the message should not be committed and should be retried.
My application method -
@Bean
public…
I am working with the Spring Cloud Stream Kafka Streams binder. In my consumer bean method, I want to return a KStream with a List of String as the value:
@Bean
public Function<KStream<String, String>, KStream<String, List<String>>> method() {
return…
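A hedged sketch of what such a function can look like, assuming String input values and JSON serialization of the list on the outbound topic; the splitting logic and the Serde bean are illustrative.
@Bean
public Function<KStream<String, String>, KStream<String, List<String>>> method() {
    // split each incoming value into a list of tokens
    return input -> input.mapValues(value -> Arrays.asList(value.split(",")));
}

@Bean
public Serde<List<String>> listOfStringSerde() {
    // a Serde the binder can match against the List<String> outbound value type
    return new JsonSerde<>(new TypeReference<List<String>>() { });
}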