I am not able to access the store using TopologyTestDriver when I materialize the store with Spring Cloud Stream properties:
spring.cloud.stream.kafka.streams.bindings.process-in-1.consumer.materializedAs: incoming-store
How do I query the store in JUnit using…
I need to create a streams application in Spring Boot with dynamically generated output topics. I get a list of branch_ids from an external source and have to branch when a record's branch_id is in that list.
Example:
List received from external…
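In plain Kafka Streams this kind of dynamic routing usually maps onto `KStream#to(TopicNameExtractor)`. As a minimal JDK-only sketch of the routing decision itself (the `branch-<id>-topic` naming scheme is an assumption, not from the question):

```java
import java.util.List;
import java.util.Optional;

// Stand-in for a TopicNameExtractor: decide the output topic for a record
// based on whether its branch_id is in the externally supplied list.
public class BranchRouter {
    private final List<String> branchIds;

    public BranchRouter(List<String> branchIds) {
        this.branchIds = branchIds;
    }

    // Returns the output topic for the branch_id, or empty if the id is not
    // in the list (such a record would simply not be branched).
    public Optional<String> topicFor(String branchId) {
        return branchIds.contains(branchId)
                ? Optional.of("branch-" + branchId + "-topic")
                : Optional.empty();
    }
}
```

In a real topology the same predicate would live inside the `TopicNameExtractor` lambda passed to `to(...)`, so new branch ids only require refreshing the list, not redeploying bindings.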
I am trying to configure Spring to send bad messages to a dead letter queue while using batch mode, but nothing ends up in the DLQ topic.
I use Spring Boot 2.5.3 and Spring Cloud 2020.0.3. This automatically resolves the version of…
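For record-mode consumers the binder-managed DLQ is enabled as below (the binding name `input-in-0` is an assumption); if I recall the binder docs correctly, `enableDlq` does not apply in batch mode, where a container error handler with a `DeadLetterPublishingRecoverer` is typically needed instead:

```yaml
spring:
  cloud:
    stream:
      kafka:
        bindings:
          input-in-0:            # binding name is an assumption
            consumer:
              enableDlq: true
              dlqName: my-dlq-topic
```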
I am new to Spring Cloud Data Flow and need to listen for messages on a topic from an external Kafka cluster. This external Kafka topic in Confluent Cloud would be my Source, which I need to pass on to my Sink application.
I am also using Kafka as my…
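Pointing one application at an external cluster generally comes down to overriding the binder's broker list and security settings for that app. A hedged sketch (the host is a placeholder; Confluent Cloud additionally needs a `sasl.jaas.config` with the API key and secret):

```yaml
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: pkc-xxxxx.region.provider.confluent.cloud:9092   # placeholder
          configuration:
            security.protocol: SASL_SSL
            sasl.mechanism: PLAIN
```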
I am new to Kafka and would like some help.
I have two Spring Boot applications that are identical copies, differing only in their ports:
http://localhost:8080/
http://localhost:8081/
Both are consumers, and both listen to the topic XXX.
I have…
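Whether both copies receive every message depends on the consumer group. With the same `group`, the two instances split the topic's partitions and each message is processed once overall; with no group (or different groups), each instance receives every message. A sketch (the binding and group names are assumptions):

```yaml
spring:
  cloud:
    stream:
      bindings:
        process-in-0:            # binding name is an assumption
          destination: XXX
          group: my-consumer-group
```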
Hi, I am trying out the latest Spring Cloud Stream framework for Kafka. It works fine for String and Double, but when I try to send a Java POJO class, it throws the exception below.
I have tried various configurations for serialization and…
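One common way to send a POJO is to switch the producer to native encoding with Spring Kafka's `JsonSerializer`. A hedged sketch (the binding name `output-out-0` is an assumption):

```yaml
spring:
  cloud:
    stream:
      bindings:
        output-out-0:            # binding name is an assumption
          producer:
            useNativeEncoding: true
      kafka:
        bindings:
          output-out-0:
            producer:
              configuration:
                value.serializer: org.springframework.kafka.support.serializer.JsonSerializer
```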
I am trying to deduplicate records using the input topic as a KTable and sinking them to an output topic, but the KTable still sinks duplicate records to the output topic. I am not sure where I am going wrong.
Here is my application.yml
spring:
…
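A KTable emits an update for every incoming record, including ones whose value is unchanged, so a table alone does not suppress duplicates; deduplication usually needs a comparison against the previously seen value (e.g. in a `transformValues` with a state store). A JDK-only sketch of that comparison logic, with a `HashMap` standing in for the state store:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Stand-in for a state-store-backed deduplicator: forward a record only
// when its value differs from the last value seen for the same key.
public class Deduplicator {
    private final Map<String, String> lastSeen = new HashMap<>();

    public Optional<String> process(String key, String value) {
        String previous = lastSeen.put(key, value);      // remember new value
        return value.equals(previous)
                ? Optional.empty()                        // duplicate: drop
                : Optional.of(value);                     // changed: forward
    }
}
```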
I am trying to migrate to the new functional programming model for Spring Cloud Stream, replacing conditional @StreamListener annotations like this:
@StreamListener("app-input", condition = "headers['eventName']=='Funded'")
with something…
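In the functional model, the usual replacement for header conditions is the built-in `RoutingFunction` (bean name `functionRouter`) plus a routing expression that picks the target function bean. A hedged sketch (the handler bean names `fundedHandler` and `otherHandler` are assumptions):

```yaml
spring:
  cloud:
    function:
      definition: functionRouter       # built-in RoutingFunction
      routing-expression: "headers['eventName'] == 'Funded' ? 'fundedHandler' : 'otherHandler'"
    stream:
      bindings:
        functionRouter-in-0:
          destination: app-input
```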
I am trying to find a way to intercept a Spring Cloud Stream 3.1.1 destination binding in order to modify some properties on the fly in my application. I have come up with the following code snippet, but it seems to work only for a dynamic binding, whereas…
I'm struggling to find any documentation on how I can use Spring Cloud Stream to take a Kafka topic and put it into a KTable.
Having looked for documentation for example on here…
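With the Kafka Streams binder, a topic is consumed as a KTable simply by declaring the function's input type as `KTable` (for example `java.util.function.Consumer<KTable<String, String>>`); the binder infers the table from the signature. The corresponding binding can materialize it into a named store, mirroring the `materializedAs` property seen earlier (binding and store names here are assumptions):

```yaml
spring:
  cloud:
    stream:
      kafka:
        streams:
          bindings:
            process-in-0:        # binding name is an assumption
              consumer:
                materializedAs: my-table-store
```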
We are using spring-cloud-stream-binder-kafka-streams:3.1.1 with functional programming. We have set the consumer to poll at most 5 messages and commit at the RECORD level. We want to implement an integration test in which we stop the consumer, followed by a…
I have a Spring Cloud project with a module that binds to the Kafka and RabbitMQ message buses.
In this module I have a test for Kafka:
@ActiveProfiles("test")
@DirtiesContext
@ExtendWith(SpringExtension.class)
@ContextConfiguration(classes =…
We have an annotation that allows us to consume Kafka messages using a Polled Consumer. It's designed for long-running jobs, so that one thread processes the message while the other remains available for polling, preventing Kafka from thinking…
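The threading pattern described above can be sketched with the JDK alone; no Kafka API is used here, and `runJob` plus the sleep-based keep-alive loop are illustrative stand-ins for the real `poll()` cycle that keeps the consumer inside `max.poll.interval.ms`:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// One worker thread runs the long job while the calling (polling) thread
// stays free, periodically doing keep-alive work until the job completes.
public class PollWhileProcessing {
    public static String runJob(Callable<String> longJob) {
        ExecutorService worker = Executors.newSingleThreadExecutor();
        try {
            Future<String> result = worker.submit(longJob);
            while (!result.isDone()) {
                Thread.sleep(10);   // stand-in for consumer.poll() keep-alive
            }
            return result.get();
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            worker.shutdown();
        }
    }
}
```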
This consumer didn't need trusted packages:
@Bean
fun berichtStateStoreBuilder() = Consumer> {}
This suddenly does:
@Bean
fun berichtStateStoreBuilder() = Consumer> {
…
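When the payload deserializes into a user POJO, Spring Kafka's JSON deserialization requires the POJO's package to be trusted. With the Kafka Streams binder this is typically passed through the binder configuration; a hedged sketch (the package name is an assumption; `"*"` would trust everything, at a security cost):

```yaml
spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            configuration:
              spring.json.trusted.packages: "com.example.bericht"   # package is an assumption
```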
I know Kafka Streams allows data to be distributed to multiple topics based on specified predicates, and the Kafka Streams binder supports this with both the @StreamListener and the functional binding approach.
...
// return type KStream,…
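With the functional model, a function returning a `KStream[]` array maps each array element to an indexed output binding (`<name>-out-0`, `<name>-out-1`, …). A hedged sketch of the matching configuration (function name `process` and the topic names are assumptions):

```yaml
spring:
  cloud:
    stream:
      bindings:
        process-in-0:
          destination: input-topic       # topic names are assumptions
        process-out-0:
          destination: first-branch-topic
        process-out-1:
          destination: second-branch-topic
```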