I am new to Kafka. I am using Spring Cloud Stream Kafka in my project, and I am writing the code in the functional style.
I want to consume a message from a single input topic, build different models from the message, and publish each model to a different topic…
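One common way to fan out from a single input to several output topics is a Consumer bean plus StreamBridge. A minimal wiring sketch, assuming hypothetical model classes (ModelA, ModelB) and binding names (modelA-out-0, modelB-out-0), which you would map to topics via spring.cloud.stream.bindings.*.destination:

```java
import java.util.function.Consumer;

import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FanOutConfig {

    // Hypothetical models built from the incoming payload.
    record ModelA(String value) {}
    record ModelB(int length) {}

    @Bean
    public Consumer<String> fanOut(StreamBridge streamBridge) {
        return payload -> {
            // Build each model from the single incoming message...
            ModelA a = new ModelA(payload.toUpperCase());
            ModelB b = new ModelB(payload.length());
            // ...and send each one to its own output binding/topic.
            streamBridge.send("modelA-out-0", a);
            streamBridge.send("modelB-out-0", b);
        };
    }
}
```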
I have an application in which I am trying to migrate from the @StreamListener approach to the functional approach, as @StreamListener is deprecated and might be removed someday soon.
The old approach allowed us to use the following construct:
@StreamListener(target…
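For reference, a minimal sketch of the functional-style replacement. In the functional model a @Bean of type Function (or Consumer) takes the place of a @StreamListener method, and Spring Cloud Stream derives the binding names from the bean name, e.g. handle-in-0 and handle-out-0 for a bean named "handle" (the bean name and payload type here are hypothetical):

```java
import java.util.function.Function;

public class FunctionalStyleSketch {

    // Registered as a @Bean in a Spring Cloud Stream app, this replaces a
    // @StreamListener method; the framework binds handle-in-0 / handle-out-0.
    static Function<String, String> handle() {
        return payload -> "processed: " + payload;
    }

    public static void main(String[] args) {
        // The function itself is plain Java, so it is easy to unit-test.
        System.out.println(handle().apply("hello")); // prints "processed: hello"
    }
}
```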
My understanding was that spring-kafka was created to interact with the Kafka client APIs, and that the spring-cloud-stream project was later created for "building highly scalable event-driven microservices connected with shared messaging systems", and this…
Does anyone know the default value of spring.cloud.stream.bindings..producer.header-mode in the spring-cloud-stream Kafka binder?
The problem is that the Spring Cloud Stream documentation only says:
Default: Depends on the binder…
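Rather than relying on the binder-dependent default, the property can be set explicitly on the producer binding. A config sketch, assuming a hypothetical binding named "output":

```yaml
spring:
  cloud:
    stream:
      bindings:
        output:                     # hypothetical binding name
          producer:
            header-mode: headers    # other documented values: none, embeddedHeaders
```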
We are using Spring Cloud Streams with multiple bindings based on Kafka Streams binders.
The output of /actuator/health correctly lists all our bindings and their state (RUNNING) - see example below.
Our expectation was that, when a binding is stopped…
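For reference, bindings can be queried and stopped at runtime through the bindings actuator endpoint (when it is exposed). A sketch, assuming a binding named myBinding and the default management port:

```shell
# List all bindings and their state
curl http://localhost:8080/actuator/bindings

# Stop a single binding by name (request body per the Spring Cloud Stream docs)
curl -X POST -H 'Content-Type: application/json' \
     -d '{"state":"STOPPED"}' \
     http://localhost:8080/actuator/bindings/myBinding
```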
We have a requirement where we consume messages from one topic, apply some enrichment, and then publish the message to another topic. Below are the steps:
Consumer - Consume the message
Enrichment - Enriched the…
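The consume-enrich-publish pipeline above maps naturally onto a single Function bean: Spring Cloud Stream binds its input to the source topic and its output to the destination topic. A sketch with hypothetical model names (Order, EnrichedOrder) and enrichment logic; exposing enrich() as a @Bean named "enrich" would bind enrich-in-0 and enrich-out-0:

```java
import java.util.function.Function;

public class EnrichmentSketch {

    // Hypothetical payload models.
    record Order(String id, double amount) {}
    record EnrichedOrder(String id, double amount, String category) {}

    // The enrichment step itself is plain Java and easy to unit-test.
    static Function<Order, EnrichedOrder> enrich() {
        return order -> new EnrichedOrder(
                order.id(),
                order.amount(),
                order.amount() > 100 ? "LARGE" : "SMALL");
    }

    public static void main(String[] args) {
        EnrichedOrder out = enrich().apply(new Order("o-1", 250.0));
        System.out.println(out.category()); // prints "LARGE"
    }
}
```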
I have one interesting scenario: it seems that when there are no new messages to pick up (at least, that's what I think is happening), my consumer suddenly shuts down.
I am using Kotlin + Spring Boot Kafka Producer and Consumer. My consumer is configured…
This code below works, but it would be nice if I don't need to map the key at the beginning.
@Bean
public Function<KStream, KStream> aggregate() {
    return reviews…
I am trying to understand how to use Spring Cloud Streams with the Kafka Binder.
Currently, I am trying to register an AVRO schema with my Confluent Schema Registry and send messages to a topic.
I am unable to understand how the schema registration…
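One common setup is to let Confluent's KafkaAvroSerializer talk to the Schema Registry directly, with the binding set to native encoding so the binder does not serialize the payload itself. A config sketch; the binding name, topic, and registry URL are hypothetical:

```yaml
spring:
  cloud:
    stream:
      bindings:
        produce-out-0:
          destination: avro-topic          # hypothetical topic name
          producer:
            useNativeEncoding: true        # delegate serialization to the Kafka serializer
      kafka:
        binder:
          configuration:
            schema.registry.url: http://localhost:8081   # hypothetical registry URL
        bindings:
          produce-out-0:
            producer:
              configuration:
                value.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
```

With this setup the serializer registers the schema on first send (subject to the registry's auto-registration settings), rather than the binder doing it.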
I am consuming batches from Kafka, where retry is not supported by the Spring Cloud Stream Kafka binder in batch mode. The documentation offers an option: "You can configure a SeekToCurrentBatchErrorHandler (using a ListenerContainerCustomizer) to achieve…"
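A sketch of the customizer the documentation refers to: on a batch failure, SeekToCurrentBatchErrorHandler seeks the partitions back so the whole batch is redelivered on the next poll. The key/value types are an assumption:

```java
import org.springframework.cloud.stream.config.ListenerContainerCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.listener.AbstractMessageListenerContainer;
import org.springframework.kafka.listener.SeekToCurrentBatchErrorHandler;

@Configuration
public class BatchErrorHandlerConfig {

    @Bean
    public ListenerContainerCustomizer<AbstractMessageListenerContainer<byte[], byte[]>> customizer() {
        // Applied to every listener container the binder creates.
        return (container, destinationName, group) ->
                container.setBatchErrorHandler(new SeekToCurrentBatchErrorHandler());
    }
}
```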
I'm developing an application with Spring Cloud Stream 3.1.3 and the Kafka binder with Schema Registry.
This is the class I wrote for the producer:
@Slf4j
@EnableAutoConfiguration
@Component
public class Producer {
private static final String…
I have created an application that uses the function-based Spring Cloud Stream library to publish a list of messages to a single Kafka topic. I want to know how we can set the partition key for each message in the list. Basically, each message should have its…
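One common approach is to put the key on each Message as a header and reference that header from the producer binding's partition-key-expression. A sketch, assuming a hypothetical Payment payload and a header named "partitionKey":

```java
import java.util.List;

import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

public class PartitionKeySketch {

    // Hypothetical payload type.
    record Payment(String accountId, double amount) {}

    // With the producer binding configured as:
    //   spring.cloud.stream.bindings.<binding>.producer.partition-key-expression: headers['partitionKey']
    // each message in the list carries its own partition key via a header.
    static List<Message<Payment>> toMessages(List<Payment> payments) {
        return payments.stream()
                .map(p -> MessageBuilder.withPayload(p)
                        .setHeader("partitionKey", p.accountId())
                        .build())
                .toList();
    }
}
```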
I am trying to write a transform function that consumes an input from one topic and produces two outputs, to topics Left and Right. Moreover, I need this to take place in a transaction, so that if the application fails to produce a message to…
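For transactional producers with the Kafka binder, transactions are enabled at the binder level via a transaction-id-prefix; sends made while processing a record then participate in one Kafka transaction. A config sketch (the prefix value is hypothetical):

```yaml
spring:
  cloud:
    stream:
      kafka:
        binder:
          transaction:
            transaction-id-prefix: tx-   # enables a transactional producer factory
```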
I am using Spring Cloud Stream with Kafka binder. I have configured a transactional producer. However, I have one topic with a large number of partitions (135), and it seems to cause trouble with this approach. I am using StreamBridge to produce…
I'm following this example:
link
So the application.yml file looks like this:
spring.cloud.stream.kafka.streams.binder.configuration.commit.interval.ms: 1000
spring.cloud.stream.kafka.streams.binder.configuration:
  default.key.serde:…