Questions tagged [spring-cloud-stream-binder-kafka]

An Apache Kafka binder for Spring Cloud Stream.

Documentation: https://cloud.spring.io/spring-cloud-stream-binder-kafka/spring-cloud-stream-binder-kafka.html

472 questions
1 vote, 0 answers

How to send messages to multiple topics using DSL in Apache Kafka

I am new to Kafka. I am using Spring Cloud Stream Kafka in my project, and I am writing the code in the functional style. I want to consume a message from a single input topic, build different models from the message, and publish each model to a different topic…
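A minimal sketch of one way to do this with the Kafka Streams binder: a function that returns an array of KStreams, where each array element is bound to its own output topic. The Order, ModelA and ModelB types and the binding names are placeholders, not from the question.

    import java.util.function.Function;
    import org.apache.kafka.streams.kstream.KStream;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class FanOutConfig {

        @Bean
        public Function<KStream<String, Order>, KStream<String, ?>[]> process() {
            return input -> {
                // build two different models from the same input record (hypothetical types)
                KStream<String, ModelA> modelA = input.mapValues(ModelA::fromOrder);
                KStream<String, ModelB> modelB = input.mapValues(ModelB::fromOrder);
                // element 0 goes to the process-out-0 binding, element 1 to process-out-1;
                // each binding's destination is a different topic in application.yml
                return new KStream[] { modelA, modelB };
            };
        }
    }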
1 vote, 1 answer

Spring Cloud Stream Kafka routing, expression overgrowth

I have an application that tries to change the @StreamListener approach to a functional approach, as @StreamListener is deprecated and might be removed soon. The old approach allowed us to use the following concept: @StreamListener(target…
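For reference, a sketch of the functional equivalent, assuming two hypothetical event types: plain Consumer beans replace the @StreamListener methods, and dispatching is driven by the routing function and a routing expression rather than by listener conditions.

    import java.util.function.Consumer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class RoutingConfig {

        @Bean
        public Consumer<FooEvent> handleFoo() {
            return event -> { /* what the old @StreamListener method for foo did */ };
        }

        @Bean
        public Consumer<BarEvent> handleBar() {
            return event -> { /* what the old @StreamListener method for bar did */ };
        }
    }

    // application.yml (abridged; the header name and expression are made up):
    //   spring.cloud.stream.function.routing.enabled: true
    //   spring.cloud.function.routing-expression: "headers['type'] == 'foo' ? 'handleFoo' : 'handleBar'"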
1 vote, 1 answer

What is the difference between spring-kafka and Apache-Kafka-Streams-Binder regarding the interaction with the Kafka Streams API?

My understanding was that spring-kafka was created to interact with Kafka Client APIs, and later on, spring-cloud-stream project was created for "building highly scalable event-driven microservices connected with shared messaging systems", and this…
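Roughly, the same topology written both ways, as a sketch with made-up topic and binding names: with spring-kafka you build the topology yourself from the auto-configured StreamsBuilder, while with the Kafka Streams binder you expose a function over KStream and the binder builds, starts and binds the topology for you.

    import java.util.function.Consumer;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.KStream;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.annotation.EnableKafkaStreams;

    // spring-kafka: the topology is built directly against the Kafka Streams API.
    @Configuration
    @EnableKafkaStreams
    class SpringKafkaTopology {
        @Bean
        KStream<String, String> topology(StreamsBuilder builder) {
            KStream<String, String> stream = builder.stream("orders");
            stream.foreach((key, value) -> { /* handle record */ });
            return stream;
        }
    }

    // kafka-streams binder: a plain function; process-in-0 is mapped to a destination in config.
    @Configuration
    class BinderTopology {
        @Bean
        Consumer<KStream<String, String>> process() {
            return stream -> stream.foreach((key, value) -> { /* handle record */ });
        }
    }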
1 vote, 1 answer

producer.headerMode default value

Does anyone know which value is the default for spring.cloud.stream.bindings..producer.header-mode in spring-cloud-stream-kafka-binder? The problem is that the Spring Cloud Stream documentation only says Default: Depends on the binder…
1 vote, 1 answer

/actuator/health does not detect stopped binders in Spring Cloud Stream

We are using Spring Cloud Streams with multiple bindings based on Kafka Streams binders. The output of /actuator/health correctly lists all our bindings and their state (RUNNING) - see example below. Our expectation was, when a binding is stopped…
1 vote, 1 answer

Spring Cloud Kafka Stream Binder transactionIdPrefix sends 2 messages to outbound topic

We have a requirement where we consume messages from one topic, some enrichment happens, and then we publish the message to another topic. Below are the events: Consumer - consume the message; Enrichment - enrich the…
1 vote, 0 answers

Kafka consumer LeaveGroup request to coordinator

I have an interesting scenario. It seems that when there are no new topics to pick up (at least that's what I think is happening), my consumer suddenly shuts down. I am using Kotlin + Spring Boot Kafka Producer and Consumer. My consumer is configured…
1 vote, 0 answers

Kafka Streams - aggregate with object as key

The code below works, but it would be nice if I did not need to map the key at the beginning. @Bean public Function<KStream, KStream> aggregate() { return reviews…
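One pattern that avoids an up-front map of the key is to re-key inside groupBy and supply Serdes for the object key via Grouped. A sketch, assuming hypothetical Review, ProductKey and ReviewStats types with JSON Serdes:

    import java.util.function.Function;
    import org.apache.kafka.common.serialization.Serde;
    import org.apache.kafka.streams.kstream.Grouped;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.Materialized;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.support.serializer.JsonSerde;

    @Configuration
    public class ReviewAggregationConfig {

        // hypothetical Serdes for the custom key and value types
        private final Serde<ProductKey> productKeySerde = new JsonSerde<>(ProductKey.class);
        private final Serde<Review> reviewSerde = new JsonSerde<>(Review.class);
        private final Serde<ReviewStats> statsSerde = new JsonSerde<>(ReviewStats.class);

        @Bean
        public Function<KStream<String, Review>, KTable<ProductKey, ReviewStats>> aggregate() {
            return reviews -> reviews
                    // re-key with an object key directly in groupBy, supplying its Serde via Grouped
                    .groupBy((key, review) -> new ProductKey(review.getProductId()),
                             Grouped.with(productKeySerde, reviewSerde))
                    .aggregate(ReviewStats::new,
                               (productKey, review, stats) -> stats.add(review),
                               Materialized.with(productKeySerde, statsSerde));
        }
    }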
1 vote, 1 answer

How does Kafka Schema registration happen in Spring Cloud Stream?

I am trying to understand how to use Spring Cloud Streams with the Kafka Binder. Currently, I am trying to register an AVRO schema with my Confluent Schema Registry and send messages to a topic. I am unable to understand how the schema registration…
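One common route is native encoding, where the Confluent serializer (with its default auto-registration behavior) registers the schema on first send rather than Spring Cloud Stream doing it. A sketch; the binding name, the OrderAvro type and the registry URL are placeholders, and the property names should be checked against the versions in use.

    import java.util.function.Supplier;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class AvroProducerConfig {

        // OrderAvro stands in for an Avro-generated SpecificRecord class
        @Bean
        public Supplier<OrderAvro> orderSupplier() {
            return () -> OrderAvro.newBuilder().setId("42").build();
        }
    }

    // application.yml (abridged):
    //   spring.cloud.stream.bindings.orderSupplier-out-0.producer.useNativeEncoding: true
    //   spring.cloud.stream.kafka.bindings.orderSupplier-out-0.producer.configuration:
    //     value.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
    //     schema.registry.url: http://localhost:8081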
1 vote, 1 answer

Retry max 3 times when consuming batches in Spring Cloud Stream Kafka Binder

I am consuming batches in Kafka. Retry is not supported in the Spring Cloud Stream Kafka binder in batch mode; instead, an option is given that you can configure a SeekToCurrentBatchErrorHandler (using a ListenerContainerCustomizer) to achieve…
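The option referenced in the excerpt looks roughly like this. It is only a sketch: the whole batch is re-polled on failure, and whether the BackOff actually caps the number of redeliveries depends on the spring-kafka version and error handler in use.

    import org.springframework.cloud.stream.config.ListenerContainerCustomizer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.listener.AbstractMessageListenerContainer;
    import org.springframework.kafka.listener.SeekToCurrentBatchErrorHandler;
    import org.springframework.util.backoff.FixedBackOff;

    @Configuration
    public class BatchErrorHandlingConfig {

        @Bean
        public ListenerContainerCustomizer<AbstractMessageListenerContainer<?, ?>> customizer() {
            return (container, destinationName, group) -> {
                SeekToCurrentBatchErrorHandler handler = new SeekToCurrentBatchErrorHandler();
                // wait 1s between redeliveries of the failed batch
                handler.setBackOff(new FixedBackOff(1000L, 3L));
                container.setBatchErrorHandler(handler);
            };
        }
    }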
1 vote, 0 answers

Spring Cloud Stream - StreamBridge error on send

I'm developing an application with Spring Cloud Stream 3.1.3 and the Kafka binder with Schema Registry. This is the class I wrote for the producer: @Slf4j @EnableAutoConfiguration @Component public class Producer { private static final String…
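For comparison, a minimal StreamBridge producer looks something like this; the binding name and payload type are placeholders rather than the asker's actual code.

    import org.springframework.cloud.stream.function.StreamBridge;
    import org.springframework.messaging.support.MessageBuilder;
    import org.springframework.stereotype.Component;

    @Component
    public class SimpleProducer {

        private final StreamBridge streamBridge;

        public SimpleProducer(StreamBridge streamBridge) {
            this.streamBridge = streamBridge;
        }

        public void send(Object payload) {
            // "producer-out-0" is a placeholder binding name configured in application.yml
            streamBridge.send("producer-out-0", MessageBuilder.withPayload(payload).build());
        }
    }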
1 vote, 0 answers

(Functional) Spring cloud stream custom partition key implementation for publishing list of messages to single Kafka topic

I have created an application which uses the function-based Spring Cloud Stream library to publish a list of messages to a single Kafka topic. I want to know how we can set the partition key for each message in the list. Basically, each message should have its…
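One approach, sketched with made-up names: stamp a header on each message and point partition-key-expression at that header, so every element of the list carries its own key.

    import java.util.List;
    import org.springframework.cloud.stream.function.StreamBridge;
    import org.springframework.messaging.support.MessageBuilder;
    import org.springframework.stereotype.Component;

    @Component
    public class PartitionedPublisher {

        private final StreamBridge streamBridge;

        public PartitionedPublisher(StreamBridge streamBridge) {
            this.streamBridge = streamBridge;
        }

        public void publish(List<OrderEvent> events) {
            for (OrderEvent event : events) {
                // each message carries its own partition key in a header (OrderEvent is hypothetical)
                streamBridge.send("orders-out-0",
                        MessageBuilder.withPayload(event)
                                .setHeader("partitionKey", event.getCustomerId())
                                .build());
            }
        }
    }

    // application.yml (abridged):
    //   spring.cloud.stream.bindings.orders-out-0.producer.partition-key-expression: headers['partitionKey']
    //   spring.cloud.stream.bindings.orders-out-0.producer.partition-count: 10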
1 vote, 1 answer

DLQ, bounded retry, and EOS when producing to multiple topics using Spring Cloud Stream

I am trying to write a transform function which will consume an input from one topic and produce two outputs to topics Left and Right. Moreover I need this to take place in a transaction so that if the application fails to produce a message to…
1 vote, 0 answers

Kafka Producer cannot send to a topic with transactional producer

I am using Spring Cloud Stream with Kafka binder. I have configured a transactional producer. However, I have one topic with a large number of partitions (135), and it seems to cause trouble with this approach. I am using StreamBridge to produce…
1 vote, 0 answers

How to get topic name in stream-kafka-binder?

I'm following this example: link. So the application.yml file looks like this: spring.cloud.stream.kafka.streams.binder.configuration.commit.interval.ms: 1000 spring.cloud.stream.kafka.streams.binder.configuration: default.key.serde:…
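Within the Kafka Streams binder, one way to see the topic name at runtime is through the ProcessorContext in a transform step; a sketch, with a made-up function name:

    import java.util.function.Function;
    import org.apache.kafka.streams.KeyValue;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.Transformer;
    import org.apache.kafka.streams.processor.ProcessorContext;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class TopicNameConfig {

        @Bean
        public Function<KStream<String, String>, KStream<String, String>> tagWithTopic() {
            return input -> input.transform(() -> new Transformer<String, String, KeyValue<String, String>>() {
                private ProcessorContext context;

                @Override
                public void init(ProcessorContext context) {
                    this.context = context;
                }

                @Override
                public KeyValue<String, String> transform(String key, String value) {
                    // context.topic() exposes the source topic of the record currently being processed
                    return KeyValue.pair(key, context.topic() + ":" + value);
                }

                @Override
                public void close() {
                }
            });
        }
    }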