Questions tagged [spring-cloud-stream-binder-kafka]

An Apache Kafka binder for Spring Cloud Stream.

Documentation: https://cloud.spring.io/spring-cloud-stream-binder-kafka/spring-cloud-stream-binder-kafka.html

472 questions
2 votes, 1 answer

How to do this topology in Spring Cloud Kafka Streams in function style?

var streamsBuilder = new StreamsBuilder(); KStream inputStream = streamsBuilder.stream("input_topic"); KStream upperCaseString = inputStream.mapValues((ValueMapper)…
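For reference, the same topology expressed in the functional style looks roughly like this: the binder constructs the `StreamsBuilder` itself and wires topics from the `process-in-0`/`process-out-0` bindings, so the bean only declares the transformation. Topic and bean names here are illustrative, a sketch rather than the asker's exact setup:

```java
import java.util.function.Function;

import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class UppercaseTopology {

    // Functional-style equivalent of the imperative StreamsBuilder code:
    // input/output topics are mapped to process-in-0 / process-out-0
    // in application.yml, not hard-coded here.
    @Bean
    public Function<KStream<String, String>, KStream<String, String>> process() {
        return input -> input.mapValues(value -> value.toUpperCase());
    }
}
```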
2 votes, 2 answers

Spring Cloud Stream Kafka retries 10 times the maxAttempts

I've been trying to implement retry logic for Spring Cloud Stream Kafka such that if an exception is thrown when producing an event to the topic sample-topic, it retries two more times. I added the following configuration to the…
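Worth noting for this question: `maxAttempts` is a consumer-side (listener retry) setting in Spring Cloud Stream, while retries on the producer path are governed by the Kafka client's own `retries` property, which can be passed through the binder. A minimal sketch (the value is illustrative):

```yaml
spring:
  cloud:
    stream:
      kafka:
        binder:
          # passed straight through to the Kafka producer client
          producerProperties:
            retries: 3
```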
2 votes, 1 answer

Spring Cloud stream "Found no committed offset"

I am using Spring Cloud Stream with Kafka binders. I can see that once the application starts, it throws INFO-level logs every minute for all the input bindings configured in my application. Configuration in the…
2 votes, 1 answer

Spring cloud stream (Kafka) autoCreateTopics not working

I am using Spring Cloud Stream with the Kafka binder. To disable auto-creation of topics I referred to this: How can I configure a Spring Cloud Stream (Kafka) application to autocreate the topics in Confluent Cloud?. But it seems that setting this…
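The binder-level switch this question is about looks like the following; note that the broker's own `auto.create.topics.enable` setting can still create topics even when the binder's provisioning is turned off:

```yaml
spring:
  cloud:
    stream:
      kafka:
        binder:
          # stops the binder from provisioning topics at startup;
          # the broker-side auto.create.topics.enable is separate
          autoCreateTopics: false
```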
2 votes, 0 answers

Message processing guarantees with spring-cloud-stream-binder-kafka functional binding

Given default configuration and this binding @Bean public Function<Flux<…>, Flux<…>> process() { return input -> input.map(message -> { // simplified return MessageBuilder.build(); }); } Is there any…
2 votes, 0 answers

SCS kafka consumer attempts to acquire info from a partition that is no longer assigned to it

spring-cloud-stream-binder-kafka 3.0.9-RELEASE, spring-boot 2.2.13.RELEASE. Hi, we have a project using Spring Cloud Stream with Kafka, and we are having a problem reconnecting the consumers when the broker nodes are down for a period of time. The…
2 votes, 1 answer

Error Handling in Kafka Producer while using Spring Cloud Stream Kafka Binder

I have a REST POST endpoint that consumes data and writes to Kafka using the Spring Cloud Stream Kafka binder. Right now we do not have any error handling in place, but we want to make this endpoint fault tolerant by adding an extra check…
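A common approach to the fault tolerance asked about here is to make the send synchronous, so broker failures surface as exceptions at the controller. A minimal sketch, assuming a hypothetical binding named `events-out` with `spring.cloud.stream.kafka.bindings.events-out.producer.sync: true` configured:

```java
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class IngestController {

    private final StreamBridge streamBridge;

    public IngestController(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    // With producer.sync=true on the binding, send() blocks until the
    // broker acknowledges, so failures can be handled per request.
    @PostMapping("/events")
    public String publish(@RequestBody String payload) {
        try {
            streamBridge.send("events-out", payload); // hypothetical binding name
            return "accepted";
        } catch (Exception ex) {
            // e.g. return an error status, or persist for later retry
            return "failed: " + ex.getMessage();
        }
    }
}
```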
2 votes, 1 answer

Publish multiple messages in batch processing function with Spring cloud stream Kafka Binder

I am looking for an example to create a functional style processor using spring cloud stream kafka binder (without Kafka Streams) that can consume a batch of n messages from one topic and publish m messages to another topic (m < n). I have tried the…
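One way to get m &lt; n without Kafka Streams is the reactive programming model, where a function may change the cardinality of the stream. A sketch, assuming the conventional `aggregate-in-0`/`aggregate-out-0` bindings (names and batch size are illustrative):

```java
import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import reactor.core.publisher.Flux;

@Configuration
public class BatchAggregator {

    // Consumes n records from the input binding and emits one
    // aggregated message per buffer of 10, so m < n overall.
    @Bean
    public Function<Flux<String>, Flux<String>> aggregate() {
        return input -> input
                .buffer(10)                             // group records
                .map(batch -> String.join(",", batch)); // one message per group
    }
}
```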
2 votes, 1 answer

Avro Deserialization exception handling with Spring Cloud Stream

I have an application using Spring Cloud Stream and Spring Kafka, which processes Avro messages. The application works fine, but now I'd like to add some error handling. The Goal: I would like to catch deserialization exceptions, build a new object…
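A common pattern for the goal described here is Spring Kafka's `ErrorHandlingDeserializer` wrapping the real Avro deserializer, so deserialization failures are surfaced to the error handler instead of looping on the same record. A configuration sketch (the Confluent delegate class is an assumption about this setup):

```yaml
spring:
  cloud:
    stream:
      kafka:
        binder:
          consumerProperties:
            key.deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
            value.deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
            # delegates that do the actual work; failures are caught and
            # passed on as DeserializationExceptions instead of crashing
            spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
            spring.deserializer.value.delegate.class: io.confluent.kafka.serializers.KafkaAvroDeserializer
```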
2 votes, 0 answers

Spring cloud Stream kafka binder - Auto-provisioning of topics

Spring-cloud-stream-binder-kafka 3.0.9.RELEASE Would it be possible to disable the provisioning phase during application start-up? We have some problems when starting applications with several kafka producers configured, in which during the…
2 votes, 0 answers

How to represent custom spring cloud stream kafka binder configuration in Java auto configuration

I have the following configuration with custom kafka binder configurations spring.cloud.stream: bindings: inputBinding1: binder: kafka1 destination: destination1 inputBinding2: binder: kafka2 destination:…
2 votes, 1 answer

Routing conditional with spring cloud streams functional

I have some issues after the old imperative programming model was deprecated. I have two microservices (one a publisher and the other a subscriber), and in the old way, with the annotation @StreamListener(target = "events", condition =…
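The functional-style replacement for `@StreamListener`'s `condition` attribute is message routing through the built-in `functionRouter` binding plus a routing expression. A sketch (header name, function names, and topic are hypothetical):

```yaml
spring:
  cloud:
    function:
      # SpEL evaluated per message; the result names the target function bean
      routing-expression: "headers['type'] == 'created' ? 'handleCreated' : 'handleDeleted'"
    stream:
      function:
        routing:
          enabled: true
      bindings:
        functionRouter-in-0:
          destination: events   # hypothetical topic
```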
2 votes, 1 answer

Is there any method to retry within maxAttempts times using Acknowledge.nack for spring-cloud-stream-binder-kafka?

I am trying to consume batches in Kafka, and I found that the documentation says retry is not supported, as follows: Retry within the binder is not supported when using batch mode, so maxAttempts will be overridden to 1. You can configure a…
2 votes, 2 answers

Spring cloud Stream - kafka - Null Acknowledgement Header

I want to manually commit the offset using Spring Cloud Stream, only when the message processing is successful. Here is my code - application.yml & Handler Class public void process(Message message) { …
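For reference, a minimal manual-ack sketch: the `Acknowledgment` header is only populated when the binding runs in manual ack mode (in recent binder versions via `consumer.ackMode: MANUAL` on the Kafka binding; older versions used `autoCommitOffset: false`). Binding name `process-in-0` is assumed:

```java
import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;

@Configuration
public class ManualAckHandler {

    // Requires manual ack mode on the binding, e.g.
    // spring.cloud.stream.kafka.bindings.process-in-0.consumer.ackMode: MANUAL
    // Otherwise the ACKNOWLEDGMENT header is null.
    @Bean
    public Consumer<Message<String>> process() {
        return message -> {
            Acknowledgment ack = message.getHeaders()
                    .get(KafkaHeaders.ACKNOWLEDGMENT, Acknowledgment.class);
            // ... process the payload ...
            if (ack != null) {
                ack.acknowledge(); // commit the offset only on success
            }
        };
    }
}
```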
2 votes, 2 answers

Kafka Streams: Define multiple Kafka Streams using Spring Cloud Stream for each set of topics

I am trying to do a simple POC with Kafka Streams. However, I am getting an exception while starting the application. I am using Spring-Kafka, Kafka-Streams 2.5.1 with Spring Boot 2.3.5. Kafka stream configuration: @Configuration public class…
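A sketch of how multiple independent Kafka Streams topologies are usually declared with the functional model: each function bean gets its own set of bindings (`ordersFlow-in-0`, `paymentsFlow-in-0`, …) and therefore its own topics, and the beans are activated together via `spring.cloud.function.definition: ordersFlow;paymentsFlow`. Function and topic names here are hypothetical:

```java
import java.util.function.Function;

import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MultiStreamConfig {

    // Bound to ordersFlow-in-0 / ordersFlow-out-0
    @Bean
    public Function<KStream<String, String>, KStream<String, String>> ordersFlow() {
        return stream -> stream.filter((key, value) -> value != null);
    }

    // Bound to paymentsFlow-in-0 / paymentsFlow-out-0, a separate topology
    @Bean
    public Function<KStream<String, String>, KStream<String, String>> paymentsFlow() {
        return stream -> stream.mapValues(String::toUpperCase);
    }
}
```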