I've been trying to implement retry logic for Spring Cloud Stream Kafka so that if an exception is thrown when producing an event to the topic sample-topic, it retries two more times.
I added the following configuration to the…
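One way to get producer-side retries for transient send failures is to pass native Kafka producer properties through the binding. A minimal sketch, assuming a binding named output (a placeholder; adjust to the actual binding name):

spring:
  cloud:
    stream:
      bindings:
        output:
          destination: sample-topic
      kafka:
        bindings:
          output:
            producer:
              configuration:
                retries: 2                # native Kafka producer retries
                retry.backoff.ms: 1000    # wait between retry attempts

This only covers failures the Kafka producer itself can retry; exceptions thrown before the send (for example, serialization errors) would need an application-level retry such as a RetryTemplate instead.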
I am using Spring Cloud Stream with Kafka binders. Once the application starts, it emits INFO-level logs every minute for all the input bindings configured in my application.
Configuration in the…
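If the periodic messages turn out to come from the Kafka client libraries themselves, one option is simply to raise their log level. A sketch using standard Spring Boot logging properties:

logging:
  level:
    org.apache.kafka.clients: WARN    # silence routine Kafka client INFO output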
I am using Spring Cloud Stream with the Kafka binder. To disable auto-creation of topics I referred to this: How can I configure a Spring Cloud Stream (Kafka) application to autocreate the topics in Confluent Cloud?. But it seems that setting this…
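For reference, the binder-level property usually involved here is autoCreateTopics; a minimal sketch of turning it off:

spring:
  cloud:
    stream:
      kafka:
        binder:
          autoCreateTopics: false   # the binder will not provision topics; they must already exist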
Given the default configuration and this binding
@Bean
public Function<Flux<Message<?>>, Flux<Message<?>>> process() {
    return input -> input
            .map(message -> {
                // simplified; payload types reconstructed as wildcards (the originals were stripped by formatting)
                return MessageBuilder.withPayload(message.getPayload()).build();
            });
}
Is there any…
spring-cloud-stream-binder-kafka 3.0.9.RELEASE
spring-boot 2.2.13.RELEASE
Hi, we have a project using Spring Cloud Stream with Kafka, and we are having a problem reconnecting the consumers when the broker nodes are down for a period of time.
The…
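Reconnect behaviour after a broker outage is largely driven by native Kafka consumer settings, which can be passed through the binder. A sketch of the properties typically worth tuning (values are placeholders):

spring:
  cloud:
    stream:
      kafka:
        binder:
          consumerProperties:
            reconnect.backoff.ms: 1000        # initial wait before retrying a broker connection
            reconnect.backoff.max.ms: 30000   # upper bound for the exponential reconnect backoff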
I have a REST POST endpoint which consumes data and writes it to Kafka using the Spring Cloud Stream Kafka binder. Right now we do not have any error handling in place, but we want to make this endpoint fault tolerant by adding an extra check…
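A minimal sketch of such a check, assuming a StreamBridge-based producer; the /ingest path and the output-out-0 binding name are hypothetical, and send() returning false or throwing is surfaced to the HTTP caller:

import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class IngestController {

    private final StreamBridge streamBridge;

    public IngestController(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    @PostMapping("/ingest")
    public ResponseEntity<String> ingest(@RequestBody String payload) {
        try {
            // send() returns false if the message could not be handed to the binding
            boolean sent = streamBridge.send("output-out-0", payload);
            return sent
                    ? ResponseEntity.accepted().body("queued")
                    : ResponseEntity.status(HttpStatus.SERVICE_UNAVAILABLE).body("not queued");
        } catch (Exception ex) {
            // surface produce failures to the caller instead of silently losing the request
            return ResponseEntity.status(HttpStatus.SERVICE_UNAVAILABLE).body(ex.getMessage());
        }
    }
}

For send failures to be reported synchronously here, the producer binding generally needs sync enabled (spring.cloud.stream.kafka.bindings.<binding>.producer.sync: true); otherwise errors are only raised asynchronously.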
I am looking for an example of a functional-style processor using the Spring Cloud Stream Kafka binder (without Kafka Streams) that can consume a batch of n messages from one topic and publish m messages to another topic (m < n).
I have tried the…
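One approach that fits the functional model is a batch-mode Consumer that republishes the selected subset with StreamBridge. A sketch with placeholder names (process, filtered-out-0) and placeholder selection logic; the element type String is an assumption:

import java.util.List;
import java.util.function.Consumer;

import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BatchProcessorConfig {

    // Consumes a whole batch from the input topic and forwards only the
    // selected subset (m of n) to another topic via StreamBridge.
    @Bean
    public Consumer<List<String>> process(StreamBridge streamBridge) {
        return batch -> batch.stream()
                .filter(payload -> payload.contains("keep"))   // placeholder selection logic
                .forEach(payload -> streamBridge.send("filtered-out-0", payload));
    }
}

Batch delivery is enabled on the input binding with spring.cloud.stream.bindings.process-in-0.consumer.batch-mode: true.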
I have an application using Spring Cloud Stream and Spring Kafka, which processes Avro messages. The application works fine, but now I'd like to add some error handling.
The Goal: I would like to catch deserialization exceptions, build a new object…
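One common building block for this is spring-kafka's ErrorHandlingDeserializer wrapped around the Avro deserializer, so a bad record fails in a controlled way instead of looping. A sketch, assuming native decoding and a hypothetical process-in-0 binding with the Confluent Avro deserializer as the delegate:

spring:
  cloud:
    stream:
      bindings:
        process-in-0:
          consumer:
            use-native-decoding: true     # let the Kafka deserializer chain below do the work
      kafka:
        bindings:
          process-in-0:
            consumer:
              configuration:
                value.deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
                spring.deserializer.value.delegate.class: io.confluent.kafka.serializers.KafkaAvroDeserializer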
spring-cloud-stream-binder-kafka 3.0.9.RELEASE
Would it be possible to disable the provisioning phase during application start-up?
We have some problems when starting applications with several Kafka producers configured, in which during the…
I have the following configuration with custom Kafka binder configurations
spring.cloud.stream:
  bindings:
    inputBinding1:
      binder: kafka1
      destination: destination1
    inputBinding2:
      binder: kafka2
      destination:…
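The binder definitions referenced above (kafka1, kafka2) are normally declared under spring.cloud.stream.binders; a sketch with placeholder broker addresses:

spring:
  cloud:
    stream:
      binders:
        kafka1:
          type: kafka
          environment:
            spring:
              cloud:
                stream:
                  kafka:
                    binder:
                      brokers: broker-one:9092     # placeholder broker address
        kafka2:
          type: kafka
          environment:
            spring:
              cloud:
                stream:
                  kafka:
                    binder:
                      brokers: broker-two:9092     # placeholder broker address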
I have some issues after the old imperative (annotation-based) programming model was deprecated.
I have two microservices (one as publisher and the other as subscriber) and in the old way, with the annotation @StreamListener(target = "events", condition =…
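In the functional model there is no direct equivalent of the condition attribute; a common workaround is to apply the check inside the Consumer itself (or to use Spring Cloud Function routing via spring.cloud.function.routing-expression). A sketch in which the eventType header is a hypothetical stand-in for whatever the original condition inspected:

import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.Message;

@Configuration
public class EventsConsumerConfig {

    // Functional replacement for @StreamListener(target = "events", condition = "..."):
    // the condition is evaluated manually inside the consumer.
    @Bean
    public Consumer<Message<String>> events() {
        return message -> {
            Object type = message.getHeaders().get("eventType");   // placeholder header used by the old condition
            if ("created".equals(type)) {
                // handle the matching event here
            }
            // non-matching messages are simply ignored (and still acknowledged)
        };
    }
}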
I am trying to consume batches in Kafka, and I found that the documentation says retry is not supported, as follows.
Retry within the binder is not supported when using batch mode, so maxAttempts will be overridden to 1. You can configure a…
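A sketch of that kind of container-level configuration, using a ListenerContainerCustomizer to install a batch error handler that re-seeks and redelivers the failed batch after a delay; the handler choice and backoff values are assumptions for this spring-kafka generation:

import org.springframework.cloud.stream.config.ListenerContainerCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.listener.AbstractMessageListenerContainer;
import org.springframework.kafka.listener.SeekToCurrentBatchErrorHandler;
import org.springframework.util.backoff.FixedBackOff;

@Configuration
public class BatchRetryConfig {

    // Stands in for binder-level retry (disabled in batch mode): on failure the
    // handler seeks back and the whole batch is redelivered after the backoff.
    @Bean
    public ListenerContainerCustomizer<AbstractMessageListenerContainer<?, ?>> containerCustomizer() {
        return (container, destinationName, group) -> {
            SeekToCurrentBatchErrorHandler handler = new SeekToCurrentBatchErrorHandler();
            handler.setBackOff(new FixedBackOff(2000L, FixedBackOff.UNLIMITED_ATTEMPTS));
            container.setBatchErrorHandler(handler);
        };
    }
}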
I want to manually commit the offset using Spring Cloud Stream, but only when the message processing is successful.
Here is my code (application.yml and the handler class):
public void process(Message<?> message) {
…
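A sketch of the manual-acknowledgment variant of such a handler: with autoCommitOffset set to false on the consumer binding, the binder adds an Acknowledgment header, and the offset is committed only when acknowledge() is called:

import org.springframework.kafka.support.Acknowledgment;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;

public class ManualAckHandler {

    public void process(Message<?> message) {
        // Header is present only when autoCommitOffset=false on the consumer binding
        Acknowledgment ack = message.getHeaders()
                .get(KafkaHeaders.ACKNOWLEDGMENT, Acknowledgment.class);

        // ... business logic; if it throws, acknowledge() is never reached
        // and the offset is not committed ...

        if (ack != null) {
            ack.acknowledge();   // commit the offset only after successful processing
        }
    }
}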
I am trying to do a simple POC with Kafka Streams. However, I am getting an exception while starting the application. I am using Spring Kafka and Kafka Streams 2.5.1 with Spring Boot 2.3.5.
Kafka Streams configuration
@Configuration
public class…
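For comparison, a minimal Kafka Streams configuration that usually boots cleanly with Spring Kafka looks like the sketch below; the application id and broker address are placeholders, and the bean name must be the default streams config bean name that Spring Kafka looks up:

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafkaStreams;
import org.springframework.kafka.annotation.KafkaStreamsDefaultConfiguration;
import org.springframework.kafka.config.KafkaStreamsConfiguration;

@Configuration
@EnableKafkaStreams
public class KafkaStreamsConfig {

    // Registering the configuration under this exact bean name lets Spring Kafka
    // bootstrap the StreamsBuilder; a missing or misnamed bean is a common startup failure.
    @Bean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_CONFIG_BEAN_NAME)
    public KafkaStreamsConfiguration kStreamsConfig() {
        Map<String, Object> props = new HashMap<>();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "poc-streams-app");      // placeholder id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // placeholder broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        return new KafkaStreamsConfiguration(props);
    }
}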