I have a Spring Cloud Stream project using the Kafka binder, and I want to add retry functionality. I am trying to use RetryTemplate and specify certain exceptions that I want to handle, but because any exception is wrapped by…
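A common approach is to supply a custom RetryTemplate to the binder via @StreamRetryTemplate; a minimal sketch, assuming IllegalStateException is the exception to retry (a hypothetical choice). The traverseCauses flag lets SimpleRetryPolicy classify the root cause rather than the wrapper:

import java.util.Map;
import org.springframework.cloud.stream.annotation.StreamRetryTemplate;
import org.springframework.context.annotation.Configuration;
import org.springframework.retry.policy.SimpleRetryPolicy;
import org.springframework.retry.support.RetryTemplate;

@Configuration
public class RetryConfig {
    // Registered in place of the binder's default retry template.
    @StreamRetryTemplate
    public RetryTemplate myRetryTemplate() {
        RetryTemplate template = new RetryTemplate();
        // 3 attempts; traverseCauses = true unwraps wrapper exceptions
        // (e.g. MessagingException) before matching the retryable map.
        template.setRetryPolicy(new SimpleRetryPolicy(
                3, Map.of(IllegalStateException.class, true), true));
        return template;
    }
}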
I have a requirement where I need to create and delete Kafka topics programmatically. org.springframework.cloud.stream.binding.BinderAwareChannelResolver.resolveDestination(String channelName) can create a topic, but this is deprecated.
Also,…
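A non-deprecated route is Kafka's own AdminClient; a minimal sketch, assuming a broker at localhost:9092 and a hypothetical topic name:

import java.util.List;
import java.util.Map;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class TopicAdmin {
    public static void main(String[] args) throws Exception {
        try (AdminClient admin = AdminClient.create(
                Map.of(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"))) {
            // Create a topic with 3 partitions and replication factor 1.
            admin.createTopics(List.of(new NewTopic("demo-topic", 3, (short) 1))).all().get();
            // Delete it again.
            admin.deleteTopics(List.of("demo-topic")).all().get();
        }
    }
}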
We're using SubscribableChannel, MessageChannel, and PollableMessageSource, configured using @EnableBinding and @StreamListener. Now we need to migrate to the functional approach. SubscribableChannel and MessageChannel can be converted using Consumer…
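For the channel-based pieces, a sketch of the functional equivalents (bean names and payload types are assumptions); a PollableMessageSource can be retained in the functional model through the spring.cloud.stream.pollable-source property:

import java.util.function.Consumer;
import java.util.function.Function;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FunctionalBindings {
    // Replaces an @StreamListener method on a SubscribableChannel.
    @Bean
    public Consumer<String> sink() {
        return payload -> System.out.println("received: " + payload);
    }

    // Replaces a processor that read from one MessageChannel and wrote to another.
    @Bean
    public Function<String, String> process() {
        return String::toUpperCase;
    }
}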
I'm practicing with spring-cloud-stream and its Kafka binding in the following application:
spring:
  cloud:
    function:
      definition: single;react
    stream:
      bindings:
        single-in-0:
          destination: single-in
…
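For reference, a sketch of the two function beans this definition would bind; the payload types and the reactive nature of react are assumptions:

import java.util.function.Function;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import reactor.core.publisher.Flux;

@Configuration
public class Functions {
    // Bound to single-in-0 / single-out-0.
    @Bean
    public Function<String, String> single() {
        return String::toUpperCase;
    }

    // Bound to react-in-0 / react-out-0.
    @Bean
    public Function<Flux<String>, Flux<String>> react() {
        return flux -> flux.map(String::toLowerCase);
    }
}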
I have followed the documentation below, and I have a producer and consumer working perfectly fine with a Kinesis stream. I would like to understand how to handle errors in the producer (Source) and consumer (Processor) in case of any exception…
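One binder-agnostic option in Spring Cloud Stream is to subscribe to the global error channel; a sketch (the binding-specific channel, named destination.group.errors, is also available):

import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.messaging.support.ErrorMessage;
import org.springframework.stereotype.Component;

@Component
public class StreamErrorHandler {
    // Receives failures from producer and consumer bindings
    // that are routed to the global error channel.
    @ServiceActivator(inputChannel = "errorChannel")
    public void handle(ErrorMessage message) {
        Throwable cause = message.getPayload();
        System.err.println("Stream error: " + cause.getMessage());
    }
}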
I'm trying to integrate a library based on Spring Cloud Stream with Kafka into a non-Spring application.
When this library is loaded within another Spring application, everything works.
When I try to initialize the application context with…
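A sketch of bootstrapping the library's context from plain Java, assuming LibraryConfig is the library's configuration entry point (a hypothetical name):

import org.springframework.boot.WebApplicationType;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.context.ConfigurableApplicationContext;

public class Host {
    public static void main(String[] args) {
        // Start a non-web Spring context from a plain Java entry point.
        // LibraryConfig is a hypothetical configuration class.
        ConfigurableApplicationContext ctx =
                new SpringApplicationBuilder(LibraryConfig.class)
                        .web(WebApplicationType.NONE)
                        .run(args);
        // Beans from the library can now be fetched via ctx.getBean(...).
    }
}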
I have a Spring Integration flow bean (configured via the Java DSL) which processes messages from a Kafka message channel bound with Spring Cloud Stream.
The source of the Kafka messages is an external application, so what I really want to…
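A sketch of such a flow, assuming the Cloud Stream input channel is named "input" and carries String payloads:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
public class FlowConfig {
    // Consumes from the channel that the binder binds to the Kafka topic.
    @Bean
    public IntegrationFlow kafkaFlow() {
        return IntegrationFlows.from("input")
                .<String, String>transform(String::toUpperCase)
                .handle(m -> System.out.println(m.getPayload()))
                .get();
    }
}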
I have a service based on WebFlux that will consume and then produce messages from a Kafka topic.
My code is just like this:
@Bean
public Function<Flux<String>, Flux<String>> reactiveUpperCase() {
return flux -> flux.map(val ->…
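Completed for reference, a sketch under the assumption of String payloads:

import java.util.function.Function;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import reactor.core.publisher.Flux;

@Configuration
public class ReactiveConfig {
    // Maps each incoming value to upper case and emits it downstream.
    @Bean
    public Function<Flux<String>, Flux<String>> reactiveUpperCase() {
        return flux -> flux.map(String::toUpperCase);
    }
}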
I am working on a Kafka Streams application built with Spring Cloud Stream. In this application I need to:
Consume a continuous stream of messages that can be retrieved at a later time.
Persist a list of the message IDs matching some criteria.
In a…
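A sketch of the consuming side with the Kafka Streams binder, assuming <String, String> records; the filter predicate and the persistence call are placeholders:

import java.util.function.Consumer;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class StreamTopology {
    // Filters records matching some criteria and records their IDs (keys).
    @Bean
    public Consumer<KStream<String, String>> persistIds() {
        return stream -> stream
                .filter((key, value) -> value.contains("criteria")) // placeholder predicate
                .foreach((key, value) -> System.out.println("matched id: " + key)); // persist here
    }
}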
I have been working on creating a simple custom processor in Scala for Spring Cloud Data Flow and have been running into issues with sending/receiving data from/to starter applications. I have been unable to see any messages propagating through the…
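For comparison, a minimal processor in the functional style (sketched here in Java; the payload type is an assumption) of the kind SCDF registers and wires between starter apps:

import java.util.function.Function;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class ProcessorApp {
    public static void main(String[] args) {
        SpringApplication.run(ProcessorApp.class, args);
    }

    // SCDF binds processor-in-0 / processor-out-0 to the stream's topics.
    @Bean
    public Function<String, String> processor() {
        return payload -> payload.trim();
    }
}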
I am using Spring Cloud Stream's DLQ feature with the Kafka binder. When message processing fails, the message is sent to the DLQ as expected; however, I want to be able to modify the message being sent to the DLQ to include some extra diagnostic…
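One way to do this is with spring-kafka's DeadLetterPublishingRecoverer, whose setHeadersFunction adds headers to the outgoing DLQ record (the KafkaOperations wiring and the header name are assumptions):

import org.apache.kafka.common.header.Headers;
import org.apache.kafka.common.header.internals.RecordHeader;
import org.apache.kafka.common.header.internals.RecordHeaders;
import org.springframework.kafka.core.KafkaOperations;
import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;

public class DlqSetup {
    // Adds a diagnostic header naming the exception class to each DLQ record.
    public DeadLetterPublishingRecoverer recoverer(KafkaOperations<Object, Object> template) {
        DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(template);
        recoverer.setHeadersFunction((consumerRecord, exception) -> {
            Headers headers = new RecordHeaders();
            headers.add(new RecordHeader("x-exception-class",
                    exception.getClass().getName().getBytes()));
            return headers;
        });
        return recoverer;
    }
}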
I am writing a Kafka Streams application using the Spring Cloud Stream Kafka Streams binder.
When the application publishes a message to an output topic, there may be an error such as a serialization or network error.
In this code -
@Bean
public…
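For send-side failures specifically, Kafka Streams offers a production exception handler; a sketch that logs and keeps the topology running (registration goes through the binder's default.production.exception.handler configuration property):

import java.util.Map;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.streams.errors.ProductionExceptionHandler;

// Decides whether the topology keeps running after a failed send.
public class LogAndContinueSendHandler implements ProductionExceptionHandler {

    @Override
    public ProductionExceptionHandlerResponse handle(ProducerRecord<byte[], byte[]> record,
                                                     Exception exception) {
        System.err.println("Failed to send record with key " + record.key() + ": " + exception);
        return ProductionExceptionHandlerResponse.CONTINUE;
    }

    @Override
    public void configure(Map<String, ?> configs) {
        // no configuration needed
    }
}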
Consider this scenario: a Kafka topic with 6 partitions, and a Spring Java Kafka consumer application with 6 replicas so that each replica deals with one of the partitions.
The problem I'm facing is that the processing of each message in the consumer takes a…
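When each record takes long to process, the usual levers are the consumer's max.poll.interval.ms and max.poll.records settings; container-level tuning can also be applied with the binder's ListenerContainerCustomizer, sketched here (the poll timeout value is illustrative):

import org.springframework.cloud.stream.config.ListenerContainerCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.listener.AbstractMessageListenerContainer;

@Configuration
public class ContainerTuning {
    // Invoked for every consumer binding the Kafka binder creates.
    @Bean
    public ListenerContainerCustomizer<AbstractMessageListenerContainer<?, ?>> customizer() {
        return (container, destinationName, group) ->
                container.getContainerProperties().setPollTimeout(5_000L);
    }
}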
We have a Spring Cloud Stream listener that operates in BATCH mode. The processing time for each batch is about 3 ms. Following is our configuration:
allow.auto.create.topics = true
auto.commit.interval.ms = 100
auto.offset.reset =…
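For context, a sketch of what a batch-mode functional listener looks like (the payload type is an assumption; batch mode itself is switched on with spring.cloud.stream.bindings.<binding>.consumer.batch-mode=true):

import java.util.List;
import java.util.function.Consumer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BatchListener {
    // Receives the whole poll result as one list instead of record-by-record.
    @Bean
    public Consumer<List<String>> batchIn() {
        return batch -> System.out.println("processing batch of " + batch.size());
    }
}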
I have a Spring Cloud application, using Spring's reactive core, listening to two topics, each with 10 partitions.
In the consumer I am simply reading the message and printing the topic, partition, and offset. Some messages are not getting read.
I…
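A sketch of such a consumer, assuming String payloads and spring-kafka's standard header names; note that with a reactive Consumer<Flux<...>> the application, not the framework, must subscribe:

import java.util.function.Consumer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;
import reactor.core.publisher.Flux;

@Configuration
public class ReactiveConsumer {
    @Bean
    public Consumer<Flux<Message<String>>> logger() {
        return flux -> flux
                .doOnNext(m -> System.out.printf("topic=%s partition=%s offset=%s%n",
                        m.getHeaders().get(KafkaHeaders.RECEIVED_TOPIC),
                        // RECEIVED_PARTITION_ID in older spring-kafka versions
                        m.getHeaders().get(KafkaHeaders.RECEIVED_PARTITION),
                        m.getHeaders().get(KafkaHeaders.OFFSET)))
                .subscribe(); // a Consumer<Flux> is not subscribed to by the framework
    }
}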