Questions tagged [spring-cloud-stream-binder-kafka]

An Apache Kafka binder for Spring Cloud Stream.

Documentation: https://cloud.spring.io/spring-cloud-stream-binder-kafka/spring-cloud-stream-binder-kafka.html

472 questions
0 votes · 1 answer

Spring Cloud Stream and Spring RetryTemplate handling of nested exception

I have a Spring Cloud Stream project using the Kafka binder and I want to add retry functionality. I am trying to use RetryTemplate and specify certain exceptions that I want to handle, but because every exception is wrapped by…
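The usual answer here is Spring Retry's cause traversal: `SimpleRetryPolicy` has a constructor that takes a `traverseCauses` flag, so a retryable exception wrapped in a `MessagingException` is still matched. A minimal pure-Java sketch of that matching idea (the class and method names below are illustrative stand-ins, not Spring Retry's actual internals):

```java
import java.util.Set;

public class CauseTraversal {
    // Walk the cause chain so a retryable exception wrapped in another
    // exception is still recognized (what traverseCauses=true enables).
    static boolean isRetryable(Throwable t, Set<Class<? extends Throwable>> retryable) {
        for (Throwable cur = t; cur != null; cur = cur.getCause()) {
            for (Class<? extends Throwable> c : retryable) {
                if (c.isInstance(cur)) {
                    return true; // retryable cause found somewhere in the chain
                }
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // An IllegalStateException wrapped the way the binder wraps listener exceptions
        RuntimeException wrapped = new RuntimeException("wrapper",
                new IllegalStateException("retry me"));
        System.out.println(isRetryable(wrapped, Set.of(IllegalStateException.class))); // true
        System.out.println(isRetryable(new RuntimeException("no cause"),
                Set.of(IllegalStateException.class))); // false
    }
}
```

With Spring Retry itself, the equivalent would be constructing `new SimpleRetryPolicy(maxAttempts, retryableExceptions, true)` so the policy inspects causes rather than only the top-level exception.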
0 votes · 1 answer

Spring cloud stream create and delete topics programmatically

I have a requirement where I need to create and delete Kafka topics programmatically. org.springframework.cloud.stream.binding.BinderAwareChannelResolver.resolveDestination(String channelName) can create a topic, but it is deprecated. Also,…
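Since BinderAwareChannelResolver is deprecated, one option is to bypass the binder entirely and use the plain kafka-clients AdminClient. A sketch (the broker address and topic name are placeholders; requires a running broker and the kafka-clients dependency):

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class TopicAdmin {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        try (AdminClient admin = AdminClient.create(props)) {
            // Create a topic with 3 partitions and replication factor 1
            admin.createTopics(List.of(new NewTopic("my-topic", 3, (short) 1)))
                 .all().get();
            // Delete it again
            admin.deleteTopics(List.of("my-topic")).all().get();
        }
    }
}
```

In a Spring Boot application, declaring `NewTopic` beans alongside spring-kafka's `KafkaAdmin` achieves topic creation declaratively, but AdminClient is the direct route for deletion.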
0 votes · 1 answer

Is there any way to poll messages from topic using Spring Cloud Function approach?

We're using SubscribableChannel, MessageChannel and PollableMessageSource, configured with @EnableBinding and @StreamListener. Now we need to migrate to the functional approach. SubscribableChannel and MessageChannel can be converted using Consumer…
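For the PollableMessageSource part, recent Spring Cloud Stream versions support an explicitly polled binding in the functional model through the `spring.cloud.stream.pollable-source` property; check the docs for your exact version. A hedged sketch (binding and destination names are placeholders):

```java
// application.properties (assuming pollable-source support in your version):
//   spring.cloud.stream.pollable-source=myDestination
//   spring.cloud.stream.bindings.myDestination-in-0.destination=some-topic
import org.springframework.cloud.stream.binder.PollableMessageSource;

public class Poller {
    private final PollableMessageSource source;

    public Poller(PollableMessageSource source) {
        this.source = source;
    }

    // Call on your own schedule; returns true if a message was received and handled
    public boolean pollOnce() {
        return source.poll(message ->
                System.out.println("received: " + message.getPayload()));
    }
}
```

The autowired PollableMessageSource keeps the pull-based semantics while the rest of the bindings move to Supplier/Function/Consumer beans.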
0 votes · 1 answer

Spring cloud stream all messages sent to DLQ

I'm practicing with spring-cloud-stream and the Kafka binder with the following application: spring: cloud: function: definition: single;react stream: bindings: single-in-0: destination: single-in …
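For reference, the binder-level DLQ switches look like this; note that `enableDlq` requires a consumer group, and the default DLQ topic name is `error.<destination>.<group>` (names below are placeholders, values illustrative):

```yaml
spring:
  cloud:
    stream:
      bindings:
        single-in-0:
          destination: single-in
          group: my-group            # a consumer group is required for a DLQ
      kafka:
        bindings:
          single-in-0:
            consumer:
              enableDlq: true
              dlqName: single-in-dlq # optional; defaults to error.<destination>.<group>
```

If every message lands in the DLQ, the listener is likely throwing on every record and exhausting the default retries, so the exception being thrown is the first thing to inspect.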
0 votes · 1 answer

Spring Cloud Kinesis Binder: how to handle errors for producer and consumer (the documented approach is not working)

I have followed the documentation below and I have a producer and consumer working perfectly fine with a Kinesis stream. I would like to understand how to handle errors in the producer (Source) and consumer (Processor) in case of any exception…
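In Spring Cloud Stream generally, consumer errors for a binding are published to a channel named `<destination>.<group>.errors`, and producer-side async errors reach the global `errorChannel` when the producer property `errorChannelEnabled=true` is set. A sketch of subscribing to both ("myStream" and "myGroup" are placeholders; verify the channel naming against your binder version):

```java
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.messaging.support.ErrorMessage;
import org.springframework.stereotype.Component;

@Component
public class BinderErrorHandlers {

    // Consumer-side errors for one specific binding: <destination>.<group>.errors
    @ServiceActivator(inputChannel = "myStream.myGroup.errors")
    public void handleConsumerError(ErrorMessage error) {
        System.err.println("consume failed: " + error.getPayload());
    }

    // Producer-side async send errors land on the global errorChannel
    // when errorChannelEnabled=true is set on the producer binding
    @ServiceActivator(inputChannel = "errorChannel")
    public void handleProducerError(ErrorMessage error) {
        System.err.println("send failed: " + error.getPayload());
    }
}
```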
0 votes · 1 answer

spring-boot cloud stream with Kafka as a standalone library

I'm trying to integrate a library based on Spring Boot Cloud Stream with Kafka into a non-Spring application. When this library is loaded within another Spring application everything works. When I try to initialize the application context with…
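A common pitfall here is that spring-cloud-stream relies on Boot auto-configuration to bootstrap the binder, so a plain AnnotationConfigApplicationContext is not enough; starting a full (non-web) Boot context from the host application usually works. A sketch, where `LibraryStreamConfig` is a hypothetical stand-in for the library's Boot configuration class:

```java
import org.springframework.boot.WebApplicationType;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.context.ConfigurableApplicationContext;

@SpringBootApplication
class LibraryStreamConfig { }  // placeholder for the library's configuration

public class EmbeddedStreamBootstrap {
    // Called from the non-Spring host application to bring up the binder
    public static ConfigurableApplicationContext start(String... args) {
        return new SpringApplicationBuilder(LibraryStreamConfig.class)
                .web(WebApplicationType.NONE)  // no embedded web server
                .run(args);
    }
}
```

The returned context can be kept by the host application and closed on shutdown to stop the binder cleanly.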
0 votes · 1 answer

Spring Integration Kafka MessageChannel thread?

I have a Spring Integration flow bean (configured via the Java DSL) which processes messages from a Kafka message channel bound with Spring Cloud Stream. The source of the Kafka messages is an external application, so what I really want to…
0 votes · 1 answer

How to increase the topic consumer throughput by using `Function<Flux<String>, Flux<String>>`?

I have a service based on WebFlux that consumes from and then produces to a Kafka topic. My code is just like this: @Bean public Function<Flux<String>, Flux<String>> reactiveUpperCase() { return flux -> flux.map(val ->…
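If the per-element work is the bottleneck, throughput within one stream can be raised by processing elements concurrently, e.g. `flatMap` with a concurrency hint onto a worker scheduler. Note that ordering is no longer preserved. A sketch using plain Reactor (the function name and concurrency value are illustrative):

```java
import java.util.function.Function;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;

public class ReactiveUpperCase {
    public static Function<Flux<String>, Flux<String>> reactiveUpperCase() {
        return flux -> flux.flatMap(
                val -> Mono.fromCallable(val::toUpperCase)
                           .subscribeOn(Schedulers.boundedElastic()),
                8);  // process up to 8 elements concurrently; order not preserved
    }
}
```

Beyond that, throughput across the topic scales with partitions: more partitions plus more application instances (or a higher consumer `concurrency` setting) lets Kafka parallelize at the partition level.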
0 votes · 1 answer

Kafka Streams and writing to the state store

I am working on a Kafka Streams application built with Spring Cloud Stream. In this application I need to:
  • Consume a continuous stream of messages that can be retrieved at a later time.
  • Persist a list of the message IDs matching some criteria. In a…
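With the Kafka Streams binder, declaring a StoreBuilder bean registers the state store with the topology, and processing code can then access it by name. A sketch (the store name "matched-ids" and serdes are placeholder choices):

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.state.KeyValueStore;
import org.apache.kafka.streams.state.StoreBuilder;
import org.apache.kafka.streams.state.Stores;
import org.springframework.context.annotation.Bean;

public class StateStoreConfig {
    // The Kafka Streams binder detects StoreBuilder beans and adds the store
    // to the topology automatically
    @Bean
    public StoreBuilder<KeyValueStore<String, String>> matchedIdsStore() {
        return Stores.keyValueStoreBuilder(
                Stores.persistentKeyValueStore("matched-ids"),
                Serdes.String(),
                Serdes.String());
    }
}
```

Inside the stream, the store is reachable from a transformer or processor attached with the store name, e.g. `stream.transformValues(MyTransformer::new, "matched-ids")`, and then `context.getStateStore("matched-ids")` in `init()`.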
Boon · 1,073
0 votes · 2 answers

Spring Cloud Data Flow Custom Scala Processor unable to send/receive data from Starter Apps (SCDF 2.5.1 & Spring Boot 2.2.6)

I have been working on creating a simple custom processor in Scala for Spring Cloud Data Flow and have been running into issues with sending/receiving data from/to starter applications. I have been unable to see any messages propagating through the…
0 votes · 1 answer

Spring Cloud Stream - modify DLQ messages

I am using Spring Cloud Stream's DLQ feature with the Kafka binder. When message processing fails, the message is sent to the DLQ as expected; however, I want to be able to modify the message being sent to the DLQ to include some extra diagnostic…
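One approach is to disable the binder's automatic DLQ and publish to the DLQ topic yourself via StreamBridge, which gives full control over headers and payload. A sketch ("myDlq-out-0" and the header names are placeholders):

```java
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.stereotype.Component;

@Component
public class ManualDlqPublisher {
    private final StreamBridge streamBridge;

    public ManualDlqPublisher(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    // Called from the listener's catch block; "myDlq-out-0" must be an
    // output binding mapped to the DLQ topic in configuration
    public void sendToDlq(Message<?> failed, Exception cause) {
        Message<?> enriched = MessageBuilder.fromMessage(failed)
                .setHeader("x-exception-message", cause.getMessage())
                .setHeader("x-failed-at", System.currentTimeMillis())
                .build();
        streamBridge.send("myDlq-out-0", enriched);
    }
}
```

Alternatively, if the error path goes through spring-kafka's DeadLetterPublishingRecoverer, its `setHeadersFunction(...)` hook can add headers without replacing the DLQ mechanism; which path applies depends on the binder version in use.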
Boon · 1,073
0 votes · 1 answer

How to handle serialization errors in the Spring Cloud Stream Kafka Streams binder?

I am writing a Kafka Streams application using the Spring Cloud Stream Kafka Streams binder. When the application publishes a message to an output topic, there may be an error such as a serialization or network error. In this code - @Bean public…
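Errors on the produce side of a Kafka Streams topology are routed through the `default.production.exception.handler` Streams property. A sketch of a handler that logs and keeps the stream alive (the class name is a placeholder; the exact handler interface has evolved across Kafka versions, so check yours):

```java
import java.util.Map;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.streams.errors.ProductionExceptionHandler;

// Registered via the Kafka Streams property, e.g. in binder configuration:
//   default.production.exception.handler=com.example.LogAndContinueProductionHandler
public class LogAndContinueProductionHandler implements ProductionExceptionHandler {

    @Override
    public ProductionExceptionHandlerResponse handle(ProducerRecord<byte[], byte[]> record,
                                                     Exception exception) {
        System.err.println("send to " + record.topic() + " failed: " + exception);
        return ProductionExceptionHandlerResponse.CONTINUE; // skip the record, keep streaming
    }

    @Override
    public void configure(Map<String, ?> configs) { }
}
```

Returning `FAIL` instead would stop the stream thread, which is the safer default when dropping records is unacceptable.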
0 votes · 1 answer

Strategy for maximum throughput having 6 Kafka Consumers when the processing of each message requires a long time

Consider this scenario: a Kafka topic with 6 partitions, and a Spring Java Kafka consumer application with 6 replicas so that each of them deals with one of the partitions. The problem I'm facing is that the processing of each message in the consumer takes a…
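When per-message processing is slow, the usual levers are fetching fewer records per poll and giving the consumer more time between polls before the broker deems it dead and rebalances. A sketch of the relevant Kafka consumer properties passed through the binder (binding name and values are illustrative):

```yaml
spring:
  cloud:
    stream:
      kafka:
        bindings:
          process-in-0:
            consumer:
              configuration:
                max.poll.records: 10          # fewer records per poll()
                max.poll.interval.ms: 600000  # allow up to 10 min between polls
```

If that is still not enough, the next step is adding partitions so more than 6 consumers can share the load, since Kafka parallelism is capped by the partition count.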
0 votes · 1 answer

Spring Stream Listener Polling Stuck

We have a Spring Cloud Stream listener that operates in batch mode. The processing time for each batch is about 3 ms. The following is our configuration: allow.auto.create.topics = true auto.commit.interval.ms = 100 auto.offset.reset =…
0 votes · 1 answer

Messages getting skipped in reactive Spring Cloud Stream Kafka

I have a Spring Cloud application using Spring's reactive core, listening to two topics, each with 10 partitions. In the consumer I am simply reading the message and printing the topic, partition and offset. Some messages are not getting read. I…