I am wondering if there is a way to enable a stateful RetryTemplate when using spring-cloud-stream-binder-kafka.
I noticed that there is a constructor
RetryingMessageListenerAdapter(MessageListener messageListener, RetryTemplate retryTemplate,…
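For context, a minimal sketch of the kind of wiring being asked about, built from spring-retry and spring-kafka classes; the delegate listener and the recovery callback are placeholders, and treating the four-argument constructor's final flag as the 'stateful' switch is an assumption:

import org.springframework.kafka.listener.MessageListener;
import org.springframework.kafka.listener.adapter.RetryingMessageListenerAdapter;
import org.springframework.retry.RecoveryCallback;
import org.springframework.retry.backoff.FixedBackOffPolicy;
import org.springframework.retry.policy.SimpleRetryPolicy;
import org.springframework.retry.support.RetryTemplate;

public class StatefulRetryWiring {

    // Build a RetryTemplate: 3 attempts with a 2-second fixed back-off.
    static RetryTemplate retryTemplate() {
        RetryTemplate template = new RetryTemplate();
        template.setRetryPolicy(new SimpleRetryPolicy(3));
        FixedBackOffPolicy backOff = new FixedBackOffPolicy();
        backOff.setBackOffPeriod(2000L);
        template.setBackOffPolicy(backOff);
        return template;
    }

    // Wrap an existing listener; the trailing boolean is assumed to be the
    // 'stateful' flag of the constructor mentioned above.
    static MessageListener<String, String> statefulAdapter(MessageListener<String, String> delegate) {
        RecoveryCallback<Object> recovery = context -> {
            // placeholder: log and drop, or publish to a dead-letter topic
            return null;
        };
        return new RetryingMessageListenerAdapter<>(delegate, retryTemplate(), recovery, true);
    }
}
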
The main purpose is to read a stream from a topic, apply some transformations and then send two events to other topics. For that we are using the KStream.branch() function and functional-style programming. The code is:
Input…
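A minimal functional-style sketch of this kind of branching, assuming String keys and values; the transformation and the predicates are placeholders, and with a bean named process the two outputs map to the bindings process-out-0 and process-out-1:

import java.util.function.Function;

import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BranchingTopology {

    // One input stream is split into two output streams via branch().
    // (Newer Kafka Streams versions replace branch() with split().)
    @Bean
    public Function<KStream<String, String>, KStream<String, String>[]> process() {
        return input -> input
                .mapValues(String::toUpperCase)                 // placeholder transformation
                .branch((key, value) -> value.startsWith("A"),  // first event type
                        (key, value) -> true);                  // everything else
    }
}
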
I'm struggling to understand how I should go about testing an application that uses the Kafka binder together with Spring Cloud Function.
Let's use this very simple example:
@SpringBootApplication
public class DemoKafkaApplication {
    public…
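For what it's worth, a minimal sketch of one way such applications are often tested, swapping the Kafka binder for Spring Cloud Stream's test binder; the nested configuration and the upperCase function are assumptions standing in for the real application:

import static org.assertj.core.api.Assertions.assertThat;

import java.util.function.Function;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.cloud.stream.binder.test.InputDestination;
import org.springframework.cloud.stream.binder.test.OutputDestination;
import org.springframework.cloud.stream.binder.test.TestChannelBinderConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Import;
import org.springframework.messaging.support.GenericMessage;

@SpringBootTest(classes = DemoKafkaApplicationTests.TestApp.class)
class DemoKafkaApplicationTests {

    @Autowired
    private InputDestination input;

    @Autowired
    private OutputDestination output;

    @Test
    void functionIsWiredToBindings() {
        // Send to the single input binding and assert on the single output binding.
        input.send(new GenericMessage<>("hello".getBytes()));
        assertThat(output.receive().getPayload()).isEqualTo("HELLO".getBytes());
    }

    @SpringBootApplication
    @Import(TestChannelBinderConfiguration.class)
    static class TestApp {

        // Placeholder for whatever function DemoKafkaApplication actually defines.
        @Bean
        Function<String, String> upperCase() {
            return String::toUpperCase;
        }
    }
}
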
I am working on a Spring Cloud Stream Apache Kafka example. I am developing code with reference to https://www.youtube.com/watch?v=YPDzcmqwCNo.
org.springframework.messaging.MessageDeliveryException: failed to send Message to channel 'pvout';…
I am using spring-cloud-stream with the Kafka binder to consume messages from Kafka. The application is basically consuming messages from Kafka and updating a database.
There are scenarios when the DB is down (which might last for hours) or some other…
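A minimal sketch of one commonly suggested direction, assuming Spring Cloud Stream 3.x where a @StreamRetryTemplate bean replaces the binder's default retry behaviour; the attempt count and back-off values are placeholders:

import org.springframework.cloud.stream.annotation.StreamRetryTemplate;
import org.springframework.context.annotation.Configuration;
import org.springframework.retry.backoff.ExponentialBackOffPolicy;
import org.springframework.retry.policy.SimpleRetryPolicy;
import org.springframework.retry.support.RetryTemplate;

@Configuration
public class RetryConfig {

    // Many attempts with an exponential back-off capped at 5 minutes, so a
    // long DB outage is ridden out instead of the message being given up on
    // after the default 3 tries.
    @StreamRetryTemplate
    public RetryTemplate dbOutageRetryTemplate() {
        RetryTemplate template = new RetryTemplate();
        template.setRetryPolicy(new SimpleRetryPolicy(100));   // placeholder attempt count

        ExponentialBackOffPolicy backOff = new ExponentialBackOffPolicy();
        backOff.setInitialInterval(1_000L);
        backOff.setMultiplier(2.0);
        backOff.setMaxInterval(300_000L);                      // cap at 5 minutes
        template.setBackOffPolicy(backOff);
        return template;
    }
}
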
We have a scenario where our application (Spring Boot, spring-cloud-stream based) listens to multiple Kafka topics (TOPIC_A with 3 partitions, TOPIC_B with 1 partition, TOPIC_C with 10 partitions), i.e. 3 @StreamListener methods.
…
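A sketch of the shape being described, using the (now legacy) annotation model; the binding names and payload types are placeholders:

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.messaging.SubscribableChannel;

interface MultiTopicBindings {
    @Input("topicA") SubscribableChannel topicA();   // TOPIC_A, 3 partitions
    @Input("topicB") SubscribableChannel topicB();   // TOPIC_B, 1 partition
    @Input("topicC") SubscribableChannel topicC();   // TOPIC_C, 10 partitions
}

@EnableBinding(MultiTopicBindings.class)
class MultiTopicListeners {

    @StreamListener("topicA")
    public void onTopicA(String payload) { /* handle TOPIC_A */ }

    @StreamListener("topicB")
    public void onTopicB(String payload) { /* handle TOPIC_B */ }

    @StreamListener("topicC")
    public void onTopicC(String payload) { /* handle TOPIC_C */ }
}
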
I am trying to implement a Kafka consumer and a Kafka producer within the same Spring Boot application using Spring Cloud Stream and the Kafka binder. Both run successfully if executed separately, but if executed together only the Kafka producer is able to connect successfully…
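A minimal sketch of both sides living in one application with the functional model; the bean names are placeholders, and the spring.cloud.function.definition hint in the comment is an assumption about the usual cause of only one side binding:

import java.util.function.Consumer;
import java.util.function.Supplier;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class ConsumerAndProducerApplication {

    public static void main(String[] args) {
        SpringApplication.run(ConsumerAndProducerApplication.class, args);
    }

    // Producer side: polled by the binder and published to produce-out-0.
    @Bean
    public Supplier<String> produce() {
        return () -> "tick " + System.currentTimeMillis();
    }

    // Consumer side: bound to consume-in-0.
    @Bean
    public Consumer<String> consume() {
        return payload -> System.out.println("received: " + payload);
    }

    // With more than one functional bean, both usually have to be listed, e.g.
    // spring.cloud.function.definition=produce;consume
}
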
I want to set up a transformation (consume from one topic, produce to another) using Spring Cloud Stream. I also want it to be (a minimal sketch follows this list):
reliable - let's say 'at least once'
fast producing - perform producer.flush() once per batch, not per message
fast…
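A sketch of the transformation itself in the functional model; the bean name and the properties mentioned in the comments (binder requiredAcks and native batch.size / linger.ms settings) are the usual knobs for the reliability and batching goals, but the concrete values are assumptions:

import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class TransformTopology {

    // transform-in-0  -> source topic
    // transform-out-0 -> target topic
    // At-least-once and producer batching are tuned through configuration, e.g.
    // spring.cloud.stream.kafka.binder.requiredAcks=all and native producer
    // settings such as batch.size / linger.ms via
    // spring.cloud.stream.kafka.binder.configuration.* (assumed values).
    @Bean
    public Function<String, String> transform() {
        return value -> value.trim().toUpperCase();   // placeholder transformation
    }
}
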
I have a Spring Boot Kafka Streams application with 2 topics; consider topics A and B. Topic A has 16 partitions and Topic B has 1 partition. Consider the application is deployed in 1 instance with num.stream.threads=16.
I ran kafka-consumer-groups.bat…
In our application we have multiple topics, where some topics will be created with 16 partitions and some with 1 partition. Is there any spring.cloud.stream.kafka.bindings property/option available to achieve this?
I have an event-listener Spring Boot application which performs the operation: read from an Azure Event Hub topic -> persist the event into the DB.
I used the spring-cloud-azure-eventhubs-stream-binder (version 1.2.1) Sink to listen for events from my…
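A sketch of the read-then-persist shape using the Sink interface from the annotation model; the repository and payload type are placeholders:

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;

@EnableBinding(Sink.class)
public class EventHubListener {

    private final EventRepository repository;   // hypothetical persistence layer

    public EventHubListener(EventRepository repository) {
        this.repository = repository;
    }

    // Bound to the Event Hub configured as the destination of the 'input' binding.
    @StreamListener(Sink.INPUT)
    public void handle(String payload) {
        repository.save(payload);               // persist the event into the DB
    }

    interface EventRepository {
        void save(String payload);
    }
}
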
I have a working application that uses the latest update for Producers that came with Hoxton. Now I'm trying to add some integration tests, asserting that the Producer is actually producing a message as expected. The problem is, the consumer I use…
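A sketch of one way this assertion is often done, standing up an embedded broker and a plain test consumer; the topic name, group id and the trigger for the producer are assumptions:

import static org.assertj.core.api.Assertions.assertThat;

import java.util.Map;

import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.test.EmbeddedKafkaBroker;
import org.springframework.kafka.test.context.EmbeddedKafka;
import org.springframework.kafka.test.utils.KafkaTestUtils;

@SpringBootTest(properties = "spring.cloud.stream.kafka.binder.brokers=${spring.embedded.kafka.brokers}")
@EmbeddedKafka(partitions = 1, topics = "producer-out-topic")   // assumed output topic
class ProducerIntegrationTest {

    @Autowired
    private EmbeddedKafkaBroker broker;

    @Test
    void producerPublishesExpectedMessage() {
        Map<String, Object> props = KafkaTestUtils.consumerProps("test-group", "true", broker);
        Consumer<String, String> consumer = new DefaultKafkaConsumerFactory<>(
                props, new StringDeserializer(), new StringDeserializer()).createConsumer();
        broker.consumeFromAnEmbeddedTopic(consumer, "producer-out-topic");

        // ...trigger the production here (e.g. call the service under test)...

        ConsumerRecord<String, String> record =
                KafkaTestUtils.getSingleRecord(consumer, "producer-out-topic");
        assertThat(record.value()).isEqualTo("expected payload");   // placeholder assertion
    }
}
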
I am writing the tests for a Spring Cloud Stream application. This has a KStream reading from topicA. In the test I use a KafkaTemplate to publish the messages and wait for the KStream logs to show up.
The tests throw the following…
I have the following functional style topology:
1) KStream input
2) filter
3) flatMap
4) aggregate
5) filter
6) map
7) KStream output
The input KStream comes with Sleuth headers, but they get lost in the Kafka stream.
flatMap splits one record into…
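A sketch of a topology with the shape listed above, assuming String keys and values throughout; the predicates, the split logic and the aggregation are placeholders, and the sketch does not itself address how the tracing headers are propagated:

import java.util.Arrays;
import java.util.function.Function;
import java.util.stream.Collectors;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Materialized;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class TracingTopology {

    @Bean
    public Function<KStream<String, String>, KStream<String, String>> process() {
        return input -> input
                // 2) filter
                .filter((key, value) -> value != null && !value.isEmpty())
                // 3) flatMap: split one record into several
                .flatMap((key, value) -> Arrays.stream(value.split(","))
                        .map(part -> KeyValue.pair(key, part))
                        .collect(Collectors.toList()))
                // 4) aggregate (groupByKey forces a repartition after flatMap)
                .groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
                .aggregate(() -> "",
                        (key, value, aggregate) -> aggregate + value,
                        Materialized.with(Serdes.String(), Serdes.String()))
                .toStream()
                // 5) filter
                .filter((key, aggregate) -> aggregate.length() > 0)
                // 6) map
                .map((key, aggregate) -> KeyValue.pair(key, aggregate.toUpperCase()));
    }
}
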
Trying to consume Kafka messages in batch mode using Spring Cloud Stream 3.0.
The consumer receives a list containing a single record instead of more.
Below are the yml and the consumer code used:
spring:
  cloud:
    stream:
      bindings:
        …
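For reference, a minimal sketch of the consumer side of such a setup; the binding name is a placeholder, and the properties named in the comments are the ones usually involved, with concrete values left as assumptions:

import java.util.List;
import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BatchConsumerConfig {

    // Requires spring.cloud.stream.bindings.consume-in-0.consumer.batch-mode=true.
    // How many records arrive per list is driven by what each broker fetch returns,
    // so without tuning fetch.min.bytes / fetch.max.wait.ms (and max.poll.records)
    // a fast poll loop tends to deliver lists containing a single record.
    @Bean
    public Consumer<List<String>> consume() {
        return batch -> System.out.println("received " + batch.size() + " records");
    }
}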