Questions tagged [spring-cloud-stream]

Spring Cloud Stream lets you develop and run messaging microservices built on Spring Integration, whether locally, in the cloud, or on Spring Cloud Data Flow. Just add @EnableBinding and run your app as a Spring Boot application (a single application context). You only need to connect to the physical broker, which happens automatically when the relevant binder implementation is on the classpath.

Use this tag for questions about the Spring Cloud Stream project. It is not intended for general questions about integrating other Spring projects with other technologies.

Spring Cloud Stream's Official Project Site

Spring Cloud Stream's GitHub Repo

How to contribute

Related tags: , , .

2724 questions
0
votes
1 answer

What is the expected behavior when the StreamBridge.send method returns false?

I notice a lot of usage of the StreamBridge.send method in Spring Cloud Stream apps, but none of them check its return value. What is the expected behavior when the send method returns false? Should we retry if it returns…
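One common answer is to retry a bounded number of times when the send reports failure. The sketch below is not Spring Cloud Stream API: `StreamBridge.send` is Spring-managed, so it is stood in for here by a plain `BooleanSupplier`, and the retry count and class name are illustrative assumptions.

```java
import java.util.function.BooleanSupplier;

public class RetryingSender {
    // Retries a boolean-returning send operation (e.g. StreamBridge.send,
    // which returns false when the message could not be sent) up to
    // maxAttempts times. Returns true as soon as one attempt succeeds.
    public static boolean sendWithRetry(BooleanSupplier send, int maxAttempts) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            if (send.getAsBoolean()) {
                return true;            // send succeeded
            }
            // A real app would typically back off here before retrying.
        }
        return false;                   // all attempts failed
    }

    public static void main(String[] args) {
        // Stand-in "send" that fails twice, then succeeds.
        int[] calls = {0};
        BooleanSupplier flaky = () -> ++calls[0] >= 3;
        System.out.println(sendWithRetry(flaky, 5)); // prints true (succeeds on the third attempt)
    }
}
```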
0
votes
1 answer

Is it possible to create a multi-binder binding with Spring Cloud Stream Kafka Streams to consume from cluster A and produce to cluster B?

I want to create a Kafka Streams application with Spring Cloud Stream that integrates two different Kafka clusters/setups. I tried to implement it using multi-binder configurations, as mentioned in the documentation and similar to the examples…
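Worth noting: a single Kafka Streams topology runs against one cluster, so a cross-cluster pipeline usually means either two applications or the regular message-channel Kafka binder. A multi-binder sketch for the latter might look like the following (broker addresses, binder names, and binding names are assumptions):

```yaml
spring:
  cloud:
    stream:
      binders:
        clusterA:
          type: kafka
          environment:
            spring.cloud.stream.kafka.binder.brokers: cluster-a:9092
        clusterB:
          type: kafka
          environment:
            spring.cloud.stream.kafka.binder.brokers: cluster-b:9092
      bindings:
        process-in-0:
          destination: input-topic
          binder: clusterA     # consume from cluster A
        process-out-0:
          destination: output-topic
          binder: clusterB     # produce to cluster B
```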
0
votes
1 answer

I don't see the concurrency property in the Spring docs

I'm migrating my application from Spring Boot 2.4 to Spring Boot 2.7, and Spring Cloud to 2021.0.3, and came across this situation. I am now using spring-cloud-stream version 3.2.3; looking at the documentation here, it has no reference to the…
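For reference, the consumer-level `concurrency` property is still supported in the 3.2.x binder even where a given docs page omits it; a minimal sketch (the binding name `input-in-0` is an assumption):

```yaml
spring:
  cloud:
    stream:
      bindings:
        input-in-0:            # binding name is an assumption
          consumer:
            concurrency: 3     # number of concurrent consumers for this binding
```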
0
votes
1 answer

How can I get the anonymous group id of Spring Cloud Stream?

I am using Spring Cloud Stream in my project. I did not give a group id to my consumers, due to my requirements. That's why group ids are assigned automatically (anonymous group ids) by Spring Cloud Stream. But I need to use this group id at runtime. Is there…
omerstack • 535 • 9 • 23
0
votes
0 answers

spring-cloud-stream upgrade to a higher version not working

I am upgrading my Spring Boot project to a higher version: Spring Boot parent from 2.2.1.RELEASE to 2.5.5, Spring Cloud from Greenwich.RELEASE to 2020.0.3, and spring-cloud-stream to 3.1.3. When I upgrade spring-cloud-stream to 3.1.3, I get the below error…
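A frequent cause of such upgrade errors is mixing a hand-pinned spring-cloud-stream version with an incompatible release train. A sketch of the usual fix, importing the Spring Cloud BOM so it manages the spring-cloud-stream version for you (2020.0.3 manages a 3.1.x spring-cloud-stream compatible with Boot 2.5.x):

```xml
<!-- Import the Spring Cloud release-train BOM instead of pinning
     spring-cloud-stream directly; the BOM selects compatible versions. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.springframework.cloud</groupId>
      <artifactId>spring-cloud-dependencies</artifactId>
      <version>2020.0.3</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```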
Ninjatech • 21 • 1
0
votes
0 answers

Consumer not triggering even after resolving the exception in the consumer; it triggers only after the back-off time

I have implemented consumer retry on exception/failure. The consumer does not trigger until the back-off time has elapsed, even without an exception. I have two questions here: the consumer triggered within milliseconds at first, but when an exception occurred it retries only after (10 min)…
mhvb • 31 • 4
0
votes
1 answer

Max retry attempts and back-off interval not changing

I am trying to retry the consumer on exception in my code. My consumer looks like the below: @Bean public Consumer<...> input() { return message -> { String output = service.getValues(); }; } Below are the ways I have…
mhvb • 31 • 4
0
votes
1 answer

Is it possible, on exception, to retry the consumer after 10 min, up to 4 times?

I have implemented the Kafka consumer as follows: @Bean public Consumer<...> input() { return message -> { /* external service call */ }; } If any exception occurs in the external call or in the consumer, I want to retry the consumer after 10 min. Can anyone…
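The binder's built-in retry is tunable per binding, so a 10-minute fixed interval with a bounded attempt count can be expressed in configuration alone. A sketch (the binding name is an assumption; intervals are in milliseconds, and `max-attempts` counts the initial delivery as well as the retries):

```yaml
spring:
  cloud:
    stream:
      bindings:
        input-in-0:                         # binding name is an assumption
          consumer:
            max-attempts: 4                 # initial attempt + 3 retries
            back-off-initial-interval: 600000  # 10 minutes
            back-off-max-interval: 600000
            back-off-multiplier: 1.0        # keep the interval fixed
```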
0
votes
1 answer

How to move to the functional programming model to publish to Kafka in Spring Cloud

I am trying to move away from the now-deprecated annotations like @EnableBinding and @Output, but could not find a simple example of doing it in a functional way. These are the files currently: KafkaConfig.java @Configuration @EnableBinding({ …
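In the functional model, an @Output channel is replaced by a `java.util.function.Supplier` bean that the binder polls and publishes, with bindings named after the function (`producer-out-0` by convention). The function itself is plain Java, shown standalone below; the bean registration, binding names, and payload are illustrative assumptions.

```java
import java.util.function.Supplier;

public class ProducerFunction {
    // In a Spring Cloud Stream app this would be a @Bean; the binder polls
    // the Supplier (once per second by default) and publishes each value to
    // the binding producer-out-0 (set spring.cloud.function.definition=producer).
    public static Supplier<String> producer() {
        return () -> "hello kafka";
    }

    public static void main(String[] args) {
        // Outside Spring, the function can be exercised directly.
        System.out.println(producer().get()); // prints hello kafka
    }
}
```

For on-demand (non-polled) publishing, StreamBridge is the usual replacement for an @Output channel instead of a Supplier.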
Manoj Suthar • 1,415 • 3 • 19 • 41
0
votes
1 answer

How do you pass Kinesis headers to Spring Cloud Stream functions in batch consumer mode?

I am in the process of converting our existing stream processing code that currently uses the annotation-based programming model to use Spring Cloud Function instead. With consumer batch mode enabled, how can I pass the AWS Kinesis checkpointer to…
0
votes
1 answer

Channels not mapped when testing with Spring Cloud Stream using Spring Cloud Function

I am in the process of converting our existing stream processing code that currently uses the annotation-based programming model to use Spring Cloud Function instead. Using the test binder described here, I am only able to successfully execute tests…
Keith Bennett • 733 • 11 • 25
0
votes
1 answer

Spring Cloud Stream integration test: embedded Kafka vs Testcontainers

I have a Spring Cloud Stream application for which I need to write an integration test (specifically, using Cucumber). The application communicates with other services using the Kafka message broker. From what I know, I could make this work using either a…
0
votes
1 answer

Dynamic Kafka Consumer using functional paradigm

I have a particular requirement in which I want to collect messages from a topic for a specified duration (for example, 40 seconds), but only when asked to (so start a consumer for 40 seconds on demand and then stop it). I came across examples…
0
votes
2 answers

How to configure DeadLetterPublishingRecoverer to send error messages to a DLQ in Spring Cloud Stream batch mode

I have created a Kafka Consumer using Spring Cloud Stream and Spring Cloud Function for consuming messages in batch mode from a Kafka topic. Now, I want to send the error batches to a Dead Letter Queue for further debugging of the error. I am…
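For contrast, in record (non-batch) mode the binder's own DLQ support is configuration only, along the lines of the sketch below (binding and topic names are assumptions). In batch mode that property does not apply, and the usual route is a ListenerContainerCustomizer bean that installs a DeadLetterPublishingRecoverer-based error handler on the listener container.

```yaml
spring:
  cloud:
    stream:
      kafka:
        bindings:
          input-in-0:              # binding name is an assumption
            consumer:
              enable-dlq: true
              dlq-name: input-topic-dlq
```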
0
votes
1 answer

Error handling in Spring Cloud Stream Kafka in batch mode

I am using Spring Cloud Stream and the Kafka binder to consume messages in batches from a Kafka topic. I am trying to implement an error handling mechanism. As per my understanding, I can't use Spring Cloud Stream's enableDLQ property in batch mode. I…