Questions tagged [spring-cloud-stream]

Spring Cloud Stream allows a user to develop and run messaging microservices using Spring Integration, whether locally, in the cloud, or on Spring Cloud Data Flow. Just add @EnableBinding and run your app as a Spring Boot app (single application context). You only need to connect to the physical broker for the bus, which is automatic if the relevant bus implementation is available on the classpath.
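A minimal sketch of the @EnableBinding model described above (the application class name and the logged output are made up; the actual broker connection comes from whichever binder, e.g. RabbitMQ or Kafka, is on the classpath — newer releases favor the functional model over these annotations):

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.annotation.StreamListener;
    import org.springframework.cloud.stream.messaging.Sink;

    // Binds the application to a broker destination and consumes from it.
    @SpringBootApplication
    @EnableBinding(Sink.class)
    public class LoggingConsumerApplication {

        public static void main(String[] args) {
            SpringApplication.run(LoggingConsumerApplication.class, args);
        }

        // Invoked for every message arriving on the bound input destination
        @StreamListener(Sink.INPUT)
        public void handle(String payload) {
            System.out.println("Received: " + payload);
        }
    }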

Use this tag for questions about the Spring Cloud Stream project. It is not intended for general questions about integrating other Spring projects with other technologies.

Spring Cloud Stream's Official Project Site

Spring Cloud Stream's GitHub Repo

How to contribute


2724 questions
0 votes, 0 answers

How to test spring cloud stream kafka binder

This is my code and I am looking to test it. I've been looking at a few things (TopologyTestDriver, etc.) but could not get any of them to work. This example is directly from…
Papi Abi (173)
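For reference, a minimal, Spring-free sketch of the TopologyTestDriver approach mentioned in this question, using kafka-streams-test-utils; the topic names and the uppercase step are placeholders, not the asker's topology:

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.TestInputTopic;
    import org.apache.kafka.streams.TestOutputTopic;
    import org.apache.kafka.streams.Topology;
    import org.apache.kafka.streams.TopologyTestDriver;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Produced;

    public class UppercaseTopologyTest {

        public static void main(String[] args) {
            // Build the same topology the processor function would build
            StreamsBuilder builder = new StreamsBuilder();
            builder.stream("input", Consumed.with(Serdes.String(), Serdes.String()))
                   .mapValues(v -> v.toUpperCase())
                   .to("output", Produced.with(Serdes.String(), Serdes.String()));
            Topology topology = builder.build();

            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "test");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234"); // never contacted

            // Drive the topology in-memory, no broker needed
            try (TopologyTestDriver driver = new TopologyTestDriver(topology, props)) {
                TestInputTopic<String, String> in =
                        driver.createInputTopic("input", new StringSerializer(), new StringSerializer());
                TestOutputTopic<String, String> out =
                        driver.createOutputTopic("output", new StringDeserializer(), new StringDeserializer());

                in.pipeInput("key", "hello");
                System.out.println(out.readValue()); // HELLO
            }
        }
    }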
0 votes, 0 answers

How to use the aggregate function in a KStream with an ArrayList of Avro objects

class java.util.ArrayList cannot be cast to class org.apache.avro.specific.SpecificRecord (java.util.ArrayList is in module java.base of loader 'bootstrap'; org.apache.avro.specific.SpecificRecord is in unnamed module of loader 'app') another…
0 votes, 1 answer

Start Binding in Paused state

I would like to start my SCS application with the bindings not consuming messages. I use the Kafka binder. Is there any configuration for this? Alternatively I could use the BindingsLifecycleController to programmatically pause the bindings. How do I…
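A hedged sketch of the BindingsLifecycleController route mentioned in the question; the binding name myConsumer-in-0 is a placeholder, and the controller bean is assumed to be auto-configured by Spring Cloud Stream. The consumer property spring.cloud.stream.bindings.<binding>.consumer.auto-startup=false is the configuration-only alternative for not starting a binding at all:

    import org.springframework.cloud.stream.binding.BindingsLifecycleController;
    import org.springframework.cloud.stream.binding.BindingsLifecycleController.State;
    import org.springframework.stereotype.Component;

    // Pauses/resumes the Kafka consumer binding through the controller.
    @Component
    public class BindingPauser {

        private final BindingsLifecycleController controller;

        public BindingPauser(BindingsLifecycleController controller) {
            this.controller = controller;
        }

        public void pause() {
            controller.changeState("myConsumer-in-0", State.PAUSED);
        }

        public void resume() {
            controller.changeState("myConsumer-in-0", State.RESUMED);
        }
    }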
0 votes, 0 answers

Service crashing due to RabbitMQ Not found exception

Recently one of our services has started crashing intermittently, while it is running, with the errors below - Error on startup - Channel shutdown: channel error; protocol method: #method(reply-code=406, reply-text=PRECONDITION_FAILED -…
0 votes, 0 answers

Spring cloud stream kafka stream Multiple Input Bindings no outputs

I have a simple Kafka Streams app that processes input from 3 topics, but no output is required since the final step is to save the processing outcome to the DB. I saw this example for multiple input topics and that is exactly what I need apart from the output…
user1409534 (2,140)
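As a sketch of the zero-output case, the functional model accepts a java.util.function.BiConsumer for two input bindings (persist-in-0 and persist-in-1 here); the types, topic roles and the saveToDb call are placeholders, and the question's third topic would need its own function or a different composition:

    import java.util.function.BiConsumer;

    import org.apache.kafka.streams.kstream.KStream;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class NoOutputStreams {

        // A BiConsumer terminates the topology: no output binding is created.
        @Bean
        public BiConsumer<KStream<String, String>, KStream<String, String>> persist() {
            return (orders, payments) -> {
                orders.foreach((key, value) -> saveToDb("order", key, value));
                payments.foreach((key, value) -> saveToDb("payment", key, value));
            };
        }

        private void saveToDb(String type, String key, String value) {
            // placeholder for the real repository/DAO call
        }
    }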
0 votes, 1 answer

How do I acknowledge / requeue with cloud stream sqs binders

I am writing an application to consume messages from a queue. I am able to successfully bind the SQS queue and receive the messages. However, when I want to requeue the message, I use the following. message.getHeaders().get(AwsHeaders.ACKNOWLEDGMENT,…
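A hedged sketch of confirming the message via the Acknowledgment header that the question's snippet already reads; the import packages vary between Spring Cloud AWS releases (org.springframework.cloud.aws.* vs io.awspring.cloud.*), so treat them as assumptions:

    import java.util.function.Consumer;

    import org.springframework.cloud.aws.messaging.core.AwsHeaders;         // package may differ by version
    import org.springframework.cloud.aws.messaging.listener.Acknowledgment; // (io.awspring.cloud.* in newer releases)
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.messaging.Message;

    @Configuration
    public class SqsAckConfig {

        @Bean
        public Consumer<Message<String>> processSqs() {
            return message -> {
                Acknowledgment ack =
                        message.getHeaders().get(AwsHeaders.ACKNOWLEDGMENT, Acknowledgment.class);
                try {
                    // business logic here ...
                    if (ack != null) {
                        ack.acknowledge(); // delete the message from the queue
                    }
                } catch (Exception e) {
                    // leaving the message unacknowledged lets it reappear
                    // after the queue's visibility timeout (effectively a requeue)
                }
            };
        }
    }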
0 votes, 0 answers

java.util.ArrayList cannot be cast to org.springframework.amqp.core.Message

I use spring-cloud-stream & spring-cloud-stream-binder-rabbit, version 3.2.4. When I use the deadLetterQueue it works well. When I use batchMode it works well too. But when I use them together, some exceptions come out. my…
0 votes, 1 answer

Spring Cloud Stream Rabbit Binder shared concurrency for all consumers

I am trying to integrate my application using the Spring Cloud Stream RabbitMQ binder. I have 2 applications: a producer and a consumer. In the consumer application there are 2 input channels, reading different message payload types. I know that we can define…
user725455 (465)
0 votes, 0 answers

How to manually confirm (ack) a message on the consumer side in spring cloud stream 3.2.4

My code, it uses the functional programming model: @Bean public Consumer> test() { return msg -> { // how to confirm ack manually }; }
Aaron Wang (11)
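A hedged sketch for the Kafka binder: assuming the consumer property spring.cloud.stream.kafka.bindings.test-in-0.consumer.ackMode=MANUAL is set, the binder stops auto-committing and exposes an Acknowledgment header that the function can confirm itself (the binding name test-in-0 matches the bean name in the question and is otherwise a placeholder):

    import java.util.function.Consumer;

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.support.Acknowledgment;
    import org.springframework.kafka.support.KafkaHeaders;
    import org.springframework.messaging.Message;

    @Configuration
    public class ManualAckConfig {

        @Bean
        public Consumer<Message<String>> test() {
            return msg -> {
                // business logic ...
                Acknowledgment ack =
                        msg.getHeaders().get(KafkaHeaders.ACKNOWLEDGMENT, Acknowledgment.class);
                if (ack != null) {
                    ack.acknowledge(); // commit the offset for this record
                }
            };
        }
    }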
0 votes, 1 answer

Test stream bridge fails when the output binding destination is the same as the input destination

I have the following configuration bindings: receiveEvents-in-0: destination: internal group: a-certain-group sendMessages-out-0: destination: internal rabbit: bindings: receiveEvents-in-0: consumer: …
0 votes, 1 answer

Kafka version 3.0.1 - Kafka admin clients created repeatedly - memory leak

We have a Spring Boot app that consumes from a single topic and produces records to multiple topics. Recently upgraded this app to Spring Boot 2.6.7 and other dependencies accordingly in the Gradle project. The app is able to consume & produce correctly, BUT the…
0 votes, 1 answer

Transaction in spring cloud stream with database operations

I use Spring Cloud Stream Kafka Streams to do some processing on the stream, save some data to the DB in the middle of the topology, then continue the processing in my topology and finally send the final result to another topic. My Function bean looks something…
user1409534 (2,140)
0 votes, 0 answers

Non-blocking exponential backoff retry mechanism for Spring Cloud with RabbitMQ

I have a consumer: @Bean public Function>, Mono> myReactiveConsumer() { return flux -> flux.doOnNext(this::processMessage) .doOnError(this::isRepetableError, message -> sendToTimeoutQueue(message)) …
0 votes, 1 answer

Spring boot kafka with schema registry - payload is not matching at consumer end

I have a producer with this configuration kafka: bootstrap-servers: localhost:9092 cloud: stream: binder: consumer-properties: key.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer …
0 votes, 0 answers

Spring Cloud Stream StreamsBuilderFactoryBeanCustomizer not replacing the ERROR thread

I am using Spring Cloud Stream with the Kafka Streams binder. I want to alter the way deserialization exceptions and transient errors are handled. To handle deserialization exceptions I…
Andy (5,433)
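A hedged sketch of one way to replace a failed stream thread, assuming the binder picks up a spring-kafka StreamsBuilderFactoryBeanConfigurer (the successor of the customizer named in the question); for the deserialization side, the binder property spring.cloud.stream.kafka.streams.binder.deserializationExceptionHandler=logAndContinue is the configuration-only option:

    import org.apache.kafka.streams.errors.StreamsUncaughtExceptionHandler.StreamThreadExceptionResponse;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.config.StreamsBuilderFactoryBeanConfigurer;

    @Configuration
    public class StreamsErrorHandlingConfig {

        // Ask Kafka Streams to replace the failed thread instead of staying in ERROR.
        @Bean
        public StreamsBuilderFactoryBeanConfigurer uncaughtExceptionConfigurer() {
            return factoryBean -> factoryBean.setStreamsUncaughtExceptionHandler(
                    exception -> StreamThreadExceptionResponse.REPLACE_THREAD);
        }
    }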