Questions tagged [spring-cloud-stream]

Spring Cloud Stream lets you develop messaging microservices built on Spring Integration and run them locally, in the cloud, or on Spring Cloud Data Flow. Just add @EnableBinding and run your app as a Spring Boot app (single application context). You only need to connect to the physical broker for the bus, which is automatic if the relevant bus implementation is available on the classpath.

Use this tag for questions about the Spring Cloud Stream project. It is not intended for general questions about integrating other Spring projects with other technologies.

Spring Cloud Stream's Official Project Site

Spring Cloud Stream's Github Repo

How to contribute


2724 questions
0
votes
0 answers

The second record of the left side of a foreign key join of two KTables emits a tombstone record

We observed this quite worrying behaviour: the second record on the left side of a foreign key join emits a tombstone record; two records on the right side don't emit anything; the first record on the left side of a foreign key join doesn't emit…
0
votes
0 answers

Is there a performance impact with spring cloud stream single binding with multiple destinations?

If I use functionRouter with Spring Cloud Stream Kafka Binders (SpringBoot 3.0.4, Spring Cloud version 2022.0.1), where the functionRouter consumes messages from multiple topics, is there a performance impact in comparison to using a separate…
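For reference, the single-binding routing setup the question describes can be sketched in configuration roughly like this (topic names are hypothetical, and property names should be checked against the Spring Cloud Stream version in use):

```yaml
spring:
  cloud:
    function:
      routing:
        enabled: true        # exposes the built-in functionRouter
    stream:
      bindings:
        functionRouter-in-0:
          destination: topic-a,topic-b,topic-c   # hypothetical topics on one binding
```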
0
votes
1 answer

Kafka streaming application does not consume all the topics and does not throw an error

I have written a Java Spring Boot Kafka stream processor application (Kubernetes, 1 pod, scaled to 3 to see if that helps; the incoming messages increase after 12 AM) which has several input destinations and several output destinations. The application…
0
votes
0 answers

What is the spring.cloud.stream.function.reactive property for?

I use a reactive environment and my message communication succeeds without it. I found this property but have not found any information in the documentation or in comments in the code. What is the goal of this property? I was already debugging this…
Numichi
0
votes
0 answers

With Spring Cloud Stream binding to Pulsar, the dead letter policy doesn't apply to the consumer

I am using Spring Cloud Stream bindings to consume Pulsar messages; the library implementation("org.springframework.pulsar:spring-pulsar-spring-cloud-stream-binder:0.1.1-SNAPSHOT") is used. The following is the application.yaml configuration. #…
Noor Khan
0
votes
1 answer

Spring Cloud Stream throws `MessageDispatchingException: Dispatcher has no subscribers` from time to time

I have a Spring Cloud Stream application that consumes a Kafka topic and eventually updates an ElasticSearch index. Here is my code: @Bean public Consumer>> fetchSeed() { return messages -> messages …
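The shape of the consumer bean in the excerpt can be sketched with plain java.util.function types. The Spring wiring (@Bean, binder configuration, the actual ElasticSearch client) is omitted; `index` here is a hypothetical stand-in for the index update:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class FetchSeedSketch {
    // Hypothetical stand-in for the ElasticSearch index in the question.
    static final List<String> index = new ArrayList<>();

    // In the real app this would be a @Bean method; by convention Spring Cloud
    // Stream binds a Consumer bean named fetchSeed to the input binding fetchSeed-in-0.
    static Consumer<List<String>> fetchSeed() {
        return messages -> index.addAll(messages);
    }
}
```

A mismatch between the bean name and spring.cloud.function.definition is one common way such a binding ends up without a subscriber.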
0
votes
1 answer

Can the StreamBridge.send() return value be used as a publishing confirmation?

I want to guarantee a message will be delivered to the broker, the classic case of publisher confirms, when using Spring Cloud Stream with RabbitMQ. When using the StreamBridge.send() method, the javadocs say the following about the return of the…
Rod
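For context, a broker-level delivery guarantee with RabbitMQ comes from publisher confirms rather than the boolean returned by send(). A sketch of the properties typically involved (the binding name is hypothetical; verify the exact property names against your binder version):

```yaml
spring:
  rabbitmq:
    publisher-confirm-type: correlated   # enable correlated publisher confirms
  cloud:
    stream:
      rabbit:
        bindings:
          output-out-0:                  # hypothetical binding name
            producer:
              useConfirmHeader: true     # correlate confirms per message
```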
0
votes
1 answer

KStreams with Spring Cloud Stream (Kafka Binder) - Setting Partitions

I wrote a small KStreams processor using Spring Cloud Stream - https://github.com/sandeep540/kafka-streams-spring3-cloud-java17. Here I am reading from Kafka topic "input-topic" with 5 partitions, processing it, and sending it to another topic. When I…
0
votes
1 answer

Functional Consumer and Producer with Spring Cloud Stream

In my Spring Boot application I need to consume messages from 2 different Kafka topics. From the first topic, all messages follow the same elaboration. From the second topic, I need to do 2 different elaborations based on a header value. In the past I used…
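The two-topic setup described above can be sketched with plain java.util.function types (the Spring wiring is omitted; the "type" header name and the elaborations are hypothetical):

```java
import java.util.Map;
import java.util.function.Consumer;

public class TwoTopicSketch {
    // First topic: every message gets the same elaboration.
    static final Consumer<String> firstTopic = msg -> System.out.println("processed: " + msg);

    // Second topic: pick one of two elaborations from a header value
    // ("type" is a hypothetical header name).
    static String secondTopic(Map<String, Object> headers, String payload) {
        return "A".equals(headers.get("type")) ? elaborateA(payload) : elaborateB(payload);
    }

    static String elaborateA(String p) { return "A:" + p; }
    static String elaborateB(String p) { return "B:" + p; }
}
```

In a real app each consumer would be its own @Bean, with both names listed in spring.cloud.function.definition so each gets its own binding.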
0
votes
0 answers

Headers in GET request using Supplier<> in spring cloud function

Just as we get the headers via Message<> in a Function<>, is it possible to get headers inside a Supplier<> in Spring Cloud Function?
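One way to frame this: a Supplier originates messages, so there are no incoming headers to read; it can only attach outgoing headers to what it produces. A pure-JDK sketch, with a plain record standing in for Spring's Message<T> (names are hypothetical):

```java
import java.util.Map;
import java.util.function.Supplier;

public class SupplierHeadersSketch {
    // Plain record standing in for Spring's Message<T> (hypothetical shape).
    record Msg(String payload, Map<String, Object> headers) {}

    // The Supplier has no input, hence no incoming headers; it can only
    // set headers on the message it creates.
    static final Supplier<Msg> produce =
            () -> new Msg("tick", Map.of("source", "scheduler"));
}
```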
0
votes
1 answer

Add a custom header when a message is published to the DLQ by the Spring Cloud Stream Kafka binder

Background: when consuming a message fails and it is published to the DLQ, I would like to add a custom header whose key is exception and whose value is the exception. This is my configuration: spring: cloud: function: definition: numberConsumer …
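For reference, the Kafka binder's DLQ support is enabled per binding; a configuration sketch (binding and DLQ names are hypothetical, and recent binder versions already attach exception-detail headers to DLQ records — check the documentation for your version):

```yaml
spring:
  cloud:
    stream:
      kafka:
        bindings:
          numberConsumer-in-0:     # hypothetical binding name
            consumer:
              enableDlq: true
              dlqName: numbers.dlq # hypothetical DLQ topic
```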
0
votes
0 answers

How can I send metrics to a Kinesis destination?

I am using spring cloud stream with the aws kinesis binder. I have set the following : spring.cloud.stream.bindings.applicationMetrics.destination=metrics-stream spring.cloud.stream.metrics.properties=spring.application**…
0
votes
0 answers

Spring Cloud Stream Kafka, how to set producer properties at channel level

How do I set the acks = 1 and max.in.flight.requests.per.connection = 10 properties for a Kafka publisher at the channel level in Spring Cloud Stream Kafka config? I have tried the below ways to set both properties but it isn't…
Mike Reddington
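For reference, the Kafka binder passes native producer properties per binding through a producer.configuration map; a sketch (the binding name is hypothetical):

```yaml
spring:
  cloud:
    stream:
      kafka:
        bindings:
          output-out-0:            # hypothetical binding name
            producer:
              configuration:
                acks: 1
                max.in.flight.requests.per.connection: 10
```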
0
votes
0 answers

No error logging when dead letter queue enabled

Whenever there is an error while processing incoming messages, the failed messages are moved to the dead letter queue. However, the exception that caused the failure is not logged. Spring Cloud Stream: 4.0.1. Binder: Kafka. spring: cloud: …
HashDhi
0
votes
1 answer

Spring Cloud Stream Kafka binder cannot publish and consume messages using functional programming

I am trying to use the functional programming style to produce and consume messages. The spring-cloud-stream version is 3.2.7. This is my application.yml: spring: cloud: function: definition: numberProducer,numberConsumer stream: bindings: …
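The producer/consumer pair named in that function definition can be sketched with plain java.util.function types (the Spring wiring and payload type are hypothetical; what matters is that the bean names match spring.cloud.function.definition):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Supplier;

public class NumberFunctionsSketch {
    static final List<Integer> received = new ArrayList<>();
    static int next = 0;

    // In the real app these would be @Bean methods named numberProducer and
    // numberConsumer, matching spring.cloud.function.definition in application.yml;
    // the binder then polls the Supplier and delivers to the Consumer.
    static final Supplier<Integer> numberProducer = () -> next++;
    static final Consumer<Integer> numberConsumer = received::add;
}
```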