Questions tagged [spring-cloud-stream]

Spring Cloud Stream allows a user to develop messaging microservices built on Spring Integration and run them locally, in the cloud, or on Spring Cloud Data Flow. Just add @EnableBinding and run your app as a Spring Boot app (single application context). You only need to connect to the physical message broker, which happens automatically when the relevant binder implementation is available on the classpath.
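As a quick orientation, here is a minimal sketch of the annotation-based model that description refers to (class and handler names are illustrative; newer releases deprecate @EnableBinding in favor of the functional style that appears in several questions below):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;

// Minimal annotation-based consumer: binds the built-in Sink interface to
// whatever broker binder (Kafka, RabbitMQ, ...) is found on the classpath.
@SpringBootApplication
@EnableBinding(Sink.class)
public class LoggingSinkApplication {

    @StreamListener(Sink.INPUT)
    public void handle(String payload) {
        System.out.println("Received: " + payload);
    }

    public static void main(String[] args) {
        SpringApplication.run(LoggingSinkApplication.class, args);
    }
}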

Use this tag for questions about the Spring Cloud Stream project. It is not intended for general questions about integrating other Spring projects with other technologies.

Spring Cloud Stream's Official Project Site

Spring Cloud Stream's GitHub Repo

How to contribute


2724 questions
0
votes
0 answers

Manually create spring cloud stream bindings based on dynamic configuration

I have a requirement where one or more Spring Cloud Stream Kafka Streams bindings need to be created based on dynamic configuration. By dynamic config I mean the stream bindings (input/output) will be specified at run-time, either via external property…
0
votes
1 answer

Restoration of GlobalKTables is extremely slow

Since we introduced GlobalKTables in the streams of several services, the startup time of those services has grown unbearably long. We have a listener observing the state of state store restoration, and this is what stands in the…
0
votes
1 answer

Spring Cloud Stream Binder Kafka - Dead Letter Topic in Different Cluster

Is there a plan to support configuring a dead letter topic in a different cluster? According to the SO answer below, this is not currently possible: Spring cloud Kafka Stream - Dead Letter Topic in Different Cluster. I searched the GitHub…
0
votes
1 answer

Consume message from dead letter queue manually | Spring cloud stream kafka

I configured the default DLQ as follows: spring: cloud: stream: kafka: bindings: input: consumer: enable-dlq: true dlq-name: dlq-topic dlq-partitions: 1 the…
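One way to read such a dead letter topic manually, sketched here assuming the functional model is in use and the DLQ is named dlq-topic as in the excerpt (the bean name dlqIn and its binding are illustrative), is to treat the DLQ as an ordinary input with its own consumer binding:

import java.util.function.Consumer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.Message;

// A sketch: the DLQ is consumed like any other topic through its own binding.
// The input binding for this bean is named "dlqIn-in-0" by convention and
// would be pointed at the DLQ with
//   spring.cloud.stream.bindings.dlqIn-in-0.destination=dlq-topic
@Configuration
public class DlqReprocessingConfig {

    @Bean
    public Consumer<Message<byte[]>> dlqIn() {
        return message -> {
            // Inspect the failed record and its headers, then log, re-publish,
            // or discard it as appropriate.
            System.out.println("DLQ record headers: " + message.getHeaders());
        };
    }
}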
0
votes
1 answer

Spring cloud stream, kafka binder - seek on demand

I use Spring Cloud Stream with Kafka. I have a topic X, with partition Y and consumer group Z. Spring Boot starter parent 2.7.2, Spring Kafka version 2.8.8: @StreamListener("input-channel-name") public void processMessage(final DomainObject…
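For reference, the Kafka binder exposes a rebalance-listener hook that is the usual place to seek; below is a minimal sketch of that hook (seeking truly "on demand", at arbitrary points in time, additionally has to be coordinated onto the consumer thread):

import java.util.Collection;
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.common.TopicPartition;
import org.springframework.cloud.stream.binder.kafka.KafkaBindingRebalanceListener;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// A sketch: when partitions are assigned to the binding, seek them,
// e.g. to replay the topic from the beginning on startup.
@Configuration
public class SeekOnAssignmentConfig {

    @Bean
    public KafkaBindingRebalanceListener rebalanceListener() {
        return new KafkaBindingRebalanceListener() {
            @Override
            public void onPartitionsAssigned(String bindingName, Consumer<?, ?> consumer,
                    Collection<TopicPartition> partitions, boolean initial) {
                if (initial) {
                    consumer.seekToBeginning(partitions);
                }
            }
        };
    }
}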
0
votes
0 answers

Why is sharing a state store across Kafka Streams considered a warning by InteractiveQueryService?

I have 3 Kafka topics: the 1st with the actual temperature, the 2nd with the actual air pressure, and the 3rd with requests. A request may be for the actual temperature or for the actual pressure. I need a system for processing requests. Because joining could become too…
michaldo
  • 4,195
  • 1
  • 39
  • 65
0
votes
1 answer

How to set Spring Container properties at yml level?

I'm struggling to find, in Google or the Spring docs, any way to set the Spring container properties in the yml file instead of programmatically. I want to set the property "idleBetweenPolls" for one specific topic + consumer. I've achieved it…
Lucas
  • 129
  • 9
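For context on the question above: idleBetweenPolls does not appear to be exposed as a yml binding property, so the usual route is a ListenerContainerCustomizer bean that targets one destination and group (the topic and group names below are illustrative):

import org.springframework.cloud.stream.config.ListenerContainerCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.listener.AbstractMessageListenerContainer;

// A sketch: apply idleBetweenPolls only to the container created for one
// specific topic + consumer group.
@Configuration
public class ContainerTuningConfig {

    @Bean
    public ListenerContainerCustomizer<AbstractMessageListenerContainer<?, ?>> containerCustomizer() {
        return (container, destinationName, group) -> {
            if ("orders".equals(destinationName) && "billing".equals(group)) {
                container.getContainerProperties().setIdleBetweenPolls(5_000L);
            }
        };
    }
}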
0
votes
0 answers

Kafka headers are not converted in batch-mode with Spring Cloud 3.0

My question comes from this discussion: https://stackoverflow.com/a/74306116/3551820 I've added the proper configuration to enable batch consumption, but Spring still doesn't use the batch converter to convert the headers of each message…
Lucas
  • 129
  • 9
0
votes
0 answers

Spring Cloud Stream Rabbit - Enabling error channel reduces throughput

In Spring Cloud Stream Rabbit, is there a way to intercept acks in the background? Even though my intent is only to catch NACKs/errors, enabling publisher returns reduces the throughput. I am using a confirmAckChannel to intercept…
0
votes
1 answer

Spring Cloud Stream Kafka: want to send a Message, but Spring sends a Message whose payload is byte[], not a GenericMessage in JSON format

I use Spring Cloud Stream with Kafka, Avro, and schema registry. I work with reactive programming in the functional style. I want to produce a message like this: GenericMessage [payload={"id": "efb90cd6-e022-4d82-9898-6b78114cfb01", "type":…
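Below is a minimal sketch of producing a typed Message in the reactive functional style, assuming a payload class (OrderEvent is a stand-in for a class generated from an Avro schema) and leaving the Avro/schema-registry serializer configuration aside:

import java.time.Duration;
import java.util.UUID;
import java.util.function.Supplier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;
import reactor.core.publisher.Flux;

// A sketch: a reactive Supplier that emits Message<OrderEvent> so the binder
// hands the typed payload to the configured serializer instead of a
// pre-serialized byte[].
@Configuration
public class OrderProducerConfig {

    // Stand-in for a class generated from an Avro schema.
    public record OrderEvent(String id, String type) { }

    @Bean
    public Supplier<Flux<Message<OrderEvent>>> orderProducer() {
        return () -> Flux.interval(Duration.ofSeconds(5))
                .map(tick -> MessageBuilder
                        .withPayload(new OrderEvent(UUID.randomUUID().toString(), "created"))
                        .setHeader("eventType", "created")
                        .build());
    }
}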
0
votes
1 answer

Handling errors/exceptions in spring cloud data flow streams

One of the components in our stream throws a heap out-of-memory error when the input file is larger than a certain limit. While we are working to fix this issue, I would like to know if such errors can be caught so that we can log or send…
Venu Gopal
  • 15
  • 1
  • 7
0
votes
0 answers

Consume from Existing GCP pubsub Subscription

I am trying to bind an existing subscription channel using com.google.cloud spring-cloud-gcp-pubsub-stream-binder and application properties spring: cloud: functions: …
0
votes
1 answer

Getting QueuesNotAvailableException when setting spring.cloud.stream.rabbit.bindings..consumer.bindQueue=false

I am getting the following errors in a single stack trace and want to know where I went wrong. org.springframework.amqp.rabbit.listener.QueuesNotAvailableException: Cannot prepare queue for listener. Either the queue doesn't exist or the broker will not…
Ratikanta
  • 307
  • 6
  • 16
0
votes
1 answer

Spring Cloud Stream Binder Kafka

I'm trying to use Spring Cloud Stream with a Kafka binder to consume messages from a topic. Before, I used annotations to create the consumer. Now I have to use the functional approach, because the annotations are no longer available. These are the…
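The functional equivalent of the removed annotations is just a Consumer (or Function/Supplier) bean whose binding is configured by name; a minimal sketch with illustrative names:

import java.util.function.Consumer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// A sketch: the bean name "consumeOrder" yields the input binding
// "consumeOrder-in-0", configured with ordinary binding properties, e.g.
//   spring.cloud.function.definition=consumeOrder
//   spring.cloud.stream.bindings.consumeOrder-in-0.destination=orders
//   spring.cloud.stream.bindings.consumeOrder-in-0.group=billing
@Configuration
public class FunctionalConsumerConfig {

    @Bean
    public Consumer<String> consumeOrder() {
        return payload -> System.out.println("Received: " + payload);
    }
}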
0
votes
0 answers

Can Spring Cloud Stream bind to the same function multiple times?

Given the simple application: @SpringBootApplication public class StreamApp { @Bean Consumer<String> logSink() { return System.out::println; } public static void main(String[] args) { SpringApplication.run(StreamApp.class,…
Groater
  • 58
  • 6