Questions tagged [spring-cloud-stream]

Spring Cloud Stream allows a user to develop and run messaging microservices using Spring Integration and run them locally, in the cloud, or on Spring Cloud Data Flow. Just add @EnableBinding and run your app as a Spring Boot app (single application context). You just need to connect to the physical broker for the bus, which is automatic if the relevant binder implementation is available on the classpath.
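As a sketch of the wiring this describes, a functional-style app typically only needs binding properties like the following (the function, destination, and binder names here are illustrative, not prescribed; this assumes a binder such as spring-cloud-stream-binder-rabbit is on the classpath):

```properties
# Hypothetical binding configuration for a function bean named "process".
spring.cloud.function.definition=process
# Input and output destinations on the broker (names are examples only).
spring.cloud.stream.bindings.process-in-0.destination=orders
spring.cloud.stream.bindings.process-out-0.destination=shipments
```

With these properties in place, the binder discovered on the classpath provisions the destinations and connects the function to the broker without further code.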

Use this tag for questions about the Spring Cloud Stream project. It is not intended for general questions about integrating other Spring projects with other technologies.

Spring Cloud Stream's Official Project Site

Spring Cloud Stream's GitHub Repo

How to contribute


2724 questions
0
votes
1 answer

How to Dynamically Create 'N' Number of Queues in RabbitMQ with Spring Cloud Stream

I am trying to create a dynamic number of queues using StreamBridge in Spring Boot, with RabbitMQ as the broker. I previously created a queue using this…
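For context, the StreamBridge pattern this question alludes to looks roughly like the following sketch (the class and payload type are hypothetical; it assumes spring-cloud-stream with the Rabbit binder on the classpath, so it will not compile standalone):

```java
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.stereotype.Service;

@Service
public class DynamicPublisher {
    private final StreamBridge streamBridge;

    public DynamicPublisher(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    // Sends to a destination resolved at runtime; the binder
    // provisions the exchange/queue on first use of each name.
    public void send(String queueName, String payload) {
        streamBridge.send(queueName, payload);
    }
}
```

Because the destination name is a runtime argument rather than a configured binding, each distinct name passed to send() yields its own provisioned destination.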
0
votes
0 answers

KCL reader AWS instances go down quickly due to heap memory exception

I wrote the following implementation to read messages from AWS Kinesis and deployed it on AWS ECS: Application.java import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; import…
0
votes
0 answers

Spring Cloud 2021.0.4: Cloud Stream partitioning is failing for RabbitMQ

In Spring Cloud 2021.0.4, we are facing an issue: routing to the destination fails when partitioning is enabled for RabbitMQ. The binder always throws "Partition key can't be null". This does not occur when we roll back to 2021.0.3. Please note…
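For reference, producer-side partitioning in Spring Cloud Stream is driven by per-binding properties like the following (binding names and the key expression are illustrative); the "Partition key can't be null" error indicates the configured key expression evaluated to null for a message:

```properties
# Illustrative producer binding with partitioning enabled.
spring.cloud.stream.bindings.process-out-0.producer.partition-key-expression=headers['partitionKey']
spring.cloud.stream.bindings.process-out-0.producer.partition-count=4
# Consumer side must opt in and declare its position in the group.
spring.cloud.stream.bindings.process-in-0.consumer.partitioned=true
spring.cloud.stream.instance-index=0
spring.cloud.stream.instance-count=4
```

With this setup, any message that lacks the header the expression reads will produce a null key, which is the symptom described above.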
0
votes
0 answers

One-to-Many Processor streaming with JPA

I have an SCDF processor which produces multiple outputs per input. The list of outputs can be rather large, which is why I was thinking of streaming the results from the JPA repository. @Service public class ProcessorFunction implements Function
Juzer Ali
0
votes
1 answer

Is it possible to publish a message to multiple exchanges from a single app?

I have a stream: source -> processor1 -> processor2 -> Sink1. I need to push the output of processor1 to another sink instance (i.e. Sink2). I need something like ______Processor2-----Sink1 …
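One common way to fan out at a mid-stream point in Spring Cloud Data Flow is a tap on the named destination after the processor. A sketch in the SCDF shell DSL (stream and app names are hypothetical):

```
stream create mainstream --definition "source | processor1 | processor2 | sink1" --deploy
stream create tapstream --definition ":mainstream.processor1 > sink2" --deploy
```

The tap subscribes to processor1's output destination without disturbing the main pipeline, so both processor2 and sink2 receive the same messages.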
0
votes
0 answers

Versioning in Spring Cloud Dataflow stream apps

Is there a way to deploy Spring Cloud Dataflow stream apps without versioning in Cloud Foundry? Whenever we deploy a stream using SCDF 2.9.x, which uses the Skipper server, it adds a version number to the app deployed in CF. For example…
0
votes
1 answer

Spring Boot Cloud Stream with different input and output types

In Spring Boot Kafka Streams I have the following KStream: Function, KStream> process() { } So here the input is an InputType object and the output is an OutputType object. For this I want to write a custom…
Manish Kumar
0
votes
1 answer

Spring Cloud Stream + Kafka Binder: What is the default partition key extractor strategy or partition key?

I'm looking to implement a scenario where consumer order does not matter, and I want to publish to multiple partitions. In this scenario, what strategy would be used to select a partition if partition-key-expression is not specified in the…
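For context: when no partition key expression or extractor is configured, Spring Cloud Stream's partitioning is simply not active and placement falls to the binder or Kafka producer itself. When a key is produced, the framework's default selection is essentially hashCode-modulo. A plain-Java illustration of that scheme (the class and method names are made up for this example, not the framework's actual API):

```java
// Plain-Java illustration of hashCode-based partition selection,
// mimicking Spring Cloud Stream's default selector behavior.
public class PartitionDemo {
    static int selectPartition(Object key, int partitionCount) {
        // Mask the sign bit rather than using Math.abs, which would
        // overflow for Integer.MIN_VALUE hash codes.
        return (key.hashCode() & Integer.MAX_VALUE) % partitionCount;
    }

    public static void main(String[] args) {
        // The same key always maps to the same partition.
        System.out.println(selectPartition("order-42", 4));
    }
}
```

The important property is determinism: equal keys always land on the same partition, which is what preserves per-key ordering when it matters.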
Funsaized
0
votes
0 answers

StreamsException: ClassCastException invoking processor using Spring Cloud Stream and Kafka Streams Binder

I'm new to Spring Cloud Stream and Kafka Streams. I am trying to build new KStreams with mapped keys from input streams, using Spring Cloud Stream and the Kafka Streams binder. During deserialization of the joined streams an error occurred (see log…
0
votes
0 answers

Kafka Stream with multiple consumers / processors not persisting data concurrently

I'm new to Kafka and want to persist data from Kafka topics to database tables (each topic flows to a specific table). I know Kafka Connect exists and can be used to achieve this, but there are reasons why this approach is preferred. Unfortunately…
0
votes
0 answers

Unable to autowire KafkaBinderConfigurationProperties bean in Spring Cloud Stream

I'm using Spring Cloud Stream Kafka and trying to implement custom error handling by creating a ListenerContainerCustomizer bean. The problem is that to create this bean I need the KafkaBinderConfigurationProperties bean, which is created inside the…
Elia Rohana
0
votes
0 answers

How to enable automatic Dead letter exchange/topic creation

I am using Spring Cloud Stream with functional beans (supplier, consumer, function) and am interested in setting up an auxiliary regular listener of my choice (Kafka/Rabbit) for processing dead-letter messages. I have the below properties set, which…
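For reference, dead-letter provisioning is a per-binding consumer property in both binders; a sketch with hypothetical binding and group names:

```properties
# Rabbit binder: a group is required, then the DLQ can be auto-bound.
spring.cloud.stream.bindings.process-in-0.group=mygroup
spring.cloud.stream.rabbit.bindings.process-in-0.consumer.auto-bind-dlq=true

# Kafka binder equivalent: route failed records to an error topic.
spring.cloud.stream.kafka.bindings.process-in-0.consumer.enable-dlq=true
```

The auxiliary listener can then consume from the provisioned dead-letter queue or topic like any other destination.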
Dave Ankin
0
votes
1 answer

Deserialize BSON ChangeStreamDocument in a Kstream

I just configured a Mongo ChangeStream to push a message with the full document to a Kafka topic each time a document is modified, and I would now like to transform these messages and push them to a different topic. To do that, I use a Kafka stream (a…
0
votes
0 answers

Consumer issues Spring Cloud Rabbit Stream - Cloud Foundry

I have a spring cloud dataflow stream deployed in PCF using rabbit as the binder. I have multiple processors in the pipeline. Occasionally I see issues wherein a partitioned consumer does not consume messages from Rabbit until the consumer is…
0
votes
1 answer

Batch Consumer not working with Kafka for CloudEvents using Spring Cloud Stream

Trying to read batch messages for CloudEvents from Cloud Stream with the Kafka binder. If I use any custom class with a custom serializer/deserializer it works fine, but with CloudEvents the messages are not coming. spring cloud: …