Questions tagged [spring-cloud-stream]

Spring Cloud Stream lets you develop messaging microservices with Spring Integration and run them locally, in the cloud, or on Spring Cloud Data Flow. Just add @EnableBinding and run your app as a Spring Boot application (single application context). You only need to connect to the physical broker for the bus, which happens automatically when the relevant bus implementation is on the classpath.
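As a rough illustration of the style the description refers to, here is a minimal sketch of a message-handling function. Newer Spring Cloud Stream versions favor the functional model (a `java.util.function.Function` bean bound to destinations by naming convention) over the older @EnableBinding annotation; the class and binding names below are illustrative, and the Spring wiring is described only in comments so the snippet runs standalone:

```java
import java.util.function.Function;

public class UppercasePayloads {
    // Hypothetical handler: registered as a @Bean in a Spring Boot app with
    // a binder (e.g. spring-cloud-stream-binder-kafka) on the classpath,
    // Spring Cloud Stream would bind a Function<String, String> like this to
    // destinations named uppercasePayloads-in-0 / uppercasePayloads-out-0.
    static Function<String, String> uppercasePayloads() {
        return payload -> payload.toUpperCase();
    }

    public static void main(String[] args) {
        // Exercise the message-handling logic directly, without any broker.
        System.out.println(uppercasePayloads().apply("order created"));
    }
}
```

Because the handler is a plain function, its logic can be unit-tested without a broker or an application context.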

Use this tag for questions about the Spring Cloud Stream project. It is not intended for general questions about integrating other Spring projects with other technologies.

Spring Cloud Stream's Official Project Site

Spring Cloud Stream's Github Repo

How to contribute


2724 questions
0
votes
1 answer

Is there any way to get notified or call another service once a file is received at the sftp sink

Is there any way to get notified or call another service once file is written to an sftp sink, instead of creating another stream listening to the output location of the first stream? I have a stream with sftp sink at the end of the stream. I need…
Venu Gopal
  • 15
  • 1
  • 7
0
votes
1 answer

Can we use a SharePoint directory as the destination for my file sink?

We have a Spring Cloud Data Flow stream that reads an Excel file from an SFTP source location, processes it, and generates an output Excel file that is to be copied to a common SharePoint folder. Is there any way we can do this...? We could write to sftp…
0
votes
0 answers

How to pass environment variables from Docker to Spring Boot for a Map type

I'm running a Docker image built from my Spring Boot project. I specify the environment variables with --env-file, and my project is configured with an application.yml file. The configuration is related to the BindingServiceProperties class…
Truong Huy
  • 55
  • 1
  • 8
0
votes
0 answers

Stream sinks are not constructed when application starts up before Kafka broker

My team is building a set of services that use Kafka Connect and Debezium to forward data changes from our Postgres database to Kafka, and then use Kafka Streams (via Spring Cloud Stream) to process this data and build an aggregate of the source…
0
votes
0 answers

Spring Cloud Stream Kafka binder only considers Kafka-defined properties when merging default options into producer/consumer configuration

KafkaBinderConfigurationProperties contains default properties as well as producer- and consumer-specific properties. When merging those properties, the mergedConsumerConfiguration and mergedProducerConfiguration APIs only consider…
0
votes
1 answer

How does routing in SCS Kafka work when defining two binders for a single topic vs. piping multiple consumers with a single binder

I have multiple consumers that must listen to a single Kafka topic. I've found there are two ways to do it: use a binder in config per consumer bean and specify a route to manage it via a routing condition (using SpEL), or use only a single binder and…
0
votes
1 answer

Spring Cloud Dataflow Cloudfoundry - Unversioned App Name

Is there a way to get an unversioned app name in SCDF 2.9.X using Skipper 2.8.x in Cloud Foundry? With the current deployment in PCF, every time we update/re-deploy a stream, Skipper appends a -v#. This, while great for a blue/green sort of deployment, brings…
0
votes
0 answers

How to join two streams in Kafka?

I am trying to join two streams in Kafka, like the following code: @Bean public KTable stream(StreamsBuilder builder) { KeyValueBytesStoreSupplier store = Stores.persistentKeyValueStore("words"); …
Emil
  • 423
  • 1
  • 12
  • 34
0
votes
1 answer

Kafka Streams with Producer only binding

My application requires a list of records to be published to a Kafka topic based on a REST API request. I went through the Kafka Streams binding documentation and cannot find an example of a producer-only binding. Is it possible using Kafka Streams to…
0
votes
0 answers

Catch the exception and message before pushing to the dead letter queue in Spring Cloud Stream Kafka

I have configured retries and a dead letter queue in Spring Cloud Stream Kafka, and all are working as expected. But there is a scenario where, after a message fails and all retries are exhausted, I want to catch the exception only on the last retry and then set the failure reason in…
0
votes
1 answer

When RabbitMQ is down, Spring Boot startup time increases due to connection retries

I have a Spring Boot microservice application and I am using spring-cloud-stream-binder-rabbit. All my RabbitMQ configurations work fine, but if RabbitMQ goes down, consumers keep attempting to fetch the connection indefinitely, increasing the…
0
votes
1 answer

KCL doesn't PUT or GET any items from the DynamoDB checkpoint or locks tables

I implemented reading messages from Kinesis using the KCL, but when I check the DynamoDB tables, I find that only the group table has the list of shards; the checkpoint table and locks table don't have any items. Any idea why the KCL doesn't PUT or GET items…
0
votes
1 answer

Reactive Spring Cloud Stream with RabbitMQ binder: Manual ACK

So I have a reactive Consumer with RabbitMQ as the Binder implementation; it basically looks like this: @Override public void accept(Flux> eventMessages) { eventMessages .buffer(Duration.of(5,…
Dmytro
  • 1,850
  • 3
  • 14
  • 20
0
votes
0 answers

KCL Encountered an exception while renewing a lease

I implemented reading from an AWS Kinesis stream using the KCL, but after reading many messages from the shards, I suddenly see this error in the server logs: ERROR 1 --- [oordinator-0000] c.a.s.kinesis.leases.impl.LeaseRenewer : Encountered an exception while…
0
votes
1 answer

Consumption of events stops after the consumer throws an exception in Spring Cloud Stream

I have an aggregation function that aggregates events published into an output channel. I have subscribed to the Flux generated by the function as below: @Component public class EventsAggregator { @Autowired private Sinks.Many>…