I could not make spring-cloud-stream-binder-kafka work for the following use case:
Start a @Transactional transaction (REST controller)
DB update/inserts
Send Kafka message
Before the transaction has committed, the consumer (configured with @EnableBinding and…
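A minimal sketch of the producer side of this scenario, assuming a Spring Data repository and the classic Source binding; OrderController, OrderRepository and OrderEvent are hypothetical names, not the asker's code:

import org.springframework.cloud.stream.messaging.Source;
import org.springframework.messaging.support.MessageBuilder;
import org.springframework.transaction.annotation.Transactional;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical sketch; @EnableBinding(Source.class) is assumed to sit on the application class.
@RestController
public class OrderController {

    private final OrderRepository repository; // placeholder Spring Data repository
    private final Source source;              // default output binding

    public OrderController(OrderRepository repository, Source source) {
        this.repository = repository;
        this.source = source;
    }

    @PostMapping("/orders")
    @Transactional
    public void createOrder(@RequestBody OrderEvent event) {
        repository.save(event);                                           // DB insert/update inside the transaction
        source.output().send(MessageBuilder.withPayload(event).build());  // Kafka send before the commit
    }
}

If the intent is for the send itself to be transactional, the Kafka binder's producer transactions are, as far as I know, enabled with spring.cloud.stream.kafka.binder.transaction.transaction-id-prefix, and a consumer that must not see records before the commit would additionally need isolation.level=read_committed.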
I'm trying to bring up a simple pub-sub application using the Spring Cloud Kafka binder. However, I'm unable to set the serializer, deserializer, and other producer and consumer properties in application.yml. I consistently get…
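Assuming the failure is about where the native Kafka client properties go, one common shape is a custom serializer wired in through the binder's configuration map; the class name, payload type and property paths below are assumptions:

import java.util.Map;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Serializer;
import com.fasterxml.jackson.databind.ObjectMapper;

// Hypothetical serializer. It would typically be referenced from application.yml via
//   spring.cloud.stream.kafka.bindings.<bindingName>.producer.configuration.value.serializer: com.example.OrderEventSerializer
// (or binder-wide under spring.cloud.stream.kafka.binder.configuration).
public class OrderEventSerializer implements Serializer<OrderEvent> {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) { }

    @Override
    public byte[] serialize(String topic, OrderEvent data) {
        try {
            return data == null ? null : mapper.writeValueAsBytes(data);
        } catch (Exception e) {
            throw new SerializationException("Failed to serialize record for " + topic, e);
        }
    }

    @Override
    public void close() { }
}

If I remember correctly, the binding also needs useNativeEncoding: true on the producer side for the native serializer to be used instead of the binder's own conversion.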
I am using the Kafka Streams API in a Java application (Spring Cloud Stream). I have a particular use case, as follows:
My application will consume from topic A, and produce and consume to/from topic B.
For each message on topic A there is a set of…
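A sketch of the topic A → topic B leg with the Kafka Streams binder, under the assumption that the @StreamListener style is in use; binding names, key/value types and the transformation are placeholders:

import org.apache.kafka.streams.kstream.KStream;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.Output;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.messaging.handler.annotation.SendTo;

// Hypothetical bindings: "topicA" and "topicB" are mapped to the real topics in application.yml.
interface AToBStreams {

    @Input("topicA")
    KStream<String, String> topicA();

    @Output("topicB")
    KStream<String, String> topicB();
}

@EnableBinding(AToBStreams.class)
class AToBProcessor {

    // Reads each record from topic A and forwards a transformed record to topic B.
    @StreamListener("topicA")
    @SendTo("topicB")
    public KStream<String, String> process(KStream<String, String> input) {
        return input.mapValues(value -> value.toUpperCase()); // placeholder per-record processing
    }
}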
I'm using a KStream parameter in @StreamBuilder.
This will create a DefaultBinding through KStreamBinder.
My requirement is to use Binding visualization and control.
However
return new DefaultBinding<>(name, null, outboundBindTarget, null);
You…
I am using the Processor API to delete messages from a state store. The delete works successfully; I confirmed it by running an interactive query on the state store by Kafka key, but it does not reduce the Kafka Streams file size on local disk under the directory…
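For context, deleting by key from a Processor API state store usually looks like the sketch below (store name and types are placeholders). The delete writes a tombstone; in my understanding the RocksDB files on disk only shrink once compaction runs, which is why the directory size does not drop immediately:

import org.apache.kafka.streams.processor.Processor;
import org.apache.kafka.streams.processor.ProcessorContext;
import org.apache.kafka.streams.state.KeyValueStore;

// Hypothetical processor; "my-store" is a placeholder store name.
public class PurgingProcessor implements Processor<String, String> {

    private KeyValueStore<String, String> store;

    @Override
    @SuppressWarnings("unchecked")
    public void init(ProcessorContext context) {
        store = (KeyValueStore<String, String>) context.getStateStore("my-store");
    }

    @Override
    public void process(String key, String value) {
        if (value == null) {          // e.g. treat a null payload as a purge request
            store.delete(key);        // removes the entry from the state store
        } else {
            store.put(key, value);
        }
    }

    @Override
    public void close() { }
}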
I'm using Spring Cloud Stream library in a Java application. I want to use the Kafka Streams binder for a state store. The application will post messages to a topic, and I wish to use the Kafka Streams InteractiveQueryService to retrieve data from…
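A minimal sketch of reading from a key-value store through the binder's InteractiveQueryService, assuming a store named "order-store" and String keys/values (all placeholder names):

import org.apache.kafka.streams.state.QueryableStoreTypes;
import org.apache.kafka.streams.state.ReadOnlyKeyValueStore;
import org.springframework.cloud.stream.binder.kafka.streams.InteractiveQueryService;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical REST facade over a state store built elsewhere in the topology.
@RestController
public class OrderQueryController {

    private final InteractiveQueryService queryService;

    public OrderQueryController(InteractiveQueryService queryService) {
        this.queryService = queryService;
    }

    @GetMapping("/orders/{id}")
    public String findOrder(@PathVariable String id) {
        // Looks up the queryable key-value store by name.
        ReadOnlyKeyValueStore<String, String> store =
                queryService.getQueryableStore("order-store", QueryableStoreTypes.keyValueStore());
        return store.get(id);
    }
}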
My application has three topics that receive some events belonging to users:
Event Type A -> Topic A
Event Type B -> Topic B
Event Type C -> Topic C
This would be an example of the flow of messages:
Message(user 1 - event A - 2020-01-03)…
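The question is cut off above, so purely as an illustration of receiving all three event topics in one Kafka Streams application, a merge keyed by user could look like this (binding names, types and the merge itself are assumptions):

import org.apache.kafka.streams.kstream.KStream;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.StreamListener;

// Three input bindings, one per event topic, merged into a single stream keyed by user id.
interface UserEventStreams {
    @Input("eventsA") KStream<String, String> eventsA();
    @Input("eventsB") KStream<String, String> eventsB();
    @Input("eventsC") KStream<String, String> eventsC();
}

@EnableBinding(UserEventStreams.class)
class UserEventAggregator {

    @StreamListener
    public void aggregate(@Input("eventsA") KStream<String, String> a,
                          @Input("eventsB") KStream<String, String> b,
                          @Input("eventsC") KStream<String, String> c) {
        a.merge(b).merge(c)
         .foreach((userId, event) -> System.out.println(userId + " -> " + event));
    }
}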
Below is the code for branching; it streams to only one topic (the first one). As I understand it, it should stream to all three topics?
Is there any way I can stream to three topics using branch? (A sketch follows after the snippet below.)
@Bean
public Function,…
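A sketch of how branching is commonly expressed with the functional Kafka Streams binder: the function returns the array produced by branch(), and each array element gets its own output binding (e.g. process-out-0, process-out-1, process-out-2 in configuration). The predicates and types are placeholders:

import java.util.function.Function;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BranchingConfig {

    // Each element of the returned array is routed to its own output destination.
    @Bean
    public Function<KStream<String, String>, KStream<String, String>[]> process() {
        return input -> input.branch(
                (key, value) -> value.startsWith("A"), // first output topic
                (key, value) -> value.startsWith("B"), // second output topic
                (key, value) -> true                   // everything else to the third
        );
    }
}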
We have a Kafka listener consuming messages from a topic. We want to make this bean functional so we can spin up multiple instances of the function using a serverless architecture when there is heavy load. Can anyone show me the right direction?
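One possible direction, sketched under the assumption that the listener can become a spring-cloud-stream Consumer bean; the bean name, payload type and binding destination are placeholders:

import java.util.function.Consumer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.Message;

// Illustrative functional consumer. The consumed topic would be set via
// spring.cloud.stream.bindings.handleOrder-in-0.destination in configuration.
@Configuration
public class OrderFunctions {

    @Bean
    public Consumer<Message<String>> handleOrder() {
        return message -> {
            // same processing the @KafkaListener used to do
            System.out.println("Received: " + message.getPayload());
        };
    }
}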
The application I am running would behave as a consumer in Kafka Streams. I have Kafka messaging configured via Spring Cloud Stream.
How do I figure out whether it's consuming from the right topic?
Also, what is SINK in…
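For orientation, the "sink" is simply the inbound side of a binding. A sketch with the classic Sink interface, where the consumed topic is whatever spring.cloud.stream.bindings.input.destination points at (topic name assumed):

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;

// Sink.INPUT is a single input channel named "input".
@EnableBinding(Sink.class)
public class EventListener {

    @StreamListener(Sink.INPUT)
    public void handle(String payload) {
        // log the payload to confirm the expected topic is actually being consumed
        System.out.println("Consumed: " + payload);
    }
}

In my experience the binder also logs the subscribed topic, partitions and consumer group at startup, which is a quick way to confirm the application is reading the expected topic.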
I have code similar to the one below:
KafkaConsumer kafkaConsumer = new KafkaConsumer<>(properties);
partitions = Collections.singletonList(new TopicPartition(topic,…
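For reference, a self-contained version of that manual-assignment pattern might look like the following; broker address, topic name and partition number are placeholders:

import java.time.Duration;
import java.util.Collections;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

// Illustrative: manually assign one partition and read it from the beginning.
public class SinglePartitionReader {

    public static void main(String[] args) {
        Properties properties = new Properties();
        properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        properties.put(ConsumerConfig.GROUP_ID_CONFIG, "partition-reader");

        try (KafkaConsumer<String, String> kafkaConsumer = new KafkaConsumer<>(properties)) {
            List<TopicPartition> partitions =
                    Collections.singletonList(new TopicPartition("my-topic", 0));
            kafkaConsumer.assign(partitions);          // no group rebalancing, fixed partition
            kafkaConsumer.seekToBeginning(partitions); // start from the earliest offset

            ConsumerRecords<String, String> records = kafkaConsumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println(record.offset() + ": " + record.value());
            }
        }
    }
}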
Spring Cloud Stream Kafka, KTable as input not working
Sink.java
public interface EventSink {
@Input("inputTable")
KTable<?, ?> inputTable();
}
MessageReceiver.java
@EnableBinding(EventSink.class)
public class MessageReceiver {
…
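A sketch of how the receiving side of a KTable input is usually written with this binder, assuming the @StreamListener/@Input combination and the destination/serde settings living in application.yml (the body is illustrative, not the asker's code):

import org.apache.kafka.streams.kstream.KTable;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.StreamListener;

@EnableBinding(EventSink.class)
public class MessageReceiver {

    // The KTable bound to "inputTable" is injected into the handler parameter.
    @StreamListener
    public void process(@Input("inputTable") KTable<?, ?> inputTable) {
        // convert to a changelog stream just to observe the incoming updates
        inputTable.toStream()
                  .foreach((key, value) -> System.out.println(key + " -> " + value));
    }
}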
I'm trying to prepare our application for the next Spring Cloud Stream release (currently using 3.0.0.RC1).
Using the Kafka Binder.
Right now we receive one message, process it, and resend it to another topic. Handling each message separately results in…
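For the functional model in 3.0, the receive-process-resend shape typically becomes a Function bean like the sketch below; the bean name, payload type and processing step are assumptions:

import java.util.function.Function;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Illustrative 3.0-style function: input and output topics are mapped in configuration
// via spring.cloud.stream.bindings.relay-in-0.destination / relay-out-0.destination.
@Configuration
public class RelayConfig {

    @Bean
    public Function<String, String> relay() {
        return payload -> payload.trim();   // placeholder for the real processing step
    }
}

If the part cut off above is about throughput, the Kafka binder also has a consumer batch-mode option (spring.cloud.stream.bindings.<binding>.consumer.batch-mode=true) that delivers a whole list of payloads per invocation, as far as I recall.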
I have a Kafka stream processing application written using Spring Boot, using spring-cloud-function and spring-cloud-stream-binder-kafka-streams. The method which processes a couple of streams is annotated with @Bean, so that it should be picked up…
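Such a @Bean-annotated stream processor typically has a shape like the following sketch, here assumed to join two input streams; binding names (enrich-in-0, enrich-in-1, enrich-out-0), types and the join logic are placeholders:

import java.time.Duration;
import java.util.function.BiFunction;
import org.apache.kafka.streams.kstream.JoinWindows;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Illustrative two-stream processor picked up by spring-cloud-function.
@Configuration
public class EnrichConfig {

    @Bean
    public BiFunction<KStream<String, String>, KStream<String, String>, KStream<String, String>> enrich() {
        return (orders, customers) ->
                orders.join(customers,
                            (order, customer) -> order + "|" + customer, // placeholder value joiner
                            JoinWindows.of(Duration.ofMinutes(5)));      // placeholder join window
    }
}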
I have to set a custom header in the Kafka message; my Kafka cluster natively supports headers (1.x.x). I am currently using springCloudVersion=Finchley.RELEASE.
When I am setting the property
default:
  producer:
    headerMode: none
None of the…
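Setting a custom header on the outgoing message itself is normally done through MessageBuilder, roughly as sketched below; the header name, payload and Source usage are assumptions:

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Source;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

// Illustrative producer that attaches a custom header before sending.
@EnableBinding(Source.class)
public class HeaderPublisher {

    private final Source source;

    public HeaderPublisher(Source source) {
        this.source = source;
    }

    public void publish(String payload) {
        Message<String> message = MessageBuilder.withPayload(payload)
                .setHeader("x-event-type", "created") // placeholder header name and value
                .build();
        source.output().send(message);
    }
}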