I am looking for instructions on configuring sasl.jaas.config when you have two separate topics, each with its own connection key. I am using spring-cloud-starter-stream-kafka version 3.1. I am not using…
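One approach, sketched here without verification against every binder version: the Kafka binder accepts per-binding client property overrides under `spring.cloud.stream.kafka.bindings.<bindingName>.consumer.configuration`, so each topic's binding can carry its own `sasl.jaas.config`. The binding names and credentials below are hypothetical placeholders.

```yaml
spring:
  cloud:
    stream:
      kafka:
        bindings:
          topicA-in-0:            # hypothetical binding name
            consumer:
              configuration:
                sasl.jaas.config: >
                  org.apache.kafka.common.security.plain.PlainLoginModule required
                  username="userA" password="keyA";
          topicB-in-0:            # hypothetical binding name
            consumer:
              configuration:
                sasl.jaas.config: >
                  org.apache.kafka.common.security.plain.PlainLoginModule required
                  username="userB" password="keyB";
```

If per-binding overrides do not take effect for security settings in your version, the fallback is declaring two separate binders under `spring.cloud.stream.binders`, each with its own environment block.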
I am writing a Kafka producer and consumer using the Spring Cloud Stream Kafka binder. I want access to the following information in both the producer and the consumer:
a) Topic
b) Partition
c) Offset
I checked the documentation but am not really able to…
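On the consumer side, one way (a sketch, assuming the functional model and the record headers the Kafka binder populates via Spring for Apache Kafka) is to consume `Message<T>` instead of the bare payload and read the `KafkaHeaders` constants. The function name `consume` is hypothetical.

```java
import java.util.function.Consumer;

import org.springframework.context.annotation.Bean;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;

public class ConsumeConfig {

    @Bean
    public Consumer<Message<String>> consume() {
        return message -> {
            // The Kafka binder populates these headers on each received record
            String topic = message.getHeaders().get(KafkaHeaders.RECEIVED_TOPIC, String.class);
            Integer partition = message.getHeaders().get(KafkaHeaders.RECEIVED_PARTITION_ID, Integer.class);
            Long offset = message.getHeaders().get(KafkaHeaders.OFFSET, Long.class);
            System.out.printf("record from %s, partition %d, offset %d%n", topic, partition, offset);
        };
    }
}
```

On the producer side there is no received record, but the binder can publish the broker-assigned `RecordMetadata` (topic, partition, offset of the written record) to a channel named by the `recordMetadataChannel` producer property.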
How can I send a message with the new Spring Cloud Stream Kafka functional model?
The deprecated way looked like this:
public interface OutputTopic {

    @Output("output")
    MessageChannel output();
}
@Autowired
OutputTopic outputTopic;
public…
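For on-demand sends, the usual functional-model replacement is `StreamBridge` — a sketch, with the class and binding names here standing in for whatever the original code used:

```java
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.stereotype.Component;

@Component
public class OutputTopicSender {

    private final StreamBridge streamBridge;

    public OutputTopicSender(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    public void send(String payload) {
        // "output" is the binding name; map it to a topic via
        // spring.cloud.stream.bindings.output.destination
        streamBridge.send("output", payload);
    }
}
```

For a continuous source (rather than sends triggered by application code), a `java.util.function.Supplier` bean registered via `spring.cloud.function.definition` is the other option.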
A few examples of implementing HealthIndicator need a KafkaTemplate. I don't actually create a KafkaTemplate manually, but the HealthIndicator requires one. Is there a way to automatically grab the created KafkaTemplate (that uses the…
I have an application using Spring Cloud Stream Kafka.
For user-defined topics I can delete records from specified topics with the configuration mentioned below, but this configuration doesn't work for DLQ topics.
For example in the…
In the .yaml file, we have set
spring.cloud.stream.kafka.binder.configuration.enable.idempotence as true.
Now when the application starts up, we can see a log like
[kafka-producer-network-thread | test_clientId]…
There are some native configuration options like enableDlq in the Spring Cloud Stream Kafka binder, but I haven't found any examples of how to properly unit test that enableDlq is working. Are there any examples out there of how to…
I have a spring boot project using Kafka. I configured it with Spring Cloud Stream Kafka auto configuration. I want to create my topics automatically with 3 replicas and 1 day retention. For this I added replication factor and retention.ms to my…
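A sketch of the relevant binder settings, assuming a producer binding (the binding name `output` is hypothetical): the replication factor is a binder-level property, while arbitrary Kafka topic-level settings such as `retention.ms` go under the binding's `topic.properties` map and are applied when the binder provisions the topic.

```yaml
spring:
  cloud:
    stream:
      kafka:
        binder:
          autoCreateTopics: true       # default
          replicationFactor: 3
        bindings:
          output:                      # hypothetical binding name
            producer:
              topic:
                properties:
                  retention.ms: 86400000   # 1 day
```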
We are currently developing a Spring Cloud Stream application using Kafka Streams. (The problem seems to be specific to Kafka Streams rather than Spring Cloud Stream.)
Our processor needs to transform events from an incoming KStream and produce two…
I am using spring-cloud-stream-binder-kafka-streams:3.1.1 with the functional programming model. I have tried a number of combinations to set the group id, but the consumer always logs the group id as spring.application.name.
pom.xml
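With the Kafka Streams binder the consumer group id is the Kafka Streams `application.id`, which defaults to `spring.application.name` — setting a `group` on the binding is not what controls it. A sketch of the per-function override (the function name `process` is hypothetical):

```yaml
spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            functions:
              process:
                applicationId: my-app-id   # becomes the consumer group id
```

A binder-level `applicationId` under `spring.cloud.stream.kafka.streams.binder` is the alternative when the application has a single processor.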
We are using Spring Cloud Stream 2.2 with the Kafka binder. We have noticed that if the pod is killed in the middle of processing for whatever reason, the message that should go to the DLQ is lost.
We are managing exceptions by catching…
Is it possible to use two functions where the output of the first function is the input of the second one?
My functions:
@Configuration
public class StringStream {
@Bean
public Supplier> stringSupplier() {
return () ->…
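Yes — with the functional model, function beans can be composed with the pipe operator, e.g. `spring.cloud.function.definition: stringSupplier|upperCase`. Outside of Spring the composition itself is plain `java.util.function`; a minimal sketch of what the pipe does (bean names here are hypothetical):

```java
import java.util.function.Function;
import java.util.function.Supplier;

public class ComposeDemo {

    // stands in for the stringSupplier() bean
    static Supplier<String> stringSupplier() {
        return () -> "hello";
    }

    // stands in for a second function bean
    static Function<String, String> upperCase() {
        return String::toUpperCase;
    }

    public static void main(String[] args) {
        // "stringSupplier|upperCase": the supplier's output is fed through
        // the function before it reaches the output binding
        System.out.println(upperCase().apply(stringSupplier().get())); // prints HELLO
    }
}
```

The composed unit behaves as a single supplier, so it binds to one output destination.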
Is it possible to enable spring.cloud.stream.kafka.streams.binder.auto-create-topics in the application.yml only for topics that the application produces and not auto-create topics that the application consumes? I'd like for my producers to only be…
I have a problem with StreamListener initialization that I cannot solve.
I am using Spring Cloud Stream Kafka and Spring Cache in my project. Spring Cache is initialized after SmartLifecycle's start() method, but the StreamListener starts consuming…
I want to inject a bean into a customized ConsumerInterceptor, now that ConsumerConfigCustomizer was added in Spring Cloud Stream 3.0.9.RELEASE. However, the injected bean is always NULL.
Foo (The dependency to be injected into…
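A common workaround, sketched here (the exact `ConsumerConfigCustomizer` signature and package may differ by version, and the config key `foo.bean` is hypothetical): Kafka instantiates interceptors reflectively, so Spring never injects their fields. Instead, stash the bean in the consumer configuration map via the customizer, then pull it back out in the interceptor's `configure()` callback.

```java
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerInterceptor;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;
import org.springframework.context.annotation.Bean;

public class InterceptorConfig {

    // 1) Put the Spring-managed Foo into the consumer configuration map
    //    (assumes the three-argument customizer callback).
    @Bean
    public ConsumerConfigCustomizer fooConfigCustomizer(Foo foo) {
        return (consumerProperties, bindingName, destination) ->
                consumerProperties.put("foo.bean", foo);
    }
}

// 2) The interceptor is created by Kafka via reflection, so @Autowired fields
//    stay null; retrieve the dependency from the config map instead.
class FooInterceptor implements ConsumerInterceptor<String, String> {

    private Foo foo;

    @Override
    public void configure(Map<String, ?> configs) {
        this.foo = (Foo) configs.get("foo.bean");  // placed there by the customizer
    }

    @Override
    public ConsumerRecords<String, String> onConsume(ConsumerRecords<String, String> records) {
        // foo is available here
        return records;
    }

    @Override
    public void onCommit(Map<TopicPartition, OffsetAndMetadata> offsets) { }

    @Override
    public void close() { }
}
```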