I have a Kafka topic that has compaction enabled by setting cleanup.policy=compact.
My segment.bytes property is set to a slightly larger value (100 MB) so that my brokers perform well.
If I have a Kafka streams application that is using the…
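As a point of reference, compaction and segment size can be set per topic with the kafka-configs tool; the topic name and broker address below are placeholders:

```shell
# cleanup.policy=compact enables log compaction for the topic.
# segment.bytes=104857600 (100 MB) controls when a segment rolls; the
# cleaner only compacts closed segments, so very large segments delay
# compaction of recent data.
kafka-configs.sh --bootstrap-server localhost:9092 \
  --alter --entity-type topics --entity-name my-compacted-topic \
  --add-config cleanup.policy=compact,segment.bytes=104857600
```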
I am using Spring Cloud to consume a Kafka topic, do some processing, and store the result in MongoDB. I noticed that if my consumer is slow at processing, memory consumption climbs rapidly until it brings the service down.
Further Analysis…
I'm currently using @StreamListener to consume messages one by one and then process them in our service class to save them in the DB.
Instead, I want to consume a list of messages (say 100) at a time and then process them so as to save all these 100 messages at…
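A minimal sketch of the batch-mode approach, assuming Spring Cloud Stream 3.x with the functional model; the binding name `fooConsumer-in-0`, the `MyEntity` payload type, and the `MyRepository` bean are all placeholders:

```java
// application.yml (assumed binding name):
//   spring.cloud.stream.bindings.fooConsumer-in-0.consumer.batch-mode: true

@Bean
public Consumer<List<MyEntity>> fooConsumer(MyRepository repository) {
    return batch -> {
        // With batch-mode enabled, the binder delivers up to max.poll.records
        // messages per invocation instead of one at a time.
        repository.saveAll(batch); // one bulk write instead of ~100 single inserts
    };
}
```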
I have a simple Spring Cloud Stream setup.
The interface
public interface MyKafkaBinding {

    @Output(PUBLISHER)
    MessageChannel publisher();

    @Input("subscriber")
    SubscribableChannel…
I read the Spring Cloud Stream 3.0 documentation to understand the new functional model, which uses java.util.function.[Supplier/Function/Consumer] to represent producers, processors, and consumers, and this much seems correct.
But I don't understand…
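For orientation, the functional model maps the three java.util.function types onto messaging roles; this is a plain-Java illustration of those roles with no Spring types involved:

```java
import java.util.function.Consumer;
import java.util.function.Function;
import java.util.function.Supplier;

public class FunctionalModelDemo {

    // Supplier -> producer binding: emits a payload each time it is polled
    static final Supplier<String> SOURCE = () -> "hello";

    // Function -> processor binding: one input binding, one output binding
    static final Function<String, String> UPPERCASE = String::toUpperCase;

    // Consumer -> consumer binding: terminal sink, no output binding
    static final Consumer<String> SINK = System.out::println;

    public static void main(String[] args) {
        SINK.accept(UPPERCASE.apply(SOURCE.get())); // prints HELLO
    }
}
```

In Spring Cloud Stream, registering such a bean is enough; the framework derives binding names like `uppercase-in-0`/`uppercase-out-0` from the bean name.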
I've been trying to get Spring Cloud Stream to work with Kafka Streams for a while now,
my project uses embedded Kafka for testing with the Kafka DSL, and I used this repository as a base for my test implementation (it is itself a set of test cases for this…
In my local Kafka setup, messages are consumed successfully with the application configuration below:
spring:
  cloud:
    stream:
      kafka:
        binder:
          replicationFactor: 1
          auto-create-topics: true
          brokers:…
My question is similar to the following question: I want to add a health check for a KStream application written using the functional approach.
Spring Actuator + Kafka Streams - Add kafka stream status to health check endpoint
The answers in the link above are…
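For context, Spring Cloud Stream ships a binder health indicator that Actuator can expose; a configuration sketch (the endpoint-exposure settings are an assumption about your setup):

```yaml
management:
  endpoints:
    web:
      exposure:
        include: health
  endpoint:
    health:
      show-details: always   # show per-binder detail, incl. Kafka Streams state
  health:
    binders:
      enabled: true          # include binder health in /actuator/health
```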
I am getting the following exception in my microservice since my last release. As the exception is truncated (ellipsis ...), I am not able to make sense of it.
As per my understanding, Kafka records can have null for the key (since then the RoundRobinStragey…
I am using spring-cloud-stream-binder-kafka and have implemented stateful retry using DefaultErrorHandler. I found that by enabling the deliveryAttemptHeader container property I can access the retry count, i.e. the deliveryAttempt count, from the message…
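A sketch of the two pieces involved, assuming the functional model (the function name is a placeholder); the header is only populated after deliveryAttemptHeader is enabled on the binder-created containers:

```java
// Enable the delivery-attempt header on the listener containers.
@Bean
public ListenerContainerCustomizer<AbstractMessageListenerContainer<?, ?>> customizer() {
    return (container, destination, group) ->
            container.getContainerProperties().setDeliveryAttemptHeader(true);
}

@Bean
public Consumer<Message<String>> process() {
    return message -> {
        // KafkaHeaders.DELIVERY_ATTEMPT ("kafka_deliveryAttempt"); depending on
        // header mapping it may arrive as an Integer or as raw bytes.
        Object attempt = message.getHeaders().get(KafkaHeaders.DELIVERY_ATTEMPT);
        // ...
    };
}
```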
I've compiled my app, which contains a Kafka Streams stream, with the Maven -Pnative native profile and started it with -Dspring.aot.enabled=true -Dspring.profiles.active=dev, but the startup fails with the following error:
Caused by:…
I'm trying to figure out the poll-records mechanism for Kafka over Spring Cloud Stream (SCS) in a K8s environment.
What is the recommended way to control max.poll.records?
How can I poll the defined value?
Is it possible to define it once for all channels and then…
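To answer the "once for all channels" part: the Kafka binder lets you set raw consumer properties globally and override them per binding; a sketch (the binding name `process-in-0` is an example):

```yaml
spring:
  cloud:
    stream:
      kafka:
        binder:
          # applies to all consumer bindings created by this binder
          consumer-properties:
            max.poll.records: 100
        bindings:
          process-in-0:
            consumer:
              # per-binding override
              configuration:
                max.poll.records: 25
```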
StreamBridge cannot be injected (using @Autowired) into a service and causes the following error:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type…
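For comparison, this is the usual injection pattern; the StreamBridge bean is auto-configured only when spring-cloud-stream plus a binder (e.g. spring-cloud-stream-binder-kafka) is on the classpath, which is a common cause of this NoSuchBeanDefinitionException. The class and binding names below are placeholders:

```java
@Service
public class PublishingService {

    private final StreamBridge streamBridge;

    // Constructor injection; @Autowired is implicit for a single constructor.
    public PublishingService(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    public void publish(String payload) {
        // "publisher-out-0" is a placeholder binding name.
        streamBridge.send("publisher-out-0", payload);
    }
}
```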
I have a Kafka consumer that is implemented using Spring's Kafka Streams API. The consumer looks something like this:
@Bean
public Consumer> fooProcess() {
    return input -> input
        .foreach((key, value) -> {
            …
I am using spring-cloud-stream-binder-kafka version 3.2.6 in my application. I have enabled batch mode so that I receive messages in batches. While handling the DLQ scenario, I observed that, in my batch, if I have 3 messages and if my last…
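One pattern worth knowing here: with batch mode and DefaultErrorHandler, spring-kafka's BatchListenerFailedException lets you point at the failing record, so records before it are committed and only the records from that index onward are retried or sent to the DLQ. A sketch (the function name, payload type, and handle() helper are placeholders):

```java
@Bean
public Consumer<List<Message<String>>> batchConsumer() {
    return messages -> {
        for (int i = 0; i < messages.size(); i++) {
            try {
                handle(messages.get(i)); // hypothetical per-record handler
            } catch (Exception e) {
                // Tells DefaultErrorHandler which record failed: records before
                // index i are committed; the rest are retried / DLQ'd.
                throw new BatchListenerFailedException("record " + i + " failed", e, i);
            }
        }
    };
}
```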