I am using the Spring Cloud Stream Kafka binder to produce messages into Kafka. I have set the producer's sync property to false:
spring.cloud.stream.kafka.bindings..producer.sync: false
I faced an issue while producing asynchronously to Kafka: some…
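For context, a minimal application.properties sketch of an asynchronous producer binding; the binding name `output-out-0` is a hypothetical placeholder, not from the question above:

```properties
# Hypothetical binding name; substitute your own
spring.cloud.stream.bindings.output-out-0.destination=my-topic
# sync=false (the default) makes sends fire-and-forget / asynchronous
spring.cloud.stream.kafka.bindings.output-out-0.producer.sync=false
```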
The goal
I must set up a group id for the Kafka Streams consumer that matches a strict naming convention.
I cannot find a way that works, despite following the documentation closely. Since I still believe I may have misunderstood something, I…
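For the plain Kafka binder, the consumer group is set per binding; a sketch with hypothetical binding and group names (for the Kafka Streams binder, the underlying group.id is instead derived from the application id):

```properties
# Hypothetical binding and group names
spring.cloud.stream.bindings.process-in-0.destination=orders
spring.cloud.stream.bindings.process-in-0.group=team-a.orders.consumer
```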
I've run into a problem where I need to repartition an existing topic (source) to a new topic (target) with a higher number of partitions (a multiple of the number of previous partitions).
The source topic was written to using Spring Cloud Stream…
I put @ServiceActivator on a method to log all messages that failed to be sent to Kafka:
@ServiceActivator(inputChannel = "errorChannel")
public void handleErrors(final ErrorMessage in) {
    log.error("encountered exception: " + in);
}
and I'm also setting…
I have a Spring Cloud Stream application with the Kafka binder that consumes and sends messages.
In the application I configure a custom error handler with a retry policy, and add a non-retryable exception to the handler. Configuration example:
@Bean
public…
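One way such a configuration is commonly written is via a ListenerContainerCustomizer; the following is a sketch only, assuming spring-kafka 2.8+ on the classpath, with illustrative back-off values and exception types:

```java
// Sketch; assumes spring-kafka 2.8+ and Spring Cloud Stream on the classpath
@Bean
public ListenerContainerCustomizer<AbstractMessageListenerContainer<?, ?>> containerCustomizer() {
    return (container, destinationName, group) -> {
        // Retry 3 times with a 1-second fixed back-off, then give up
        DefaultErrorHandler handler = new DefaultErrorHandler(new FixedBackOff(1000L, 3));
        // Exceptions of this type skip retries entirely (illustrative choice)
        handler.addNotRetryableExceptions(IllegalArgumentException.class);
        container.setCommonErrorHandler(handler);
    };
}
```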
In our project we are using spring-cloud-starter-stream-kafka to consume messages from Kafka. After adding @ComponentScan for the "org.springframework.cloud.stream.binder.kafka" package, the integration tests started failing with the…
The manual describes well how to consume simple messages with a java.util.function.Consumer and how to do pass-through processing in a single java.util.function.Function processor method, but it isn't clear how to produce a message…
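One common way to produce a message imperatively, outside a functional processor, is StreamBridge; a sketch with hypothetical type and binding names:

```java
// Sketch; the binding name "orders-out-0" and Order type are hypothetical
@Component
public class OrderPublisher {

    private final StreamBridge streamBridge;

    public OrderPublisher(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    public void publish(Order order) {
        // Sends to the named output binding; returns false if the message was not sent
        streamBridge.send("orders-out-0", order);
    }
}
```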
With a StreamBridge I send messages of two different object types to a single Kafka topic.
Is there a way to define a functional consumer with Spring Cloud Stream capable of consuming both types of messages?
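One approach, sketched here as an assumption rather than the canonical answer, is to consume `Message<?>` and dispatch on the payload type; note that without type information in the headers the payload may arrive as a raw byte[] and need explicit deserialization first:

```java
// Sketch; OrderCreated/OrderCancelled and the handler methods are hypothetical
@Bean
public Consumer<Message<?>> consume() {
    return message -> {
        Object payload = message.getPayload();
        if (payload instanceof OrderCreated) {
            handleOrderCreated((OrderCreated) payload);
        } else if (payload instanceof OrderCancelled) {
            handleOrderCancelled((OrderCancelled) payload);
        }
    };
}
```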
I'm not sure how to title this issue, but we have a Kafka producer project where this warning shows up in the logs.
o.a.k.clients.producer.ProducerConfig : The configuration 'internal.auto.downgrade.txn.commit' was supplied but isn't a known…
I noticed several customizers in KafkaBinderConfiguration that we can pass in to customize the binder. But in the case of multiple binders, the customizer beans won't be picked up. It may relate to this line in DefaultBinderFactory:
boolean…
We are using spring-kafka, and for non-Spring apps that communicate with us without setting the spring_json_header_types header, we specify that certain headers should be mapped in as Strings by adding them as rawMappedHeaders. We do this for…
I have multiple APIs that talk to each other through Kafka (producing and consuming messages). In one of the APIs I produce messages from an HTTP request trigger (when an endpoint is called, a message is produced and sent to Kafka) with @Output…
I'm trying to run my Kafka Streams application in the "Asia/Istanbul" timezone. I'm not talking about the timestamp-extractor feature of Kafka Streams; I mean that I want windowing to be aligned to a different timezone. And after processing has been…
We're testing the use of Kafka Streams via Spring Cloud Stream function support with Avro input/output records, but setting nativeEncoding=false and nativeDecoding=false in order to use a custom MessageConverter where we do the Avro conversion.
The…
We develop an internal company framework on top of Spring Boot and we'd like to support Kafka Streams with Spring Cloud Stream. We need to automagically inject some headers into all outbound messages. We've achieved this with standard Spring Cloud…
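For the Kafka Streams case, one technique (an assumption here, not necessarily the framework's chosen approach) is a plain kafka-clients ProducerInterceptor registered via the `interceptor.classes` producer config; class and header names below are illustrative:

```java
// Sketch; assumes kafka-clients on the classpath, names are illustrative
public class HeaderInjectingInterceptor implements ProducerInterceptor<byte[], byte[]> {

    @Override
    public ProducerRecord<byte[], byte[]> onSend(ProducerRecord<byte[], byte[]> record) {
        // Add a header to every outbound record before it is sent
        record.headers().add("x-company-trace", "framework".getBytes(StandardCharsets.UTF_8));
        return record;
    }

    @Override
    public void onAcknowledgement(RecordMetadata metadata, Exception exception) { }

    @Override
    public void close() { }

    @Override
    public void configure(Map<String, ?> configs) { }
}
```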