
Currently I use the following event producer:

import java.util.function.Supplier
import org.springframework.context.annotation.Bean
import org.springframework.messaging.Message
import org.springframework.stereotype.Component
import reactor.core.publisher.Flux
import reactor.core.publisher.Sinks

@Component
class GenericEventProducer : EventProducer<BaseEvent<*>> {
    override val sink: Sinks.Many<Message<BaseEvent<*>>> = Sinks.many().replay().latest()

    @Bean
    fun projekte() = Supplier<Flux<Message<BaseEvent<*>>>> { sink.asFlux().filter { it.payload is ProjektEvent } }
}

The EventProducer is a wrapper interface with a default method around sink.tryEmitNext(message). The advantage is that I can use the Spring Cloud Stream bindings and producer configuration.
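
For context, the wrapper could look roughly like this (a minimal sketch; only the delegation to sink.tryEmitNext(message) is stated above, so the method name emit and the orThrow() error handling are assumptions):

interface EventProducer<T> {
    val sink: Sinks.Many<Message<T>>

    // Hypothetical default method; all that is stated is that it wraps sink.tryEmitNext(message)
    fun emit(message: Message<T>) {
        sink.tryEmitNext(message).orThrow()
    }
}

A caller would then publish with something like emit(MessageBuilder.withPayload(event).build()), using org.springframework.messaging.support.MessageBuilder.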

Now that the reactive Kafka binder has matured into 1.x, the question naturally arises whether it could and should be used instead of this sink-based solution. I couldn't find proper examples of its usage, and documentation on ReactiveKafkaProducerTemplate is generally lacking.
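
For reference, standalone usage of ReactiveKafkaProducerTemplate (a low-level API below Spring Cloud Stream) would look roughly like this; the broker address, topic name, and String serializers are placeholder assumptions:

import org.apache.kafka.clients.producer.ProducerConfig
import org.apache.kafka.common.serialization.StringSerializer
import org.springframework.kafka.core.reactive.ReactiveKafkaProducerTemplate
import reactor.kafka.sender.SenderOptions

fun producerTemplate(): ReactiveKafkaProducerTemplate<String, String> {
    val props: Map<String, Any> = mapOf(
        ProducerConfig.BOOTSTRAP_SERVERS_CONFIG to "localhost:9092",
        ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG to StringSerializer::class.java,
        ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG to StringSerializer::class.java,
    )
    return ReactiveKafkaProducerTemplate(SenderOptions.create<String, String>(props))
}

// send() returns a Mono<SenderResult<Void>>; nothing is sent until it is subscribed:
// producerTemplate().send("projekte", "key", "payload").subscribe()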

Andras Hatvani
  • The docs for the reactive Kafka binder are here: https://docs.spring.io/spring-cloud-stream/docs/current/reference/html/spring-cloud-stream-binder-kafka.html#_reactive_kafka_binder. I see you have a `Supplier` bean in your configuration. That should be enough to bind it to a destination, and the respective binder on the classpath will handle it for you. Not sure why you worry about `ReactiveKafkaProducerTemplate`, since it is a low-level API outside the Spring Cloud Stream scope for end users. – Artem Bilan Jul 07 '23 at 14:15
  • And here is the `Supplier` doc explaining the `Flux` requirement: https://docs.spring.io/spring-cloud-stream/docs/current/reference/html/spring-cloud-stream.html#_suppliers_sources – Artem Bilan Jul 07 '23 at 14:19
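
Following up on the comments above: with spring-cloud-stream-binder-kafka-reactive on the classpath, the existing `Supplier` bean should bind as-is. A sketch of the corresponding properties (the binding name projekte-out-0 is derived from the projekte function name; the destination name is an assumption):

spring.cloud.function.definition=projekte
spring.cloud.stream.bindings.projekte-out-0.destination=projekte-events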

0 Answers