I want to check for duplicate messages in my Spring Cloud Stream Kafka consumers. This is what I have now:
```java
import java.util.function.Consumer;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.messaging.Message;
import org.springframework.stereotype.Component;
import org.springframework.transaction.annotation.Transactional;

@Component
public class MessageSubscriber {

    @Autowired
    MessageConsumer messageConsumer;

    // Functional binding: registered as a bean so Spring Cloud Stream binds it to a destination
    @Bean
    public Consumer<Message<PlaneEvent>> planeEventConsumer() {
        return event -> messageConsumer.consumePlaneEvent(event);
    }

    @Component
    public static class MessageConsumer {

        @Transactional
        @DuplicateCheck(key = "id")
        public void consumePlaneEvent(Message<PlaneEvent> msg) {
            // do something here
        }
    }
}
```
The custom `@DuplicateCheck` annotation uses Spring AOP to intercept the consumer method and check for duplicates in the database, using the specified key and the `kafka_receivedTopic` header from the `Message`.
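
For context, the aspect behind `@DuplicateCheck` looks roughly like this. It is a simplified sketch, not the exact implementation: `ProcessedMessageRepository` and its `existsByTopicAndKey` method are stand-ins for the real duplicate store, and the key is resolved from the payload with a plain property lookup.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.beans.BeanWrapperImpl;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;
import org.springframework.stereotype.Component;

// Marker annotation; "key" names the payload property used for deduplication.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface DuplicateCheck {
    String key();
}

@Aspect
@Component
public class DuplicateCheckAspect {

    @Autowired
    ProcessedMessageRepository repository; // stand-in for the real duplicate store

    @Around("@annotation(duplicateCheck)")
    public Object checkDuplicate(ProceedingJoinPoint pjp, DuplicateCheck duplicateCheck) throws Throwable {
        Message<?> message = (Message<?>) pjp.getArgs()[0];
        // KafkaHeaders.RECEIVED_TOPIC is the constant for the "kafka_receivedTopic" header
        String topic = message.getHeaders().get(KafkaHeaders.RECEIVED_TOPIC, String.class);
        // Resolve the configured key (e.g. "id") from the payload
        Object keyValue = new BeanWrapperImpl(message.getPayload()).getPropertyValue(duplicateCheck.key());
        if (repository.existsByTopicAndKey(topic, String.valueOf(keyValue))) {
            return null; // duplicate: skip processing
        }
        return pjp.proceed();
    }
}
```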
Is there a recommended pattern or better way to:

1. Intercept functional consumers? (I am aware of `ChannelInterceptor`, but would prefer not to use it for several reasons: it intercepts all channels, both input and output; filtering channels by regex would force me to enforce a channel naming scheme; and it would complicate skipping the processing of duplicate messages. I'd like something less intrusive.)
2. Get the Kafka topic or binding destination in the functional consumer? Relying on the `kafka_receivedTopic` header seems brittle and inflexible (see the sketch after this list).
3. Handle transactions with functional consumers? Since the `@Transactional` annotation creates dynamic proxies, I'm having to wrap the consumer logic in another class and register that class as a separate bean.
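
For illustration, this is what the header-based topic lookup looks like inside the functional consumer. `KafkaHeaders.RECEIVED_TOPIC` is just the constant for the `kafka_receivedTopic` header, so it still depends on the binder populating that header on every message.

```java
// Inside MessageSubscriber (sketch)
@Bean
public Consumer<Message<PlaneEvent>> planeEventConsumer() {
    return message -> {
        // The received topic is only available as a message header here,
        // which is the part that feels brittle and inflexible.
        String topic = message.getHeaders().get(KafkaHeaders.RECEIVED_TOPIC, String.class);
        messageConsumer.consumePlaneEvent(message);
    };
}
```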
These things were a little simpler with the older annotation-based consumers (`@StreamListener`).