
I have a use case where roughly 5k messages per second are sent to a Kafka topic. On the consumer side I need to aggregate all messages for each hour and write files hourly. We are just getting started with Kafka Streams and are wondering whether this hourly-aggregation use case is a good fit for it. Thanks!
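It is a good fit: in Kafka Streams this is a one-hour tumbling window (`TimeWindows`) over a grouped stream. The core idea, grouping each message into the hour bucket its timestamp falls in, can be sketched in plain Java (the class and method names below are illustrative, not part of any Kafka API):

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch: bucket messages by the hour of their timestamp,
// the same grouping a one-hour tumbling window performs in Kafka Streams.
public class HourlyBuckets {

    // Truncate an epoch-millis timestamp to the start of its hour (UTC).
    static long hourBucket(long epochMillis) {
        return Instant.ofEpochMilli(epochMillis)
                      .truncatedTo(ChronoUnit.HOURS)
                      .toEpochMilli();
    }

    private final Map<Long, List<String>> buckets = new HashMap<>();

    // Called once per consumed message.
    void add(long timestampMillis, String message) {
        buckets.computeIfAbsent(hourBucket(timestampMillis), k -> new ArrayList<>())
               .add(message);
    }

    // Remove and return one hour's batch, e.g. to write it to an hourly file.
    List<String> drain(long hourStartMillis) {
        List<String> batch = buckets.remove(hourStartMillis);
        return batch == null ? List.of() : batch;
    }
}
```

At 5k messages/second this is roughly 18M messages per hour, so in a real deployment the per-hour state would live in a Streams state store (or be flushed incrementally) rather than a plain in-memory map.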

Bartosz Wardziński
Leo Prince
  • It's certainly possible, yes. What issues are you having? – OneCricketeer Feb 08 '19 at 01:24
  • You could do that with Kafka Streams by implementing `Transformer` with `processorContext.schedule(..)`. Please take a look at this example (your case is simpler than the one in the example, as you only need to write data to a file inside the punctuator): https://stackoverflow.com/questions/53628143/how-to-process-a-kstream-in-a-batch-of-max-size-or-fallback-to-a-time-window/53696566#53696566 – Vasyl Sarzhynskyi Feb 08 '19 at 16:09
  • Thanks for your comments @cricket_007 and Vasiliy. I will try and keep you posted. – Leo Prince Feb 09 '19 at 15:03
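The `Transformer`-with-punctuator approach suggested in the comments boils down to two callbacks: buffer each record as it arrives, and flush the buffer on a schedule. A minimal plain-Java sketch of that cycle (class and method names are illustrative; in Kafka Streams, `transform()` would do the buffering and the `Punctuator` registered via `processorContext.schedule(..)` would do the flushing):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the buffer-and-flush cycle a punctuator performs.
public class HourlyFlusher {
    private final List<String> buffer = new ArrayList<>();

    // Called once per record (Transformer.transform in Kafka Streams).
    public void buffer(String message) {
        buffer.add(message);
    }

    // Called on the schedule (Punctuator.punctuate in Kafka Streams).
    // Returns the flushed batch; a real implementation would write it
    // to an hourly file instead of returning it.
    public List<String> flush() {
        List<String> batch = new ArrayList<>(buffer);
        buffer.clear();
        return batch;
    }
}
```

Note that an in-process buffer is lost on restart; the linked example keeps the pending records in a Streams state store so the hourly batch survives a rebalance or crash.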

0 Answers