My producer is Apache Kafka, and I want to consume a batch of events, process them, and write the processed events to the database. If I process the stream event by event, each event results in one query to the DB, and I don't want every event to hit the DB as a separate query. How can I batch up a number of events and write them to the DB as one bulk operation?

Note: We are using DataStream API

1 Answer

No, there isn't an official Neo4j sink for Flink. If your goal is to implement end-to-end exactly-once semantics with buffered, batched transactional updates, you might start by reading "An Overview of End-to-End Exactly-Once Processing in Apache Flink", and then reach out to the Flink user mailing list for further guidance.
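Leaving the Neo4j specifics aside, here is a minimal sketch (not an official pattern) of the buffering idea with the DataStream API: a `KeyedProcessFunction` that collects events in `ListState` and emits them downstream as a `List` either when a batch-size threshold is reached or after a processing-time timeout, so partial batches aren't held forever. `Event` is a placeholder record type, and `MAX_BATCH_SIZE` / `FLUSH_INTERVAL_MS` are made-up values for illustration.

```java
import org.apache.flink.api.common.state.ListState;
import org.apache.flink.api.common.state.ListStateDescriptor;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

import java.util.ArrayList;
import java.util.List;

// Placeholder event type -- substitute your own Kafka record class.
class Event {
    public String key;
    public String payload;
}

// Buffers events per key in ListState and emits them as a List<Event>
// once MAX_BATCH_SIZE is reached, or when a processing-time timer fires.
class EventBatcher extends KeyedProcessFunction<String, Event, List<Event>> {

    private static final int MAX_BATCH_SIZE = 500;        // assumed threshold
    private static final long FLUSH_INTERVAL_MS = 5_000L; // assumed timeout

    private transient ListState<Event> buffer;

    @Override
    public void open(Configuration parameters) {
        buffer = getRuntimeContext().getListState(
                new ListStateDescriptor<>("event-buffer", Event.class));
    }

    @Override
    public void processElement(Event event, Context ctx,
                               Collector<List<Event>> out) throws Exception {
        buffer.add(event);

        List<Event> batch = new ArrayList<>();
        buffer.get().forEach(batch::add);

        if (batch.size() >= MAX_BATCH_SIZE) {
            out.collect(batch);  // one downstream element = one bulk DB write
            buffer.clear();
        } else {
            // Arm a timer so a partially filled batch is still flushed.
            ctx.timerService().registerProcessingTimeTimer(
                    ctx.timerService().currentProcessingTime() + FLUSH_INTERVAL_MS);
        }
    }

    @Override
    public void onTimer(long timestamp, OnTimerContext ctx,
                        Collector<List<Event>> out) throws Exception {
        // Timers registered by already-flushed batches may find an
        // empty buffer; only emit when there is something to flush.
        List<Event> batch = new ArrayList<>();
        buffer.get().forEach(batch::add);
        if (!batch.isEmpty()) {
            out.collect(batch);
            buffer.clear();
        }
    }
}

// Usage sketch -- BulkDbSink is a hypothetical sink that issues one
// multi-row statement (e.g. a single UNWIND in Neo4j) per List<Event>:
// stream.keyBy(e -> e.key)
//       .process(new EventBatcher())
//       .addSink(new BulkDbSink());
```

Note that this buffers in keyed state, so batches form per key; for global batches you'd need a different key scheme or parallelism. Also, if the target were a relational database rather than Neo4j, the official flink-connector-jdbc sink already buffers internally: `JdbcExecutionOptions` lets you configure a batch size and flush interval, so no custom batching code would be needed there.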

David Anderson