
We are building a reconciliation system (a Spring Boot application) that asynchronously reads multiple events from multiple Kafka topics for a given order ID. For example, order ID 123 will eventually have 7 different events: event1, event2, event3, and so on up to event7. There can be considerable delays before an event is generated (some events may take 2 to 3 weeks), and the events do not arrive in any sequential order such as event1, event2, then event3.
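For context, this is roughly how the consumers look (a minimal sketch assuming Spring Kafka; the topic names, the `OrderEvent` payload type, and `ReconciliationService` are placeholders, not our real names):

```java
import java.util.Map;

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Placeholder payload type; the real event schemas differ per topic.
record OrderEvent(String orderId, String type, Map<String, Object> details) {}

@Component
public class OrderEventListener {

    private final ReconciliationService reconciliationService;

    public OrderEventListener(ReconciliationService reconciliationService) {
        this.reconciliationService = reconciliationService;
    }

    // Events for the same orderId can arrive weeks apart and in any order,
    // so the listener just hands each one off to the write path.
    @KafkaListener(topics = {"event1-topic", "event2-topic", "event3-topic"},
                   groupId = "order-reconciliation")
    public void onEvent(OrderEvent event) {
        reconciliationService.handleEvent(event.orderId(), event);
    }
}
```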

Expected traffic is around 1,000 orders/second in production. For any given order ID, only some of the event details will be saved in MongoDB (not the entire order response, which is huge). The number of event types may also grow in the future. Given this scenario, should we perform a batch insert/update for the given order ID every time a new event is generated, or how should the DB write operation be designed for optimum performance?
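To make the question concrete, the per-event write we are currently leaning towards looks roughly like this (a sketch assuming Spring Data MongoDB; the `order_reconciliation` collection and the `events.<type>` document layout are made up, and `OrderEvent` is the placeholder from the listener sketch above):

```java
import java.time.Instant;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;
import org.springframework.data.mongodb.core.query.Update;
import org.springframework.stereotype.Service;

@Service
public class ReconciliationService {

    private final MongoTemplate mongoTemplate;

    public ReconciliationService(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    // One upsert per event: each event $sets only its own sub-document, so
    // out-of-order arrivals for the same orderId never clobber each other.
    public void handleEvent(String orderId, OrderEvent event) {
        Query byOrderId = new Query(Criteria.where("_id").is(orderId));
        Update update = new Update()
                .set("events." + event.type(), event.details())
                .set("updatedAt", Instant.now());
        mongoTemplate.upsert(byOrderId, update, "order_reconciliation");
    }
}
```

The alternative we are weighing is buffering incoming events and flushing them in bulk on a short interval (e.g. via Spring Data's `BulkOperations`), which is the batch option the question above refers to.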

Cheers, Ragggyy
