
I'm trying to send data to Kafka after my database operation succeeds.

I have a POST endpoint which stores the data in MongoDB and returns the whole object along with the MongoDB UUID.

Now I want to perform an additional task: if the data is successfully saved in MongoDB, I should call my Kafka producer method and send the data.

I'm not sure how to do it.

Current Codebase

public Mono<?> createStock(StockDTO stockDTONBody) {
    // logger.info("Received StockDTO body: {}, ", stockDTONBody);
    
    Mono<StockDTO> stockDTO = mongoTemplate.save(stockDTONBody);

   // HERE I WANT TO SEND TO KAFKA IF DATA IS SAVED TO MONGO.

    return stockDTO;
}
  • In case your Kafka producer is reactive, just use one of the Reactor operators to continue the flow (e.g. `flatMap`). – Alex Jun 06 '22 at 14:55

2 Answers


Thanks @Alex for the help.

Adding my answer for others.

public Mono<?> createStock(StockDTO stockDTONBody) {
    // logger.info("Received StockDTO body: {}", stockDTONBody);

    Mono<StockDTO> stockDTO = mongoTemplate.save(stockDTONBody);

    // =============== Kafka code added ======================
    // flatMap runs only after the Mongo save has emitted the saved document.
    return stockDTO.flatMap(data -> sendToKafka(data, "create"));
}

public Mono<?> sendToKafka(StockDTO stockDTO, String eventName) {
    Map<String, Object> data = new HashMap<>();
    data.put("event", eventName);
    data.put("campaign", stockDTO);

    // Publish the event payload; the send result is logged but not awaited here.
    template.send(kafkaTopicName, data.toString()).log().subscribe();

    System.out.println("sending to Kafka " + eventName + data.toString());

    return Mono.just(stockDTO);
}
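
One thing worth noting about the snippet above: `template.send(...).subscribe()` detaches the Kafka publish from the request flow, so a failed send is never reported back to the caller. Assuming `template` is a `ReactiveKafkaProducerTemplate<String, String>` (its declaration is not shown in this answer), a variant that keeps the send inside the reactive chain could look roughly like this:

public Mono<StockDTO> sendToKafka(StockDTO stockDTO, String eventName) {
    Map<String, Object> data = new HashMap<>();
    data.put("event", eventName);
    data.put("campaign", stockDTO);

    // send(...) completes when the broker acknowledges the record;
    // thenReturn passes the saved entity on to the caller, and a failed
    // publish propagates as an error signal instead of being swallowed.
    return template.send(kafkaTopicName, data.toString())
            .log()
            .thenReturn(stockDTO);
}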

This can result in dual writes: if your data is saved in Mongo but something goes wrong while publishing to Kafka, that data will be missing from Kafka. Instead, you should use change data capture (CDC) for this. MongoDB provides change streams, which can be used here, or there are open-source Kafka connectors available that you can configure to listen to MongoDB's change log and stream those changes to Kafka.
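
As a rough illustration of the change-stream route, here is a minimal sketch (assuming a `ReactiveMongoTemplate` named `mongoTemplate`, the reactive Kafka producer `template` from the other answer, a MongoDB deployment that supports change streams, and an illustrative collection name "stocks"):

// Illustrative sketch: watch the "stocks" collection and forward each
// change event's document to Kafka as a long-lived background stream.
@PostConstruct
public void streamStockChangesToKafka() {
    mongoTemplate.changeStream("stocks", ChangeStreamOptions.empty(), StockDTO.class)
            .map(ChangeStreamEvent::getBody)   // the changed document (may be null for deletes)
            .filter(Objects::nonNull)
            .flatMap(stock -> template.send(kafkaTopicName, stock.toString()))
            .subscribe();
}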

  • I have thought about CDC as well, but I have never worked with it. I don't think we have the power to manipulate the data when implementing the CDC approach. What I'm doing is saving the data to Mongo and then trying to send it to Kafka; if Kafka is down or something bad has happened, I'll delete the record from MongoDB and tell the client that something bad has happened. I can do this because we know the range of ack times from our Kafka cluster. Not sure if this is a good approach or not. – Prakitidev Verma Dec 05 '22 at 14:35
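
For reference, the compensating approach described in that comment (save to Mongo, try to publish, delete the record if the publish fails) could be sketched roughly as follows; the rollback and error handling shown here are assumptions for illustration, not code from the thread:

public Mono<StockDTO> createStock(StockDTO stockDTONBody) {
    return mongoTemplate.save(stockDTONBody)
            .flatMap(saved -> template.send(kafkaTopicName, saved.toString())
                    .thenReturn(saved)
                    // Compensation: if the publish fails, delete the saved
                    // document and surface the failure to the client.
                    .onErrorResume(ex -> mongoTemplate.remove(saved)
                            .then(Mono.error(new IllegalStateException(
                                    "Kafka publish failed; stock save rolled back", ex)))));
}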