
I am trying to persist some data in MongoDB and publish a couple of messages to Kafka, all inside a single transaction. I was thinking of using the .NET TransactionScope, but I do not see how to apply it to the Confluent Kafka packages.


        var _database = client.GetDatabase(mongoUrl.DatabaseName);

        // TransactionScopeAsyncFlowOption.Enabled is required when awaiting inside the scope.
        using (var transactionScope = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
        {
            var drinkCollections = _database.GetCollection<Drinks>("drinks").AsTransactionCollection();
            var orders = _database.GetCollection<Orders>("orders").AsTransactionCollection();

            await _kafkaProducer.ProduceAsync("commands", "orderID", new OrderCommand() { Waiter = "John", Price = 3.35m });

            transactionScope.Complete();
        }

Obviously, if the ProduceAsync call fails to publish to Kafka, the MongoDB writes should be rolled back.

I am not sure how to implement the transaction. Any ideas would be appreciated.
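For what it's worth, the MongoDB half on its own can be done with the C# driver's session-based transactions rather than TransactionScope. A minimal sketch, assuming `client` is an `IMongoClient` against a replica set (MongoDB requires one for multi-document transactions) and reusing `_database`, `Drinks`, and `Orders` from the snippet above; `newOrder`, `filter`, and `update` are hypothetical placeholders:

```csharp
// Sketch: MongoDB multi-document transaction via driver sessions.
using (var session = await client.StartSessionAsync())
{
    session.StartTransaction();
    try
    {
        var drinks = _database.GetCollection<Drinks>("drinks");
        var orders = _database.GetCollection<Orders>("orders");

        // Pass the session so both writes join the same transaction.
        await orders.InsertOneAsync(session, newOrder);       // newOrder: hypothetical Orders instance
        await drinks.UpdateOneAsync(session, filter, update); // filter/update: hypothetical definitions

        await session.CommitTransactionAsync();
    }
    catch
    {
        await session.AbortTransactionAsync();
        throw;
    }
}
```

This only makes the MongoDB writes atomic with each other; it still does not enlist the Kafka produce in the same transaction.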

    Don't do two-phase commits. Ideally you would use Debezium to pull data out of Mongo and into a Kafka topic rather than needing to query, then produce. That way, the data is guaranteed to at least be in Mongo... Or you can produce directly to Kafka (with a transactional producer) and use the Mongo Sink connector to write to the database from Kafka. – OneCricketeer Jun 23 '22 at 18:37
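    The transactional producer the comment mentions is exposed directly by Confluent.Kafka. A sketch of that second option, where Kafka is the source of truth and the Mongo Sink connector writes to the database afterwards; the broker address and `TransactionalId` are placeholders, and `orderJson` is a hypothetical serialized `OrderCommand`:

```csharp
// Sketch: Confluent.Kafka transactional producer.
var config = new ProducerConfig
{
    BootstrapServers = "localhost:9092",  // placeholder
    TransactionalId = "order-producer-1"  // placeholder; must be stable per producer instance
};

using var producer = new ProducerBuilder<string, string>(config).Build();
producer.InitTransactions(TimeSpan.FromSeconds(10));

producer.BeginTransaction();
try
{
    producer.Produce("commands", new Message<string, string>
    {
        Key = "orderID",
        Value = orderJson  // hypothetical serialized OrderCommand
    });
    producer.CommitTransaction();  // messages become visible atomically
}
catch
{
    producer.AbortTransaction();   // read_committed consumers never see aborted messages
    throw;
}
```

    Note the Kafka transaction only covers the produces themselves; consumers must use `isolation.level=read_committed` to honor it.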
