
This is something of a best-practices question. If you have worked on this, please clarify with examples so that all of us can benefit!

For event-driven architectures with Kafka / Redis, what best practices should we follow when creating topics/streams for events?

Let's consider an online order-processing workflow.

I have read blogs recommending topics/streams like order-created-events, order-deleted-events, etc. But my question is: how is the ordering of messages guaranteed when we split them across multiple topics?

For example:

order-created-events could hold thousands of events that a consumer processes slowly, while order-deleted-events might hold only a few records, assuming only 5-10% of customers cancel their orders.

Now, suppose a user places an order and then immediately cancels it. The order-deleted-event may be processed first, because that topic/stream has far fewer messages queued ahead of it, while the order-created-event for the same order is still waiting for a consumer. This causes data inconsistency.
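To illustrate, here is a minimal sketch (in Java, with hypothetical topic and group names) of one of the two independent consumers; nothing coordinates it with the consumer reading order-deleted-events:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class CreatedEventsConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "order-service"); // hypothetical consumer group
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("order-created-events"));
            while (true) {
                ConsumerRecords<String, String> records =
                        consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // If this consumer lags behind, the cancellation for the
                    // same order may already have been handled by the consumer
                    // of order-deleted-events, causing the inconsistency above.
                    System.out.printf("creating order %s%n", record.key());
                }
            }
        }
    }
}
```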

Hopefully my question is clear. So, how should we design the topics/streams?

RamPrakash

1 Answer


Kafka guarantees message ordering only within a single partition.

So, to take advantage of Kafka's partitioning and partition-based load balancing, a single topic (like order) should be created with multiple partitions.

Then use a partitioner class to generate a key for every message, such that a given key always maps to the same partition.

That way, whether Order A is created, updated, or deleted, all of its events always land in the same partition.
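Here is a minimal sketch of such a partitioner class, assuming the order ID is sent as the message key. (Kafka's default partitioner already hashes keys the same way, so a custom class is only needed if you want explicit control.)

```java
import java.util.Map;
import org.apache.kafka.clients.producer.Partitioner;
import org.apache.kafka.common.Cluster;
import org.apache.kafka.common.utils.Utils;

public class OrderIdPartitioner implements Partitioner {
    @Override
    public int partition(String topic, Object key, byte[] keyBytes,
                         Object value, byte[] valueBytes, Cluster cluster) {
        if (keyBytes == null) {
            throw new IllegalArgumentException("order events must be keyed by order ID");
        }
        int numPartitions = cluster.partitionCountForTopic(topic);
        // Hash the order ID so every event for the same order always
        // maps to the same partition, preserving per-order ordering.
        return Utils.toPositive(Utils.murmur2(keyBytes)) % numPartitions;
    }

    @Override
    public void close() {}

    @Override
    public void configure(Map<String, ?> configs) {}
}
```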

To achieve proper sequencing, this should drive the topic design, rather than using two different topics for different activities.
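Putting it together, here is a sketch of a producer writing both events to a single topic (the topic name orders and the JSON payloads are assumptions), keyed by order ID so create and cancel stay ordered:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        // Optional: plug in the custom partitioner from above; the default
        // partitioner also hashes the key, so this line can be omitted.
        props.put("partitioner.class", OrderIdPartitioner.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            String orderId = "order-42"; // hypothetical order ID
            // Both events share the same key, so they go to the same
            // partition, and any consumer of that partition sees the
            // creation strictly before the cancellation.
            producer.send(new ProducerRecord<>("orders", orderId,
                    "{\"type\":\"ORDER_CREATED\",\"orderId\":\"" + orderId + "\"}"));
            producer.send(new ProducerRecord<>("orders", orderId,
                    "{\"type\":\"ORDER_CANCELLED\",\"orderId\":\"" + orderId + "\"}"));
        }
    }
}
```

For strict ordering even under producer retries, also set enable.idempotence=true (or max.in.flight.requests.per.connection=1), since retried batches can otherwise be reordered within the partition.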

GK7