This is probably a best-practices question. Could someone who has worked on this clarify with examples, so that all of us can benefit?
In event-driven architectures using Kafka or Redis Streams, what best practices should be followed when creating topics/streams for events?
Let's consider an online order-processing workflow.
I have read blog posts recommending one topic/stream per event type, e.g. `order-created-events`, `order-deleted-events`, etc. My question is: how is the order of messages guaranteed once we split events across multiple topics?
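To make the design concrete, here is roughly what I understand the producer side to look like (a minimal sketch; the order id and JSON payloads are placeholders I made up):

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            String orderId = "order-42"; // placeholder order id

            // Each event type goes to its own topic, keyed by order id.
            producer.send(new ProducerRecord<>("order-created-events", orderId,
                    "{\"orderId\":\"order-42\",\"type\":\"CREATED\"}"));

            // The same order is cancelled moments later; this event lands in a
            // different topic that is consumed independently of the first one.
            producer.send(new ProducerRecord<>("order-deleted-events", orderId,
                    "{\"orderId\":\"order-42\",\"type\":\"DELETED\"}"));
        }
    }
}
```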
For example: `order-created-events` might hold thousands of events being processed slowly by a consumer, while `order-deleted-events` holds only a few records, assuming only 5-10% of orders get cancelled.

Now suppose a user places an order and then immediately cancels it. Because `order-deleted-events` has almost no backlog, its consumer will process the deletion before any consumer reaches the matching order-created event for the same order. That causes data inconsistency.
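To illustrate the race from the consumer side, here is a sketch of the deleted-events consumer (the group id and handling logic are hypothetical); nothing in it can know whether the lagging created-events consumer has reached the matching order yet:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OrderDeletedConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "order-deleted-handler"); // independent of the created-events group
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("order-deleted-events"));
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofMillis(500))) {
                    // This may run before the order-created event for the same
                    // order id has been consumed, since the other topic's
                    // consumer can be thousands of messages behind.
                    System.out.printf("Deleting order %s%n", record.key());
                }
            }
        }
    }
}
```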
Hopefully my question is clear. So, how should one approach topic/stream design to avoid this?