
I understand that event sourcing is all about storing the events that represent changes in state rather than the state itself. In my case, I get messages from Kafka, and each message is encoded in JSON with 50 fields, like this: `{key1: val1, key2: val2, .......key50: val50}`. Every message contains all of these keys or a subset of them. My goal is to store this stream of messages as events in Cassandra, and to store only the changes in state I always need to know the current state so I can see what change the next message causes. I wonder how this is done in practice, or, more importantly, what the data in the datastore would look like.
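
For concreteness, this is roughly what I mean by "knowing the current state to see the change" (a minimal sketch in Java using Jackson; the class name, the per-aggregate map, and the idea of keeping the last state in memory are all assumptions for illustration, not my actual pipeline):

```java
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

// Hypothetical helper: diff an incoming Kafka message against the last known
// state so that only the changed fields would be recorded as an "event".
public class StateDiffer {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Last known state per aggregate. In a real system this would be rebuilt
    // by replaying the event store, not kept only in memory.
    private final Map<String, Map<String, Object>> currentState = new HashMap<>();

    public Map<String, Object> diff(String aggregateId, String messageJson) throws Exception {
        Map<String, Object> incoming =
                MAPPER.readValue(messageJson, new TypeReference<Map<String, Object>>() {});
        Map<String, Object> previous =
                currentState.getOrDefault(aggregateId, new HashMap<>());

        // Keep only the keys whose values actually changed.
        Map<String, Object> changed = new HashMap<>();
        for (Map.Entry<String, Object> entry : incoming.entrySet()) {
            if (!Objects.equals(previous.get(entry.getKey()), entry.getValue())) {
                changed.put(entry.getKey(), entry.getValue());
            }
        }

        // Apply the incoming message to get the new current state.
        Map<String, Object> next = new HashMap<>(previous);
        next.putAll(incoming);
        currentState.put(aggregateId, next);

        return changed; // the "change in state" I would want to persist as an event
    }
}
```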

user1870400
  • It depends on the serializer. Nevertheless, you can always have a persistent actor with a variable you can use to store your current state and compare it against the next request. – Branislav Lazic Mar 15 '17 at 07:49
  • So if my current state A = `{key1: val1, key2: val2, .......key50: val50}` and the next state B = `{key1: val1, key2: val2, .......key5: val5}` then Persistence actor would store A-B = `{key6: val6, key7: val7, .......key50: val50}` ? – user1870400 Mar 15 '17 at 10:29

1 Answer


An event in the datastore can look like this (stored in a SQL DB); a sketch of such a record follows the list:

  • id
  • uuid : the event-sourcing aggregate ID
  • playhead : the sequence number of the event within this aggregate's stream (e.g. event number 5)
  • type : the name of the event (e.g. CardWasCredited)
  • payload : all the data needed for this particular event (e.g. for a CardWasCredited event you would have amount, card_number, comment, ...etc.)
  • metadata : username, ...
  • recorded_on
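
As an illustration only (the Java class and field types below are my own sketch of the columns above, not any specific framework's API):

```java
import java.time.Instant;
import java.util.Map;
import java.util.UUID;

// Minimal sketch of one stored event, mirroring the columns listed above.
public final class StoredEvent {
    public final long id;                      // surrogate primary key
    public final UUID aggregateUuid;           // event-sourcing aggregate ID
    public final long playhead;                // position of the event in this aggregate's stream
    public final String type;                  // e.g. "CardWasCredited"
    public final Map<String, Object> payload;  // only the data this particular event needs
    public final Map<String, String> metadata; // username, ...
    public final Instant recordedOn;

    public StoredEvent(long id, UUID aggregateUuid, long playhead, String type,
                       Map<String, Object> payload, Map<String, String> metadata,
                       Instant recordedOn) {
        this.id = id;
        this.aggregateUuid = aggregateUuid;
        this.playhead = playhead;
        this.type = type;
        this.payload = payload;
        this.metadata = metadata;
        this.recordedOn = recordedOn;
    }
}
```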

Those events are then processed to build a read model which holds the current state (CQRS).
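
A rough sketch of that projection step, reusing the StoredEvent class sketched above (the class and method names here are mine, not a library's):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of a read-model projection: replay one aggregate's events in playhead
// order and fold their payloads into one "current state" map (CQRS read side).
public class CurrentStateProjection {

    public Map<String, Object> project(List<StoredEvent> eventsInPlayheadOrder) {
        Map<String, Object> state = new HashMap<>();
        for (StoredEvent event : eventsInPlayheadOrder) {
            // Each event only carries the fields it affected, so applying the
            // payloads in order rebuilds the full current state.
            state.putAll(event.payload);
        }
        return state;
    }
}
```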

Manel
  • Are you saying the payload `{key1: val1, key2: val2, .......key50: val50}` can be stored every time on every request? – user1870400 Mar 15 '17 at 10:27
  • If those values are needed for your event, yes. But be careful to understand the whole event-sourcing flow: Command -> Event -> State. Maybe your Kafka messages are commands and should be transformed into one or more domain events (each with its own subset of 'keys'). – Manel Mar 15 '17 at 10:28
  • My event is nothing more than, say, a row inserted into the database. There is really no business logic or computation per se. So would I end up storing the event name and the payload `{key1: val1, key2: val2, .......key50: val50}` every time? – user1870400 Mar 15 '17 at 10:33
  • 1
    If you just want to insert your message in a db row and if thoses keys are necessary then yes, store everything for each message. But if there is no "business" validation of the message to transform those commands into events, then its not eventsourcing, its just persisting your messages. – Manel Mar 15 '17 at 10:41
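
To make that last distinction concrete, the Command -> Event step could look roughly like this (the class names and the validation rule below are purely hypothetical, only meant to show where "business" validation would sit):

```java
import java.util.Map;
import java.util.Optional;

// Sketch of the Command -> Event step: the raw Kafka message is treated as a
// command, validated, and only then turned into a domain event. Without such a
// step you are simply persisting messages, not event sourcing.
public class MessageCommandHandler {

    public Optional<DomainEvent> handle(Map<String, Object> kafkaMessage) {
        // Hypothetical business rule: a message without key1 is rejected,
        // so no event is recorded for it.
        if (!kafkaMessage.containsKey("key1")) {
            return Optional.empty();
        }
        return Optional.of(new DomainEvent("MessageAccepted", kafkaMessage));
    }

    public static final class DomainEvent {
        public final String type;
        public final Map<String, Object> payload;

        public DomainEvent(String type, Map<String, Object> payload) {
            this.type = type;
            this.payload = payload;
        }
    }
}
```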