
Consider a use case where a microservice publishes/emits an event whose payload may contain hundreds of thousands of records. How would you pass this through your event store?

Are there any practices around this?

Is it a good idea to split the event into batches? (Even though the event happened all at once.)

Or publishing an event with this size should generally be avoided?

froi
  • Could you be more specific about what type of command generates such a huge payload? Are you simply uploading a document? Could the data be easily re-generated from an existing state and a versioned algorithm, so that the client would only have to get the initial state and the algorithm version that was run, and then re-generate the data using a shared library? Will this be external only, or also used to rehydrate the AR? – plalx Jun 10 '19 at 14:18
  • Let's say the payload is a large list of individual messages. – froi Jun 12 '19 at 06:32

1 Answer


If you are using Apache Kafka, you can implement a custom serializer that compresses the payload and returns a compressed byte array.
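As a minimal sketch of that idea in Python: the two helper functions below (the names `compress_serializer` and `decompress_deserializer` are my own, not from any library) gzip-compress a JSON-encoded payload and reverse it. With the kafka-python client you could pass them as the `value_serializer` of a `KafkaProducer` and the `value_deserializer` of a `KafkaConsumer`.

```python
import gzip
import json

def compress_serializer(value):
    """Serialize a Python object to JSON, then gzip-compress the bytes."""
    return gzip.compress(json.dumps(value).encode("utf-8"))

def decompress_deserializer(data):
    """Reverse the serializer: gunzip, then parse the JSON back."""
    return json.loads(gzip.decompress(data).decode("utf-8"))

# Example: a large list of individual messages, as in the question.
payload = [{"id": i, "msg": f"record {i}"} for i in range(100_000)]

raw = json.dumps(payload).encode("utf-8")
compressed = compress_serializer(payload)

# Highly repetitive record lists compress well.
print(len(compressed) < len(raw))

# Round trip must restore the original payload exactly.
assert decompress_deserializer(compressed) == payload
```

Note that Kafka producers also support built-in message compression via the `compression.type` setting (e.g. gzip, lz4, zstd), which may be simpler than a custom serializer; either way, also check the broker's maximum message size (`message.max.bytes`) before sending payloads this large.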

Ziv