I created a topic where I store change events of a customer entity. To process the content of that topic, I created a stream with
CREATE STREAM customerstream (event VARCHAR, content MAP<STRING, STRING>)
  WITH (KAFKA_TOPIC='mycustomers', VALUE_FORMAT='JSON');
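For completeness, sample events like the ones below can be fed into the stream without a separate producer client, e.g. straight from the ksqlDB CLI (just a sketch, assuming a ksqlDB version that supports INSERT INTO ... VALUES and the MAP constructor):

-- Sketch: insert the sample change events by hand.
INSERT INTO customerstream (event, content) VALUES ('create', MAP('name' := 'bob', 'location' := 'NY', 'id' := '1'));
INSERT INTO customerstream (event, content) VALUES ('update', MAP('location' := 'AM', 'id' := '1'));
INSERT INTO customerstream (event, content) VALUES ('update', MAP('location' := 'BER', 'id' := '1'));
INSERT INTO customerstream (event, content) VALUES ('update', MAP('name' := 'bob_new', 'id' := '1'));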
The content of the stream looks like the following
|EVENT   |CONTENT                      |
+--------+-----------------------------+
|create  |{name=bob, location=NY, id=1}|
|update  |{location=AM, id=1}          |
|update  |{location=BER, id=1}         |
|update  |{name=bob_new, id=1}         |
As you can see, the topic consists of simple create, delete, and update events. I could now write a client to reconstruct the entity from these events. However, I am wondering if there is a smarter way to do this in ksql so that I do not have to write a client.
The result should look like this
{
name:'bob_new',
location:'BER',
id:1
}
How would this look if delete and re-create events also existed?
EDIT:
With updates only, the following query works. However, it fails if an entity is deleted and later re-created with different values.
select content['id'], latest_by_offset(content['location']), latest_by_offset(content['name'])
from customerstream group by content['id'] emit changes;
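The same aggregation can also be persisted as a materialized table instead of a transient push query (a sketch; customer_state is a hypothetical name):

-- Sketch: materialize the latest field values per entity as a table.
CREATE TABLE customer_state AS
  SELECT content['id'] AS id,
         LATEST_BY_OFFSET(content['location']) AS location,
         LATEST_BY_OFFSET(content['name']) AS name
  FROM customerstream
  GROUP BY content['id']
  EMIT CHANGES;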
Example:
|EVENT   |CONTENT                      |
+--------+-----------------------------+
|create  |{name=bob, location=NY, id=1}|
|update  |{location=AM, id=1}          |
|delete  |{id=1}                       |
|create  |{name=new_person, id=1}      |
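For this sequence the query above keeps returning the stale location from before the delete, because LATEST_BY_OFFSET skips the NULLs produced by the missing map entries. A sketch of the incorrect output I would expect under that assumption:

|ID  |LOCATION |NAME       |
+----+---------+-----------+
|1   |AM       |new_person |

After the re-create, the entity should have no location at all.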