Questions tagged [ktable]

A KTable is a continuously updating materialized view.

The KTable concept is used by Kafka Streams and ksqlDB (both also use the concept of a KStream). A KTable is a materialized view that is continuously updated.

The data source of a KTable can be either a topic in Apache Kafka (i.e., each record in the topic is "upserted" into the KTable) or the result of a continuous query over an input KStream or KTable.
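
For example, a minimal Kafka Streams sketch in Java that materializes a topic as a KTable; the topic name and serdes here are illustrative:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.KTable;

    StreamsBuilder builder = new StreamsBuilder();

    // Each record in the topic is upserted into the table by key;
    // a record with a null value acts as a tombstone and deletes the key.
    KTable<String, String> table = builder.table(
        "states",                                         // hypothetical topic
        Consumed.with(Serdes.String(), Serdes.String()));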

72 questions
0
votes
1 answer

KTable vs ksqlDB

I'd like to understand the difference between KTable and ksqlDB. I need two data flows from my "states" topic: an actual snapshot of the state as a key-value store, and a subscription to events of state data changes. I may create a compacted topic and use a KTable as…
alex
  • 13
  • 3
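
One way to get both flows from a single compacted topic in Kafka Streams, sketched under the assumption of String serdes (the "states" topic name comes from the question; the store name is hypothetical). Deriving the change-event stream from the table itself avoids subscribing to the same topic twice in one topology:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.*;

    StreamsBuilder builder = new StreamsBuilder();

    // Flow 1: the current snapshot, queryable as a key-value store.
    KTable<String, String> snapshot = builder.table(
        "states",
        Consumed.with(Serdes.String(), Serdes.String()),
        Materialized.as("states-store"));

    // Flow 2: every state change, as a record stream derived from the table.
    KStream<String, String> changes = snapshot.toStream();
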
0
votes
1 answer

KTable causes unsubscribe from topics

I'm writing a basic Kafka Streams app in Java which reads Wikipedia events provided by a producer and attempts to count the number of created and recently changed events according to user type (bot or human). I created a custom serde for the…
meirgold
  • 83
  • 6
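
A minimal shape for such a count in the Java DSL, assuming the events arrive as JSON strings; the topic name and the isBot() helper are hypothetical stand-ins for the question's custom serde logic:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.*;

    StreamsBuilder builder = new StreamsBuilder();

    KStream<String, String> events = builder.stream(
        "wikipedia-events",                    // hypothetical topic
        Consumed.with(Serdes.String(), Serdes.String()));

    // Re-key each event by user type, then count per type.
    // isBot() is a hypothetical helper that inspects the payload.
    KTable<String, Long> countsByUserType = events
        .groupBy((key, value) -> isBot(value) ? "bot" : "human",
                 Grouped.with(Serdes.String(), Serdes.String()))
        .count(Materialized.as("counts-by-user-type"));
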
0
votes
1 answer

KTable GroupBy Function Overload in Kotlin

Attempting to replicate a simple word-count example for Kafka Streams. val groupedByWord: KTable = source.flatMapValues { value: String -> listOf( …
Mishka
  • 57
  • 3
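
For reference, the word count the snippet above is porting looks roughly like this in the plain Java DSL; in Kotlin, the same chain typically needs the generic types pinned down explicitly (e.g. via an explicit Grouped and typed lambda parameters) for the groupBy SAM conversion to resolve:

    import java.util.Arrays;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.*;

    StreamsBuilder builder = new StreamsBuilder();

    KStream<String, String> source = builder.stream(
        "text-input", Consumed.with(Serdes.String(), Serdes.String()));

    KTable<String, Long> groupedByWord = source
        .flatMapValues(value -> Arrays.asList(value.toLowerCase().split("\\W+")))
        // Re-key by the word itself before counting.
        .groupBy((key, word) -> word,
                 Grouped.with(Serdes.String(), Serdes.String()))
        .count(Materialized.as("counts-store"));
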
0
votes
0 answers

How to query a Kafka table periodically?

I have a use case where events are coming from a topic pattern (myTopic-cusId). I have set up a Kafka Streams listener which listens to the incoming events from these topics. What I have done so far is build a stream from the topic and create a…
nickp12
  • 21
  • 4
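
One way to poll a materialized KTable on a schedule is through interactive queries. A sketch, assuming the table was materialized as "my-store" and that streams is already RUNNING (store name and interval are illustrative):

    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StoreQueryParameters;
    import org.apache.kafka.streams.state.KeyValueIterator;
    import org.apache.kafka.streams.state.QueryableStoreTypes;
    import org.apache.kafka.streams.state.ReadOnlyKeyValueStore;

    ReadOnlyKeyValueStore<String, String> store = streams.store(
        StoreQueryParameters.fromNameAndType(
            "my-store", QueryableStoreTypes.keyValueStore()));

    Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(() -> {
        try (KeyValueIterator<String, String> all = store.all()) {
            all.forEachRemaining(kv -> System.out.println(kv.key + " = " + kv.value));
        }
    }, 0, 30, TimeUnit.SECONDS);
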
0
votes
1 answer

KStream - KTable Join - Patterns to Update Already Enriched Records when Reference Data on KTable Side Gets an Update

A question on KStream-KTable joins. Usually this kind of join is used for data-enrichment purposes, where the KTable provides reference data. So the question is: when a KTable record gets an update, how do we go about updating the older records…
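
A KStream-KTable join fires only when a new stream record arrives, so results that were already emitted are never revised. If older output must be refreshed when the reference data changes, one common option is to model both sides as tables, since a KTable-KTable join re-emits whenever either side is updated. A sketch, with records and reference standing in for the question's two inputs:

    import org.apache.kafka.streams.kstream.KTable;

    // `records` is the formerly-KStream side re-modeled as a table, and
    // `reference` is the reference data; both assumed KTable<String, String>.
    KTable<String, String> enriched = records.join(
        reference,
        (rec, ref) -> rec + " | " + ref);   // illustrative value joiner
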
0
votes
0 answers

Kafka: How to process non-duplicate and ordered messages

Can anyone please help? I have the below requirement. Requirement: process non-duplicate, ordered chat messages and bundle them per ProgramUserId; here are the process and the topics involved. Data setup: ProgramUserId can have any number of…
Santhi
  • 1
  • 1
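
A sketch of one approach in the Java DSL: records with the same key always land in the same partition, so per-ProgramUserId ordering holds, and an aggregate can both drop exact duplicates and build the bundle. The topic name is hypothetical, and the bundle is kept as a plain String here only to avoid a custom serde:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.*;

    StreamsBuilder builder = new StreamsBuilder();

    KTable<String, String> bundles = builder
        .stream("chat-messages",                // assumed keyed by ProgramUserId
                Consumed.with(Serdes.String(), Serdes.String()))
        .groupByKey()
        .aggregate(
            () -> "",
            (userId, msg, bundle) ->
                bundle.contains(msg) ? bundle : bundle + "\n" + msg,  // naive dedup
            Materialized.with(Serdes.String(), Serdes.String()));
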
0
votes
0 answers

Access to a GlobalKTable or a global store from a different application - KStream

We have 2 microservices: one is responsible for consuming messages and updating a GlobalKTable with configuration information (key=id, value=myObject); once the other is activated with some id as an input, it should look into the configuration GlobalKTable and…
Violet
  • 1
  • 1
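
A state store, including the one behind a GlobalKTable, belongs to a single Streams application; another service cannot open it directly. Because a GlobalKTable is populated from a topic, the simplest pattern is usually for the second service to build its own replica from that same topic (topic and store names below are illustrative); the alternative is exposing the first service's store over interactive queries, e.g. behind a small REST endpoint:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.*;

    StreamsBuilder builder = new StreamsBuilder();

    // Every instance gets a full copy of the topic's data, so lookups
    // by id are local and need no cross-service call.
    GlobalKTable<String, String> config = builder.globalTable(
        "configuration",
        Consumed.with(Serdes.String(), Serdes.String()),
        Materialized.as("config-store"));
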
0
votes
2 answers

Sliding window using Faust

Does anyone know how to implement a sliding window using Faust? The idea is to count the occurrences of a key in 10, 30, 60, and 300 s windows, but we need that on a 1 s or every-update basis. I have a dodgy workaround, which seems very inefficient…
Fonty
  • 239
  • 2
  • 11
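
Faust-specific answers aside, for comparison this is what the analogous count looks like as a hopping window in Kafka Streams' Java DSL: a 10 s window advancing every 1 s, so each record falls into ten overlapping windows and the count refreshes each second (repeat with 30/60/300 s sizes for the other granularities). ofSizeWithNoGrace is the 3.x spelling; older releases use TimeWindows.of:

    import java.time.Duration;
    import org.apache.kafka.streams.kstream.*;

    // `events` is assumed to be a KStream<String, String> keyed by the
    // value being counted.
    KTable<Windowed<String>, Long> counts = events
        .groupByKey()
        .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofSeconds(10))
                               .advanceBy(Duration.ofSeconds(1)))
        .count();
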
0
votes
0 answers

KTable not updating (immediately) when new messages are put on input stream

I have a Kafka topic of Strings with an arbitrary key. I want to create a topic of character-in-string : value pairs, e.g. input ("key","value") -> outputs (["v","value"],["a","value"],...). To keep it simple, my input topic has a single partition,…
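
One likely culprit (hedged, since only the excerpt is visible): KTable updates sit in the record cache until the next flush or commit, so downstream output lags the input topic. Disabling the cache and shortening the commit interval makes every update emit immediately:

    import java.util.Properties;
    import org.apache.kafka.streams.StreamsConfig;

    Properties props = new Properties();
    // Flush table updates downstream on every record instead of batching.
    props.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
    props.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 100);
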
0
votes
2 answers

KTable data - Bytes printing

I have a KTable in Kafka whose data I get in bytes when I debug. If I want it in String, what should I do? I have attached a snippet as well.
anmol
  • 1
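
The usual cause is that the table falls back to the default (byte-array) serdes. Declaring String serdes when building the table makes the values readable; a sketch, with the topic name as a placeholder:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.KTable;

    StreamsBuilder builder = new StreamsBuilder();

    // Without an explicit Consumed, the default serdes apply, which is
    // why the debugger shows raw bytes.
    KTable<String, String> table = builder.table(
        "input-topic",
        Consumed.with(Serdes.String(), Serdes.String()));
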
0
votes
1 answer

Printing Kafka KTable Data

I have this method which takes a JSON payload as input. I want to create a KTable out of this payload, read the data out of it, and print it. So far, I am able to create the KTable, but when I iterate over it, the control skips over it. Can anyone please help…
anmol
  • 1
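
A KTable is not iterated directly while the topology is running; its changelog can instead be converted to a KStream and printed as updates arrive. A one-line sketch, assuming table is the KTable built from the payload:

    import org.apache.kafka.streams.kstream.Printed;

    // Print every table update to stdout as it happens.
    table.toStream().print(Printed.toSysOut().withLabel("ktable-updates"));
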
0
votes
1 answer

Implement SQL update with Kafka

How can I implement an update of an object that I store in a Kafka topic / KTable? I mean, if I need not a replacement of the whole value (which a compacted KTable would do) but a single-field update. Should I read from the topic/KTable,…
Vladimir Nabokov
  • 1,797
  • 1
  • 14
  • 19
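
One common pattern is to publish the partial update as an event and fold it into the current value with an aggregate, rather than reading the table back and rewriting it. A sketch where merge() is a hypothetical helper that overwrites only the fields present in the update (e.g. by merging two JSON objects); topic and serdes are illustrative:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.*;

    StreamsBuilder builder = new StreamsBuilder();

    KTable<String, String> current = builder
        .stream("object-updates",
                Consumed.with(Serdes.String(), Serdes.String()))
        .groupByKey()
        .aggregate(
            () -> "{}",
            (id, update, state) -> merge(state, update),  // merge() is hypothetical
            Materialized.with(Serdes.String(), Serdes.String()));
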
0
votes
0 answers

How to consume a high-volume topic as a KTable without exhausting memory/disk space?

We have a Kafka Streams app doing a KStream-KTable inner join. Both topics are high volume, with 256 partitions each. The app is deployed on 8 nodes with 8 GB heap each right now. The state store (RocksDB) persists to disk, and we are running out…
user2221654
  • 311
  • 1
  • 7
  • 20
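
Worth noting that RocksDB state lives off-heap and on disk, so it is tuned through a RocksDBConfigSetter rather than the JVM heap. A sketch with illustrative values:

    import java.util.Map;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.state.RocksDBConfigSetter;
    import org.rocksdb.CompactionStyle;
    import org.rocksdb.Options;

    public class BoundedRocksDB implements RocksDBConfigSetter {
        @Override
        public void setConfig(String storeName, Options options,
                              Map<String, Object> configs) {
            options.setCompactionStyle(CompactionStyle.LEVEL);
            options.setMaxWriteBufferNumber(2);  // fewer in-memory memtables
        }

        @Override
        public void close(String storeName, Options options) { }
    }

    // Registered via:
    // props.put(StreamsConfig.ROCKSDB_CONFIG_SETTER_CLASS_CONFIG, BoundedRocksDB.class);
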
0
votes
1 answer

Getting an Out of Memory exception, possibly due to the KTable-related state store

We have a Kafka Streams app doing a KStream-KTable inner join. Both topics are high volume, with 256 partitions each. The app is deployed on 8 nodes with 8 GB heap each right now. We see that the heap memory keeps constantly growing and eventually…
user2221654
  • 311
  • 1
  • 7
  • 20
0
votes
1 answer

Kafka Streams - Filter by client_id

I'm using Kafka Streams to create a KTable with only the data specific to a client_id, which is not the topic key. I'm new to Kafka Streams; it seems pretty straightforward, but I got a bit confused by the multiple examples available in the community…
mvitor
  • 37
  • 8
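
Since client_id lives in the value rather than the key, one shape for this is to filter the stream on the value and then materialize what remains as a table. extractClientId() is a hypothetical parser for the payload, and the topic name and "client-42" are illustrative:

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.*;

    StreamsBuilder builder = new StreamsBuilder();

    KTable<String, String> clientTable = builder
        .stream("events", Consumed.with(Serdes.String(), Serdes.String()))
        .filter((key, value) -> "client-42".equals(extractClientId(value)))
        .toTable(Materialized.with(Serdes.String(), Serdes.String()));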