Questions tagged [ktable]

A KTable is a continuously updating materialized view.

The KTable concept is used by Kafka Streams and ksqlDB (both also use the concept of a KStream). A KTable is a materialized view that is continuously updated.

The data source of a KTable is either a topic in Apache Kafka (i.e., each record in the topic is "upserted" into the KTable) or the result of a continuous query over an input KStream or KTable.
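Both sourcing patterns can be sketched with the Kafka Streams DSL. This is a minimal sketch, not a production topology; the topic names (`user-profiles`, `clicks`) and the store name (`click-counts-store`) are hypothetical:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;

public class KTableSketch {

    public static StreamsBuilder buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();

        // Pattern 1: a KTable sourced directly from a topic. Each record
        // with the same key overwrites the previous value (upsert semantics).
        KTable<String, String> users = builder.table(
                "user-profiles",
                Consumed.with(Serdes.String(), Serdes.String()));

        // Pattern 2: a KTable as the result of a continuous query over a
        // KStream -- here, a running count of click events per user key.
        KStream<String, String> clicks = builder.stream(
                "clicks",
                Consumed.with(Serdes.String(), Serdes.String()));
        KTable<String, Long> clickCounts = clicks
                .groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
                .count(Materialized.as("click-counts-store"));

        return builder;
    }
}
```

In both cases the KTable keeps only the latest value per key; downstream operators see an update stream of changed rows rather than the raw append-only log.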

72 questions
1
vote
1 answer

Metadata for key is wrong even if the key is present in the local KTable in Kafka Streams when running two instances

I am facing a weird issue when aggregating records into a KTable. I have the following scenario in my system. There are two Kafka Streams applications running on different nodes (having the same application id but having different application…
nickp12
1
vote
0 answers

Cross-reference a section inside a field in a table?

Is it possible to cross-reference a section inside a field in a table with something like the following? # using "asis" results a <- c(1,2) b <- c("\@ref(some-section-id)") df <- data.frame(a,b) kableExtra::kable(df, format = "markdown") When…
Our
1
vote
1 answer

Kafka Streams API: Avoid additional stateStore in KTable.mapValues

Currently we use the following in our Kafka Streams application: streamsBuilder.table(inputTopic) .join(...) .mapValues(valueMapper) // <-- this causes another state store .groupBy(...) …
Coperator
1
vote
1 answer

KTable and KStream Space considerations understanding

We are performing a POC with ksqlDB and have some doubts: I have a Kafka topic named USERPROFILE which has around 100 million unique records and a 10-day retention policy. This Kafka topic continues to receive INSERT/UPDATE type events in…
1
vote
0 answers

Design system to consume kafka stream for daily/monthly reporting

I am working on a data integration project where we need to consume a Kafka stream of business events but produce daily and monthly reports. We require some sort of state store for the stream. The approaches we have brainstormed so far are: Use a KTable to store…
1
vote
2 answers

Overriding KStreams default serializer (ByteArraySerializer)

I can't seem to override the serializer of a topic to Serdes.String(). I'm trying a simple use case of reading from a topic (stream), and writing to a KTable. What I have so far: @Component class Processor { @Autowired public void…
Tiberiu
1
vote
1 answer

Kafka KTable Materialized-State-Store control

We materialize the KTable into an internal state store. a.) How and where can I specify that this internal state store should be persistent and automatically backed up to another Kafka topic? b.) How can we specify that this…
Aditya Goel
1
vote
0 answers

Join one-to-many relation with spring cloud kafka stream

I'm trying to join data from two topics, person and address, where one person can have multiple addresses. The data published into the topics looks like the following: //person with id as key {"id": "123", "name": "Tom Tester"} //addresses with id as…
1
vote
1 answer

How to remove old records from a state store using a punctuator? (Kafka)

I've created a KTable for a topic using streamsBuilder.table("myTopic"), which I materialize to a state store so that I can use interactive queries. Every hour, I want to remove records from this state store (and the associated changelog topic) whose…
user6429576
1
vote
1 answer

Query KTable in the same Application where it is created

I have a Kafka Streams application in which I read from a topic, do an aggregation, and materialize it in a KTable. I then create a stream and run some logic on it. Now, in the stream processing, I want to use some data from the aforementioned…
ab m
1
vote
0 answers

Kafka log compaction frequently updated keys never get consumed

Based on the Kafka documentation https://kafka.apache.org/documentation/#compaction The order of client consumption based on offset: before compaction it is K1,K2,K3,K4,K5,K6 and after compaction it becomes K1,K3,K4,K5,K2,K6 So am I right in saying that (if…
user2001850
1
vote
1 answer

KStream - KTable Join not triggering

I’ve 2 topics (actually more, but keeping it simple here) which I am joining using the Streams DSL and, once joined, publishing data downstream. I am creating a KTable on top of Topic 1 and storing it in a named state store. The key for Topic1 looks…
user123
1
vote
2 answers

GlobalKTable Refresh Logic

When updates are made to the underlying topic of a GlobalKTable, what is the logic by which all instances of KStream apps get the latest data? Below are my follow-up questions: Would the GlobalKTable be locked at record level or table level when…
guru
1
vote
0 answers

Kafka state of the world/snapshot + subscribe to updates

Is there a way in Kafka for a client to subscribe to a topic and get a snapshot/state-of-the-world "KTable" when it first connects, and also subscribe to the subsequent updates? Say we have a topic of records like below |Key | Value…
1
vote
1 answer

Creating Global State Store in Kafka Streams (Spring)

I am new to Kafka and have tried to create a small Kafka KTable implementation. I have successfully added a KTable and was able to query it. I have used a local state store and it worked as expected. Below is my local state store config: @Bean(name =…