Questions tagged [ktable]

A KTable is a continuously updating materialized view.

The KTable concept is used by Kafka Streams and ksqlDB (both also use the concept of a KStream). A KTable is a materialized view that is continuously updated.

The data source of a KTable is either a topic in Apache Kafka (each record in the topic is "upserted" into the KTable) or the result of a continuous query over an input KStream or KTable.
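The "upsert" semantics above can be illustrated with a minimal sketch in plain Java, using a `HashMap` as a stand-in for the table's state store (this is an analogy only, not the real Kafka Streams implementation): a record whose key already exists overwrites the previous value, and a record with a null value (a "tombstone") deletes the key.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of KTable upsert semantics: a changelog of (key, value)
// records is materialized into a map. The class and method names here are
// illustrative, not part of the Kafka Streams API.
public class UpsertSketch {
    private final Map<String, String> table = new HashMap<>();

    // Apply one changelog record: a non-null value upserts the key,
    // a null value (a "tombstone") deletes it.
    public void apply(String key, String value) {
        if (value == null) {
            table.remove(key);
        } else {
            table.put(key, value);
        }
    }

    public Map<String, String> view() {
        return table;
    }

    public static void main(String[] args) {
        UpsertSketch t = new UpsertSketch();
        t.apply("alice", "v1");
        t.apply("alice", "v2"); // overwrites v1
        t.apply("bob", "v1");
        t.apply("bob", null);   // tombstone removes bob
        System.out.println(t.view()); // {alice=v2}
    }
}
```

This is also why a KTable is usually backed by a compacted topic: only the latest record per key matters for the materialized view.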

72 questions
1
vote
1 answer

Kafka Streams API GroupBy behaviour

I am new to Kafka Streams and I am trying to aggregate some streaming data into a KTable using the groupBy function. The problem is the following: the produced message is a JSON msg with the following format: { "current_ts": "2019-12-24…
Chris_Gav
  • 112
  • 2
  • 8
1
vote
1 answer

Querying a Kafka Streams KTable using KSQL

I'm writing this Kafka streams application that takes the sensor readings that are being registered in a Kafka topic (as messages in JSON), and performs some aggregations on the value of those readings in a per-minute, per-hour and per-day basis.…
LeandroOrdonez
  • 195
  • 2
  • 13
0
votes
0 answers

State store vs Ktable in Kafka Streams

I'm new to Kafka and Kafka Streams. While I have gone through the concepts of Kafka and Kafka Streams and feel confident conceptually, there is one thing that's confusing me in and out. Yes, it's the decision between using Ktable vs state store. I'm…
eureka19
  • 2,731
  • 5
  • 26
  • 34
0
votes
0 answers

KStream join with KTable drops record if key does not exist in KTable

I have a use case in which I don't want to drop the record from the KStream if the KTable key doesn't exist; I want to wait for the KTable key (record) to arrive and then send the fully populated data to a new Kafka topic. I have found a reference in…
Sharry India
  • 341
  • 1
  • 9
0
votes
0 answers

Why does a ktable emit two events on the changelog?

I'm currently working on a project where we have a KTable to which we apply a map-reduce pattern, using groupBy() to map to a new key and then count(). We then emit changes onto a stream. I would've expected that for every input…
andrew Patterson
  • 559
  • 2
  • 6
  • 19
0
votes
1 answer

Creating GlobalKTable using only subset of topic columns

We use kafka streams to process one input stream and one compacted topic with client data. In our stream processing we consume the first one and join it with the second one using a GlobalKTable, something like StreamsBuilder builder = new…
user992990
  • 41
  • 5
0
votes
1 answer

How do you get the latest offset from a remote query to a Table in ksqlDB?

I have an architecture where I would like to query a ksqlDB Table from a Kafka stream A (created by ksqlDB). On startup, Service A will load in all the data from this table into a hashmap, and then afterward it will start consuming from Kafka Stream…
Farhan Islam
  • 609
  • 2
  • 7
  • 21
0
votes
0 answers

How to get list of keys based on a field in value from Kafka state store

I have a state store which is defined like below: StoreBuilder> indexStore = Stores.keyValueStoreBuilder( Stores.persistentKeyValueStore("Data"), Serdes.String(),…
cppcoder
  • 22,227
  • 6
  • 56
  • 81
0
votes
1 answer

ksqlDB deleting records from KTable

• We have a topic “customer_events“ in Kafka. Example of value. { "CUSTOMERID": "198fa518-1031-4fe8-8abd-ca29bd120259" } • We created a persistent stream over the topic in ksqlDB cluster in Confluent. CREATE STREAM TEST_STREAM (SESSIONID STRING…
0
votes
0 answers

org.apache.kafka.common.errors.NotEnoughReplicasException: Messages are rejected since there are fewer in-sync replicas than required. in KTable

I'm developing KTable example looking at the link: https://cognizant.udemy.com/course/kafka-streams-real-time-stream-processing-master-class/learn/lecture/14244016#questions. Error: >HDFCBANK:1250.00 >[2022-11-08 17:07:45,505] WARN [Producer…
Jeff Cook
  • 7,956
  • 36
  • 115
  • 186
0
votes
1 answer

Apache Kafka - Implementing a KTable and producing event using CloudEvent

I have an implementation related to KTable and using CloudEvents to produce events, but for some unknown reasons, the produced event from KTable is not formatted based on CloudEvent. The implementation is as below: public void initKafkaStream()…
0
votes
0 answers

How to resolve many to many relationship in KTable

We have the following setup: 2 different json-messages that are getting published on different topics customer: kafka-key = "customer-1" { transferType: "customer" emailAddresses: ["email-1","email-2"] externalId:…
Nussbam
  • 31
  • 6
0
votes
1 answer

KSQL Table timestamp for one of table is not populating as intended

I am building a ktable like below with timestamp as one of the column in the underlying stream. But after creation rowtime is not populating with timestamp column value. CREATE TABLE MTL_PARAMETERS_TT WITH (KAFKA_TOPIC='MTL_PARAMETERS_TT',…
0
votes
0 answers

How to filter the data based on the headers key in the KTable?

I've set a few custom keys in the Kafka headers timeStamp: myKey1:1 This is my KTABLE KTable table1=streamsBuilder.table("MYTOPIC"); I need to filter this table based on the header's key timeStamp. How to achieve…
dark ninja
  • 33
  • 5
0
votes
1 answer

Apple M1 - Error opening store caused by RocksDBException: Column family not found when joining KStream to KTable

I am trying to leftJoin events from 2 streams. Initially, I joined 2 KStreams and everything was working fine. However, when I try to convert the second stream to a KTable, I get an error. Here is the code with the 2nd stream transformed to a…
yana
  • 11
  • 2