Questions tagged [apache-kafka-streams]

Related to Apache Kafka's built-in stream processing engine called Kafka Streams, which is a Java library for building distributed stream processing apps using Apache Kafka.

Kafka Streams is a Java library for building fault-tolerant, distributed stream processing applications on top of streams of data records from Apache Kafka topics. Specifically, it lets you transform input Kafka topics into output Kafka topics (or calls to external services, updates to databases, and so on) with concise code, in a way that is distributed and fault-tolerant.

Documentation: https://kafka.apache.org/documentation/streams/
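
To make the "input topics in, output topics out" idea concrete, here is a minimal word-count sketch in the Kafka Streams DSL; the topic names and application id are placeholders:

```java
Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-demo");
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

StreamsBuilder builder = new StreamsBuilder();
builder.<String, String>stream("input-topic")
       .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+"))) // split lines into words
       .groupBy((key, word) -> word)                                           // re-key by word
       .count()                                                                // materialize word counts
       .toStream()
       .to("output-topic", Produced.with(Serdes.String(), Serdes.Long()));

KafkaStreams streams = new KafkaStreams(builder.build(), props);
streams.start();
```
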

3924 questions
1
vote
1 answer

Error while creating KTable with custom key

Use-case - There is a topic with messages (null, Metadata). I need to create a KTable from the topic with the key (metadata.entity_id) and the value as the metadata. This table will later be used to do a join with a stream with the same key. private…
Jack
  • 111
  • 1
  • 6
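
If the source topic has null keys, one common approach is to read it as a KStream, promote entity_id to the key, and materialize the result as a KTable. A sketch under that assumption; Metadata, metadataSerde, getEntityId() and the topic name are placeholders, not taken from the question:

```java
StreamsBuilder builder = new StreamsBuilder();
KTable<String, Metadata> metadataTable = builder
    .stream("metadata-topic", Consumed.with(Serdes.String(), metadataSerde)) // records arrive with null keys
    .selectKey((nullKey, metadata) -> metadata.getEntityId())                // promote entity_id to the key
    .toTable(Materialized.with(Serdes.String(), metadataSerde));             // Kafka 2.5+; repartitions before materializing
```

The re-keyed table can then be joined with a stream that is keyed the same way.
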
1
vote
1 answer

Kafka Streams application is still trying to create a changelog topic even though I am setting the optimization property

Below is the code snippet I am using: `streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-live-test"); streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "brokerIP:port"); …
Swapnil Dixit
  • 101
  • 1
  • 8
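
For reference, topology optimization only takes effect if the properties are also passed to StreamsBuilder#build. A hedged sketch (the config values come from the question, the rest is assumed):

```java
Properties streamsConfiguration = new Properties();
streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-live-test");
streamsConfiguration.put(StreamsConfig.TOPOLOGY_OPTIMIZATION_CONFIG, StreamsConfig.OPTIMIZE); // TOPOLOGY_OPTIMIZATION in releases before 2.7

StreamsBuilder builder = new StreamsBuilder();
// ... topology definition ...
Topology topology = builder.build(streamsConfiguration); // without the properties here, the optimization is ignored
KafkaStreams streams = new KafkaStreams(topology, streamsConfiguration);
```
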
1
vote
1 answer

Kafka Streams RocksDB JMX metrics

I am trying to debug a performance issue related to a Kafka Streams stateful application (we use the Processor API only). The application queries a number of state stores (close to 55). So, in order to find out the reason for the slow processing, I was looking…
SunilS
  • 2,030
  • 5
  • 34
  • 62
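
Store-level and RocksDB metrics are only recorded at the DEBUG metrics level, so a first thing to check is the recording-level config. Sketch, assuming the usual Properties-based setup:

```java
// RocksDB and per-store metrics are reported only when the recording level is DEBUG.
props.put(StreamsConfig.METRICS_RECORDING_LEVEL_CONFIG, "DEBUG");
```

The metrics then show up in JMX under the kafka.streams domain.
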
1
vote
0 answers

Kafka Streams join after aggregation not working with multiple partitions

Problem Statement: Topic 1: "key = empId, value = empname, deptName, ..." Topic 2: "key = deptName, value = deptName" I need the data from Topic 1 where the deptName (a value attribute in Topic 1) is equal to the key of Topic 2. Steps: Create a stream from…
Vijay
  • 11
  • 1
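
A KStream-KTable join matches on the record key and requires co-partitioned inputs, so the stream usually has to be re-keyed to the table's key first. A sketch under those assumptions (Employee, getDeptName() and the output topic are placeholders):

```java
KStream<String, Employee> employees  = builder.stream("topic1"); // key = empId
KTable<String, String>   departments = builder.table("topic2");  // key = deptName

employees
    .selectKey((empId, emp) -> emp.getDeptName())  // re-key so both sides share the join key (triggers a repartition)
    .join(departments, (emp, dept) -> emp)         // both topics must have the same number of partitions
    .to("filtered-employees");
```
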
1
vote
1 answer

Can KStream-GlobalKTable join return multiple matching records for a specific search?

I’m hoping someone can help with an issue I’m having regarding GlobalKTables in Kafka. I’m trying to perform a KStream-GlobalKTable join. However, I want to retrieve all entries in the GlobalKTable whose Key or Value contains a string found in my…
jayz_135
  • 11
  • 2
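
For context, the DSL join against a GlobalKTable looks up exactly one key per stream record, which is why it cannot return multiple matches; scanning every entry would require a custom processor that iterates over the global store. A sketch of the standard join, with all names as placeholders:

```java
KStream<String, String> enriched = stream.join(
    globalTable,
    (streamKey, streamValue) -> streamValue,                       // KeyValueMapper: yields ONE lookup key per record
    (streamValue, tableValue) -> streamValue + "|" + tableValue);  // so at most one GlobalKTable row can match
```
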
1
vote
2 answers

ClassCastException in spring-kafka-test using `merger()`

I want to test my Kafka Streams topology with a unit test using kafka-streams-test-utils. I have been using this library for quite a while and have already built an abstraction layer around my tests using TestNG. But since I added a merge(...) to my stream,…
Norbert Koch
  • 533
  • 6
  • 17
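
ClassCastExceptions in kafka-streams-test-utils are often a serde mismatch: the test driver needs the same (or explicit) serdes as production. A minimal, hedged TopologyTestDriver setup for illustration; the topic name and serdes are assumptions:

```java
Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "topology-test");
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");
props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

try (TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
    TestInputTopic<String, String> input =
        driver.createInputTopic("input-topic", new StringSerializer(), new StringSerializer());
    input.pipeInput("key", "value"); // drive the topology, then assert on the output topic
}
```
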
1
vote
0 answers

Kafka Streams Window Store keeping Data for Much Longer than the Retention Period

The use case is to flush records from partitions that did not receive new data in Kafka Streams, as we are using suppress, which requires stream time. So we have a window store with a tumbling window of 1 minute and a reduce operation attached with…
Jay Ghiya
  • 424
  • 5
  • 16
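
As background: a window store's retention is set per store via Materialized#withRetention and acts as a lower bound; expired windows are dropped segment by segment, so data can survive somewhat past the nominal retention. A sketch of such a setup (store name and types are assumptions):

```java
KStream<Windowed<String>, String> flushed = stream
    .groupByKey()
    .windowedBy(TimeWindows.of(Duration.ofMinutes(1)).grace(Duration.ofSeconds(30)))
    .reduce((oldValue, newValue) -> newValue,
            Materialized.<String, String, WindowStore<Bytes, byte[]>>as("one-minute-store")
                .withRetention(Duration.ofMinutes(5)))                            // lower bound, not an exact TTL
    .suppress(Suppressed.untilWindowCloses(Suppressed.BufferConfig.unbounded()))  // emits only as stream time advances
    .toStream();
```
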
1
vote
1 answer

Materialize a Kafka Stream after a flatMap

I want to consume from two Kafka topics with Kafka Streams, supported by Spring Kafka. The topics have different keys and values. I want to map the key and value from the second topic and merge it with the first one via the method:…
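
One way to sketch this (remapKey/remapValue and the topic/store names are hypothetical): map the second stream onto the first stream's key and value types, merge the two, and materialize the result as a table. Because the key changes, Kafka Streams repartitions before materializing.

```java
KStream<String, String> first  = builder.stream("topic-a");
KStream<String, String> second = builder.stream("topic-b");

KTable<String, String> merged = first
    .merge(second.map((k, v) -> KeyValue.pair(remapKey(k), remapValue(v)))) // align key/value types with the first stream
    .toTable(Materialized.as("merged-store"));                              // Kafka 2.5+
```
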
1
vote
0 answers

Write to websocket from Kafka KTable (GlobalKTable)

Is there any way to send messages directly from a KTable to a websocket in Kafka? I have data that is constantly updating. I keep it in a compacted Kafka topic. When a new websocket client arrives I would like to send it the current state of my data,…
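
A common pattern (sketched below; sendToWebsocket(), the store name, and the table variable are placeholders) is to serve the snapshot to a newly connected client via interactive queries and push subsequent changes from the table's changelog stream:

```java
// 1) Snapshot for a newly connected client, via interactive queries:
ReadOnlyKeyValueStore<String, String> store = kafkaStreams.store(
    StoreQueryParameters.fromNameAndType("snapshot-store", QueryableStoreTypes.keyValueStore()));
try (KeyValueIterator<String, String> it = store.all()) {
    it.forEachRemaining(kv -> sendToWebsocket(kv.key, kv.value));
}

// 2) Subsequent updates, pushed from the table's changelog:
table.toStream().foreach((key, value) -> sendToWebsocket(key, value));
```
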
1
vote
1 answer

Why do some joins not work without selectKey first?

In doing my joins, I am finding that the 2nd block tends to give the expected result, whereas the 1st block does not and never hits the (aValue, bValue) -> myFunc(aValue, bValue). I didn't think the actual key mattered as long as I set the right…
atkayla
  • 8,143
  • 17
  • 72
  • 132
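
For reference, stream joins match on the record key only, so if the join attribute lives in the value it has to be promoted to the key first, which is what selectKey does (and why it triggers a repartition). Sketch, with getJoinId() as a hypothetical accessor:

```java
aStream
    .selectKey((aKey, aValue) -> aValue.getJoinId())            // make the join attribute the key
    .join(bTable, (aValue, bValue) -> myFunc(aValue, bValue));  // now both sides agree on the key
```
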
1
vote
0 answers

Kafka state of the world/snapshot + subscribe to updates

Is there a way in Kafka for a client to subscribe to a topic and get a snapshot/state of the world "KTable" when it first connects, and then also subscribe to the subsequent updates? Say we have a topic of records like below |Key | Value…
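
Within Kafka Streams, reading a compacted topic as a KTable gives exactly this behavior: the table is bootstrapped with the current state and then keeps applying updates. A sketch (topic and store names are assumptions):

```java
KTable<String, String> snapshot = builder.table(
    "compacted-topic",
    Consumed.with(Serdes.String(), Serdes.String()),
    Materialized.as("snapshot-store")); // bootstraps the latest value per key, then applies subsequent updates
```
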
1
vote
1 answer

Updating global store from data within transform

I currently have a simple topology: KStream eventsStream = builder.stream(sourceTopic); eventsStream.transformValues(processorSupplier, "nameCache") .to(destinationTopic); My events sometimes have a key/value pair and…
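
Worth noting as context: processors and transformers may only read a global store; updates have to be produced to the store's backing topic so the global state thread applies them on every instance. A hedged sketch (the update stream and topic name are assumptions, not from the question):

```java
KStream<String, String> eventsStream = builder.stream(sourceTopic);
eventsStream
    .transformValues(processorSupplier)   // may READ the global store, but must not write to it
    .to(destinationTopic);

// Updates intended for the global store go through its source topic instead:
cacheUpdates.to("name-cache-topic");
```
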
1
vote
0 answers

Foreign key joining two ktable on streaming database updates

We have two database tables that stream updates to Kafka topics through a CDC application. We keep every row's latest version in KTables, then join them and write to another Kafka topic on any update. Our code looks like this: pTable =…
hasan
  • 11
  • 3
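
Since Kafka 2.4 the DSL has a built-in KTable-KTable foreign-key join, where an extractor pulls the right-hand table's key out of the left-hand value. A sketch (row types and accessors are placeholders):

```java
KTable<String, Enriched> joined = pTable.join(
    cTable,
    pRow -> pRow.getForeignKey(),               // foreign-key extractor: left value -> right table's key
    (pRow, cRow) -> new Enriched(pRow, cRow));  // joiner runs again whenever either side updates
```
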
1
vote
0 answers

About KStream-KTable joins in Kafka Streams

I want to join two streams, one as a KStream and the other as a KTable. However, when I send a message to the stream and then a message to the table, the join does not occur; when I then send a message to the stream, the join does occur. Why is that?
zydzjy
  • 51
  • 3
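
For context, a KStream-KTable join is one-sided: only stream records trigger output, and only when the table side already contains a value for that key at the time the stream record is processed. Sketch, with names assumed:

```java
stream
    .join(table, (streamValue, tableValue) -> streamValue + "|" + tableValue) // table-side updates alone never emit output
    .to("joined-topic");
```
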
1
vote
1 answer

Spring Kafka: re-create Kafka Streams topology at runtime

I have an application that is based on Spring Boot, spring-kafka, and kafka-streams. When the application starts up, it creates a Kafka Streams topology with a default list of topics. What I need to do is edit/recreate the topology at runtime. For example, when…
Igor Dumchykov
  • 347
  • 1
  • 15
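
A running Kafka Streams topology cannot be modified in place; the usual workaround is to stop the streams application, rebuild the topology with the new set of topics, and start a fresh instance. With spring-kafka this can go through the StreamsBuilderFactoryBean lifecycle (a sketch under that assumption):

```java
streamsBuilderFactoryBean.stop();   // closes the underlying KafkaStreams instance
// ...adjust the list of input topics / rebuild the topology...
streamsBuilderFactoryBean.start();  // builds and starts a new KafkaStreams instance with the new topology
```
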