Apache Flink is an open-source platform for scalable batch and stream data processing. Flink supports batch and streaming analytics in one system. Analytical programs can be written in concise and elegant APIs in Java and Scala. Apache Flink features the Table API (and SQL API) as unified APIs for stream and batch processing.
Questions tagged [flink-table-api]
81 questions
0 votes, 1 answer
Flink SQL Unit Testing: How to Assign Watermark?
I'm writing a unit test for a Flink SQL statement that uses match_recognize. I'm setting up the test data like this:
Table data = tEnv.fromValues(DataTypes.ROW(
DataTypes.FIELD("event_time", DataTypes.TIMESTAMP(3)),
DataTypes.FIELD("foobar",…

Aeden (195)
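fromValues doesn't let you declare a watermark directly; a minimal sketch of one common workaround is to build the test rows as a watermarked DataStream and expose the timestamps as a rowtime attribute via .rowtime(). The field name foobar follows the question; the data values and class name are made up:

import static org.apache.flink.table.api.Expressions.$;

import java.time.Duration;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class WatermarkedTestData {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // Made-up test events: (epoch-millis event time, foobar payload).
        DataStream<Tuple2<Long, String>> events = env
                .fromElements(Tuple2.of(1_000L, "a"), Tuple2.of(2_000L, "b"))
                .assignTimestampsAndWatermarks(
                        WatermarkStrategy
                                .<Tuple2<Long, String>>forBoundedOutOfOrderness(Duration.ofSeconds(1))
                                .withTimestampAssigner((event, ts) -> event.f0));

        // .rowtime() exposes the stream's timestamps and watermarks as the
        // event_time attribute, which MATCH_RECOGNIZE can then ORDER BY.
        Table data = tEnv.fromDataStream(
                events, $("raw_ts"), $("foobar"), $("event_time").rowtime());
        tEnv.createTemporaryView("events", data);
    }
}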
0 votes, 1 answer
Integrating the DataStream API and the Table API
Following up on this question, I've created this example to integrate the DataStream API and the Table API. This time there is no error, but I get two jobs instead of one: one is created for the DataStream API, which runs perfectly, and the other job…

Alter (903)
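One common cause of the two-jobs symptom is mixing a Table API submission (executeSql/executeInsert, which launches its own job) with a separate env.execute(). A minimal sketch that keeps everything in a single job by converting the table back to a DataStream before the one env.execute() call (names and data are illustrative):

import static org.apache.flink.table.api.Expressions.$;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class SingleJobIntegration {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        DataStream<String> words = env.fromElements("a", "b", "a");

        // Table API section in the middle of the pipeline.
        Table counts = tEnv.fromDataStream(words, $("word"))
                .groupBy($("word"))
                .select($("word"), $("word").count().as("cnt"));

        // Converting back to a DataStream keeps the whole pipeline inside the
        // StreamExecutionEnvironment, so one env.execute() submits one job.
        tEnv.toRetractStream(counts, Row.class).print();

        env.execute("single-job-integration");
    }
}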
0 votes, 1 answer
Apache Flink 1.11: unable to use a Python UDF through SQL Function DDL in a Java Flink streaming job
In FLIP-106 there is an example of how to call a user-defined Python function in a Java batch-job application through SQL Function DDL...
BatchTableEnvironment tEnv =…

Jonathan Figueroa (31)
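A minimal sketch of the FLIP-106 style registration from a Java job, assuming a hypothetical module udfs.py that defines add_one, and that flink-python is on the classpath (option keys and the Blink planner setting match the Flink 1.11 era):

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class PythonUdfFromJava {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build());

        // Ship the (hypothetical) Python module that defines the UDF and pick
        // the interpreter used on the client side.
        tEnv.getConfig().getConfiguration().setString("python.files", "/path/to/udfs.py");
        tEnv.getConfig().getConfiguration().setString("python.client.executable", "python3");

        // FLIP-106 style DDL: register the Python function for use in SQL.
        tEnv.executeSql(
                "CREATE TEMPORARY SYSTEM FUNCTION add_one AS 'udfs.add_one' LANGUAGE PYTHON");

        tEnv.executeSql("SELECT add_one(41)").print();
    }
}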
0 votes, 1 answer
What backend does the Flink Table API use? Does it require any relational DB?
I'm fairly new to Flink and trying to understand the appropriate use cases for the Stream API and the Table API. As part of that, I'm trying to understand whether, like the Stream API, the Table API has the flexibility to choose the type of state backend it can…

ardhani (303)
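For reference, the Table API runs on the same runtime as the DataStream API, so it uses the same state backends (heap or RocksDB) and needs no relational database. A minimal sketch using the post-1.13 backend API (paths and class name are placeholders):

import org.apache.flink.contrib.streaming.state.EmbeddedRocksDBStateBackend;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class TableApiStateBackend {
    public static void main(String[] args) {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Same knobs as any DataStream job; no relational database involved.
        env.setStateBackend(new EmbeddedRocksDBStateBackend());
        env.getCheckpointConfig().setCheckpointStorage("file:///tmp/flink-checkpoints");

        // A Table API pipeline built on this environment inherits the backend.
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);
    }
}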
0 votes, 2 answers
Simple Table API SQL query doesn't work on Flink 1.10 with Blink
I want to define a Kafka connector using the Table API and run SQL over the table described that way (backed by Kafka). Unfortunately, it seems that the rowtime definition doesn't work as expected.
Here's a reproducible example:
object DefineSource extends App {
…

bottaio (4,963)
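On Flink 1.10 with the Blink planner, one commonly suggested alternative to the descriptor API is declaring the rowtime in DDL with a WATERMARK clause. A sketch using the 1.10-era connector property keys (topic, broker, and all other values are placeholders):

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class KafkaRowtimeDdl {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build());

        // The WATERMARK clause makes ts an event-time (rowtime) attribute.
        tEnv.sqlUpdate(
                "CREATE TABLE events (" +
                "  id STRING," +
                "  ts TIMESTAMP(3)," +
                "  WATERMARK FOR ts AS ts - INTERVAL '5' SECOND" +
                ") WITH (" +
                "  'connector.type' = 'kafka'," +
                "  'connector.version' = 'universal'," +
                "  'connector.topic' = 'events'," +
                "  'connector.properties.bootstrap.servers' = 'localhost:9092'," +
                "  'format.type' = 'json'," +
                "  'update-mode' = 'append'" +
                ")");
    }
}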
-1 votes, 1 answer
How can I map each field of an event to a separate column when creating a table with the Flink Table API from a data stream of Confluent Avro?
I am working on creating a table using the Table API from a data stream of Confluent Avro (from a Kafka topic). I am using the code below (the Flink version is 1.17) and it is creating a table…
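One alternative worth sketching: skip the DataStream hop and declare the Kafka source directly in DDL with the avro-confluent format (available in Flink 1.17), where each declared column is matched by name against a field of the Avro record. All connection values and the schema are placeholders:

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class ConfluentAvroColumns {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // Each column listed here is resolved by name against the Avro record,
        // so every Avro field lands in its own table column.
        tEnv.executeSql(
                "CREATE TABLE users (" +
                "  id BIGINT," +
                "  name STRING" +
                ") WITH (" +
                "  'connector' = 'kafka'," +
                "  'topic' = 'users'," +
                "  'properties.bootstrap.servers' = 'localhost:9092'," +
                "  'scan.startup.mode' = 'earliest-offset'," +
                "  'format' = 'avro-confluent'," +
                "  'avro-confluent.url' = 'http://localhost:8081'" +
                ")");

        tEnv.executeSql("SELECT * FROM users").print();
    }
}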