Apache Flink is an open source platform for scalable batch and stream data processing. Flink supports batch and streaming analytics in one system. Analytical programs can be written in concise and elegant APIs in Java and Scala. Apache Flink features the Table API (and SQL API) as unified APIs for stream and batch processing.
Questions tagged [flink-table-api]
81 questions
1
vote
0 answers
Create Flink Datastream from Postgres table
I am trying to process large streams of data (source = Kinesis stream) and sink it into a Postgres DB.
While doing so, I need to first join the incoming stream with some master data that is already present in the Postgres DB.
I am creating a keyed…

Swapnil Khante
- 547
- 2
- 10
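For questions like this one, a commonly suggested pattern is to register the Postgres master data through the JDBC connector and enrich the stream with a processing-time lookup join. A sketch with hypothetical table and column names, assuming the flink-connector-jdbc dependency is on the classpath (credentials and other connector options elided):

```sql
-- Register the Postgres master data as a lookup table (hypothetical names).
CREATE TABLE master_data (
  id INT,
  attribute STRING,
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:postgresql://localhost:5432/mydb',
  'table-name' = 'master_data'
);

-- Enrich the incoming stream with a processing-time lookup join.
SELECT s.*, m.attribute
FROM kinesis_stream AS s
JOIN master_data FOR SYSTEM_TIME AS OF s.proc_time AS m
  ON s.id = m.id;
```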
1
vote
1 answer
List all Sources and Sinks in a Flink SQL job
I'm building a sort of wrapper around Flink SQL. I construct a job from a bunch of user-specified SQL statements with StreamTableEnvironment.sqlUpdate. Some are INSERTs, some are CREATEs. I also run some sqlQuery calls.
Before I'm calling…

BenoitParis
- 3,166
- 4
- 29
- 56
0
votes
1 answer
Flink Table API program does not compile when assigning watermark using a field converted with UDF
Since the TO_TIMESTAMP(value, format) function in the Flink Table API does not support custom formats like yyyyMMddHHmmssSSS, we needed to create a UDF (User-Defined Function) for custom conversion.
However, when we tried to use it, the Flink Table API gave Table…
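A minimal sketch of the setup this question describes, with hypothetical names; whether the planner accepts a watermark defined on a UDF-derived computed column has varied across Flink versions, which matches the compile error reported here:

```sql
-- parse_ts is a hypothetical registered scalar UDF converting
-- 'yyyyMMddHHmmssSSS' strings to TIMESTAMP(3); connector options are elided.
CREATE TABLE events (
  raw_ts  STRING,
  payload STRING,
  ts AS parse_ts(raw_ts),
  WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
) WITH (...);
```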
0
votes
0 answers
Output of User-Defined Table Function Cannot Be Converted to Stream
I have a datastream that I am converting to a Table so that I may use Flink SQL syntax on the table. My Flink SQL query includes a user-defined Table Function, meaning each input row results in multiple output rows, using a generator as in the…

c_mac
- 1
- 1
0
votes
0 answers
Flink SQL - NOT EXISTS query gives error - doesn't support consuming update and delete changes
I have data streaming from Kafka on an input topic. Using that data, I have to query another table (either with the Kafka connector or the filesystem connector), check whether a value does not exist, and finally publish to an output Kafka topic.
I am…

Neha
- 225
- 1
- 5
- 12
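One workaround often suggested for this class of error is to rewrite the NOT EXISTS subquery as a left join plus an IS NULL filter. A sketch with hypothetical table and column names, not guaranteed to be accepted by every planner version:

```sql
-- Rows from the input that have no match in the reference table.
SELECT i.*
FROM input_topic AS i
LEFT JOIN reference_table AS r
  ON i.key_col = r.key_col
WHERE r.key_col IS NULL;
```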
0
votes
1 answer
Job client must be a CoordinationRequestGateway. This is a bug
We are using the Flink Table API, and while performing an executeQuery operation we run into this issue.
Basically everything works completely fine locally, but when we run the same application on the Flink UI, we see the below…

Divya Jain
- 1
- 1
0
votes
0 answers
await method on TableResult is not working when job is submitted via Session Mode using Apache Flink Operator
The await method on TableResult is not working when the job is submitted in session mode using the Apache Flink Operator, by creating a FlinkSessionJob resource in Kubernetes. The same code works when the job is deployed using application mode. Here is…

Vinay Cheguri
- 55
- 7
0
votes
0 answers
How can I continuously read data from Apache Kudu in real-time using Apache Flink?
I need to read data with Apache Flink from an Apache Kudu database in real time.
My use case is:
I receive a message from Kafka, deserialize that message and get an ID.
If the ID exists in the database, I ignore it.
If it doesn't, I need to add it in…

marcos
- 1
- 1
0
votes
0 answers
Flink sql with LAG function NullPointerException
I want to make a query with the LAG function, and I get a Job Exception without any explanation.
Code:
private static void t1_LeadLag(DataStream ds, StreamExecutionEnvironment env) {
StreamTableEnvironment te =…

padavan
- 714
- 8
- 22
0
votes
1 answer
Flink LEAD LAG functions in Table API
Hello, I see in the documentation that Flink supports the LEAD/LAG functions:
https://nightlies.apache.org/flink/flink-docs-release-1.17/docs/dev/table/functions/systemfunctions/
But I can't find how to apply them in the Table API.
Table win = t
…

padavan
- 714
- 8
- 22
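One route when a function is documented in SQL but hard to reach from the Table API is to fall back to StreamTableEnvironment.sqlQuery for that step. A sketch with hypothetical table and column names:

```sql
-- LAG requires a time-ordered OVER window in Flink SQL.
SELECT
  user_id,
  cnt,
  LAG(cnt, 1) OVER (PARTITION BY user_id ORDER BY proc_time) AS prev_cnt
FROM user_counts;
```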
0
votes
0 answers
Flink Table to DataStream with Model with partial fields
Hello, I have a model with many fields:
public class UserModel {
public int userId;
public int count;
public int zip;
public LocalDateTime dt;
public LocalDateTime wStart;
public LocalDateTime wEnd;
}
I work with table,…

padavan
- 714
- 8
- 22
0
votes
1 answer
Consuming JSON messages
I created a Flink application using the Table API to ingest data from a Kafka topic (that I generate myself). The dataset is YouTube stats from Kaggle. I can see in Confluent's UI that the topic is getting the messages from my producer, and it looks…

Jorge Cespedes
- 547
- 1
- 11
- 21
0
votes
0 answers
Lookup join or enrichment join against a changelog stream in Apache Flink Table API
I'm interested in doing a "lookup join" or "enrichment join" against a "changelog stream" read by "upsert-kafka". I am wondering if this is possible with the Table API. I found…

discord
- 59
- 10
0
votes
1 answer
Error in Implementing Flink SQL Processing Time Temporal Left Join
I have a stream of data coming from Kafka which I want to enrich with static data stored in Parquet files in Hadoop, and finally write to a filesystem sink.
Initially I tried a lookup join as below,
SELECT t1.*,t2.enrichment_data_col from…

rony
- 1
- 1
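The processing-time temporal join this question aims for looks roughly like the sketch below (hypothetical table and column names). Note that the right-hand side must be backed by a connector that supports lookups (e.g. JDBC or HBase); to my knowledge the filesystem connector does not, which is one common cause of this kind of error:

```sql
-- Enrich each Kafka row with the lookup table's state at processing time.
SELECT t1.*, t2.enrichment_data_col
FROM kafka_stream AS t1
JOIN lookup_table FOR SYSTEM_TIME AS OF t1.proc_time AS t2
  ON t1.join_key = t2.join_key;
```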
0
votes
0 answers
Flink: Wait for Kafka CDC to play back from earliest-offset at startup
I am trying to implement a way to wait until the Kafka table connector has caught up with the latest offset.
I've already tried implementing a session gap window operator on that table (a Kafka connector reading a compacted topic) to wait for inactivity on…

taricjain
- 1
- 1