Questions tagged [flink-sql]

Apache Flink features two relational APIs, SQL and Table API, as unified APIs for stream and batch processing.

Apache Flink features two relational APIs:

  1. SQL (via Apache Calcite)
  2. Table API, a language-integrated query (LINQ) interface

Both APIs are unified APIs for stream and batch processing. This means that a query returns the same result regardless of whether it is applied to a static data set or a data stream. Queries from both APIs are parsed and optimized by Apache Calcite.

Both APIs are tightly integrated with Flink's DataStream and DataSet APIs.
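
As a quick illustration of the unified model, here is a minimal sketch (in Java, with invented table and column names) that mixes SQL and the Table API on the same StreamTableEnvironment; the built-in 'datagen' connector only stands in for a real source:

    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.api.Table;
    import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

    import static org.apache.flink.table.api.Expressions.$;

    public class UnifiedApiSketch {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

            // A streaming source; the same program shape would work on a bounded (batch) source.
            tEnv.executeSql(
                "CREATE TABLE orders (order_id BIGINT, amount DOUBLE) " +
                "WITH ('connector' = 'datagen', 'rows-per-second' = '5')");

            // SQL and the Table API operate on the same dynamic table and can be mixed freely.
            Table filtered = tEnv.sqlQuery("SELECT order_id, amount FROM orders WHERE amount > 10");
            Table totals = filtered
                .groupBy($("order_id"))
                .select($("order_id"), $("amount").sum().as("total"));

            totals.execute().print();
        }
    }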

667 questions
0
votes
1 answer

How to update table schema when there is new Avro schema for Kafka data in Flink?

We are consuming a Kafka topic in a Flink application using the Flink Table API. When we first submit the application, we read the latest schema from our custom registry, then create a Kafka DataStream and Table using the Avro schema. My data…
lalala
  • 63
  • 1
  • 6
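
For context on what "create a Table using Avro schema" can look like, here is a hedged sketch of a Kafka-backed table declared through SQL DDL with the Avro format (topic, bootstrap servers, and columns are placeholders; the poster's custom schema registry is not modeled):

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;

    public class KafkaAvroTableSketch {
        public static void main(String[] args) {
            TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

            // The column list is fixed at DDL time; picking up a new Avro schema
            // means re-creating the table, which is essentially what the question is about.
            tEnv.executeSql(
                "CREATE TABLE events (" +
                "  user_id BIGINT," +
                "  payload STRING" +
                ") WITH (" +
                "  'connector' = 'kafka'," +
                "  'topic' = 'events'," +
                "  'properties.bootstrap.servers' = 'localhost:9092'," +
                "  'scan.startup.mode' = 'latest-offset'," +
                "  'format' = 'avro'" +
                ")");
        }
    }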
0
votes
1 answer

How to create a refreshable table using in-memory data in Flink for joins?

I have a Flink application that relies on the Table API. I have a Kafka topic from which I create a table. We also maintain an S3 object containing a list of IP addresses and some metadata, and we want to create a table on this S3 object as well. The S3 object…
lalala
  • 63
  • 1
  • 6
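
As a rough sketch of the second half of this setup, the filesystem connector can expose an S3 object as a table that a Kafka-backed table joins against (bucket, path, and column names are invented; keeping the S3 data refreshed, which is the actual question, is not solved here):

    // Assumes a StreamTableEnvironment `tEnv` and a Kafka-backed table
    // `clicks(ip, url, ...)` are already registered, as described in the question.
    tEnv.executeSql(
        "CREATE TABLE ip_metadata (" +
        "  ip STRING," +
        "  country STRING" +
        ") WITH (" +
        "  'connector' = 'filesystem'," +
        "  'path' = 's3://my-bucket/ip-metadata/'," +   // hypothetical bucket and path
        "  'format' = 'csv'" +
        ")");

    // A regular join; in streaming mode the file is read once as a bounded source,
    // so a periodically refreshed copy usually calls for a lookup or temporal join instead.
    Table enriched = tEnv.sqlQuery(
        "SELECT c.url, m.country FROM clicks AS c JOIN ip_metadata AS m ON c.ip = m.ip");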
0
votes
1 answer

How to understand streaming table in Flink?

It's hard for me to understand the streaming table in Flink. I can understand Hive, which maps a fixed, static data file to a "table", but how is a table embodied on top of streaming data? For example, every 1 second, 5 events with the same structure are sent to a…
user2894829
  • 775
  • 1
  • 6
  • 26
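
The usual mental model is the "dynamic table": each incoming event appends a row, and a continuous query over the table keeps updating its result. A small self-contained sketch (event values are invented):

    import org.apache.flink.api.java.tuple.Tuple2;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.api.Table;
    import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

    import static org.apache.flink.table.api.Expressions.$;

    public class DynamicTableSketch {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

            // Stand-in for an unbounded source such as Kafka.
            DataStream<Tuple2<String, Integer>> events = env.fromElements(
                Tuple2.of("sensor_a", 1), Tuple2.of("sensor_b", 2), Tuple2.of("sensor_a", 3));

            // The stream becomes a dynamic table: conceptually an ever-growing table.
            Table eventsTable = tEnv.fromDataStream(events, $("name"), $("reading"));

            // A continuous query: the count per name is re-emitted whenever a new row arrives.
            Table counts = eventsTable
                .groupBy($("name"))
                .select($("name"), $("reading").count().as("cnt"));

            counts.execute().print();
        }
    }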
0
votes
1 answer

Watermark fell far behind in Flink CEP

I am using Flink CEP to detect patterns against events from Kafka. For simplicity, events only have one type. I am trying to detect the change in the value of a field in the continuous event stream. The code looks like the following: val streamEnv =…
Grant
  • 500
  • 1
  • 5
  • 18
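
Watermarks lagging behind usually comes down to how they are generated at the source; for orientation, a hedged Java sketch of a bounded-out-of-orderness strategy with an idleness timeout (the Event class and its timestamp field are assumptions, and the question's Scala code is not reproduced):

    import java.time.Duration;
    import org.apache.flink.api.common.eventtime.WatermarkStrategy;
    import org.apache.flink.streaming.api.datastream.DataStream;

    public class WatermarkSketch {
        // Hypothetical event type with a millisecond event-time field.
        public static class Event {
            public long timestamp;
            public String value;
        }

        static DataStream<Event> withWatermarks(DataStream<Event> events) {
            return events.assignTimestampsAndWatermarks(
                WatermarkStrategy
                    // Tolerate events that arrive up to 5 seconds out of order.
                    .<Event>forBoundedOutOfOrderness(Duration.ofSeconds(5))
                    // Take the event-time timestamp from the record itself.
                    .withTimestampAssigner((event, recordTimestamp) -> event.timestamp)
                    // Without this, a single idle Kafka partition can hold the watermark back.
                    .withIdleness(Duration.ofMinutes(1)));
        }
    }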
0
votes
1 answer

How to extract column & table lineage from Flink SQL

I want to build a lineage system for a real-time data warehouse. How can I extract table and column lineage from Flink SQL?
LiJianing
  • 32
  • 4
0
votes
2 answers

Caused by: java.lang.ClassNotFoundException: org.apache.orc.storage.ql.exec.vector.VectorizedRowBatch appears in flink query hive

Software versions: Flink 1.11, Hive 1.2.1, Hadoop 2.7.1. Submitting the program with flink run jar produces the following exception: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy at…
小墨鱼
  • 11
  • 4
0
votes
0 answers

How does Flink encode state data in a CEP scenario?

I am using Flink CEP to recognize some event patterns; the query looks like: select * from $TABLE MATCH_RECOGNIZE( partition by $PARTITION_FIELDS order by event_time measures ... one row per match …
0
votes
1 answer

Flink Idle State Retention based on event time

This might be a simple question to answer, but I couldn't find it stated explicitly in the docs: is Flink's idle state retention computed based on event time when using…
Adriank
  • 3
  • 1
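
For reference, this is roughly how idle state retention is configured on the table environment (the 12-hour value is arbitrary, and this snippet does not itself settle the event-time vs. processing-time question):

    import java.time.Duration;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

    public class IdleStateRetentionSketch {
        public static void main(String[] args) {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

            // Clear per-key query state that has not been accessed for 12 hours.
            // (Newer Flink versions; older ones use setIdleStateRetentionTime(min, max).)
            tEnv.getConfig().setIdleStateRetention(Duration.ofHours(12));
        }
    }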
0
votes
1 answer

How to aggregate data group by week in Flink SQL

If I want to aggregate data grouped by day, the SQL is: select DATE_FORMAT(ctime, 'yyyyMMdd'), count(*) as num from event group by DATE_FORMAT(ctime, 'yyyyMMdd'); how do I aggregate data grouped by week?
rayjun
  • 33
  • 5
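
One hedged option (not necessarily what the poster chose) is a seven-day tumbling group window over the time attribute, bearing in mind that such windows align to the epoch rather than to calendar weeks; table and column names follow the excerpt:

    // Assumes the `event` table is registered on `tEnv` and `ctime` is its event-time attribute.
    Table weekly = tEnv.sqlQuery(
        "SELECT TUMBLE_START(ctime, INTERVAL '7' DAY) AS week_start, COUNT(*) AS num " +
        "FROM event " +
        "GROUP BY TUMBLE(ctime, INTERVAL '7' DAY)");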
0
votes
1 answer

Flink SQL Unit Testing: How to Assign Watermark?

I'm writing a unit test for a Flink SQL statement that uses match_recognize. I'm setting up the test data like this: Table data = tEnv.fromValues(DataTypes.ROW( DataTypes.FIELD("event_time", DataTypes.TIMESTAMP(3)), DataTypes.FIELD("foobar",…
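
One workaround I would reach for (an assumption on my part, not taken from an accepted answer) is to declare the test input through DDL so a WATERMARK clause can be attached to the event-time column; the 'datagen' connector stands in for real test rows, and the column names mirror the excerpt:

    // Assumes a StreamTableEnvironment `tEnv` is available in the test.
    tEnv.executeSql(
        "CREATE TABLE test_input (" +
        "  event_time TIMESTAMP(3)," +
        "  foobar STRING," +
        "  WATERMARK FOR event_time AS event_time - INTERVAL '1' SECOND" +
        ") WITH (" +
        "  'connector' = 'datagen'," +
        "  'rows-per-second' = '10'" +
        ")");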
0
votes
1 answer

Checkpointing in Flink is not working with CoFlatMapFunction

Hi, I am trying to do checkpointing in one of my Flink modules, in which I am using a CoFlatMapFunction to combine two streams. If I comment out the CoFlatMapFunction, checkpointing works; if I uncomment it again, it does not. I updated the Checkpointing…
YRK
  • 153
  • 1
  • 1
  • 22
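
For orientation, a bare-bones sketch of enabling checkpointing on a job that connects two streams with a CoFlatMapFunction (stream contents and intervals are invented; this does not reproduce the poster's failure):

    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.api.functions.co.CoFlatMapFunction;
    import org.apache.flink.util.Collector;

    public class CoFlatMapCheckpointSketch {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            // Checkpoint every 10 seconds; state backend and checkpoint storage stay at the defaults.
            env.enableCheckpointing(10_000L);

            DataStream<String> left = env.fromElements("a", "b", "c");
            DataStream<Integer> right = env.fromElements(1, 2, 3);

            // Combine the two streams into one output stream.
            DataStream<String> combined = left
                .connect(right)
                .flatMap(new CoFlatMapFunction<String, Integer, String>() {
                    @Override
                    public void flatMap1(String value, Collector<String> out) {
                        out.collect("left: " + value);
                    }

                    @Override
                    public void flatMap2(Integer value, Collector<String> out) {
                        out.collect("right: " + value);
                    }
                });

            combined.print();
            env.execute("co-flatmap-checkpoint-sketch");
        }
    }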
0
votes
1 answer

How does Flink ensure the order of data between operators?

In a streaming system, the order of data is a big problem. We know that in Flink, out-of-order data is handled with windows and watermarks. But inside Flink, between operators, how is the order of data guaranteed? Can Flink ensure that advanced…
chen amos
  • 101
  • 2
  • 7
0
votes
0 answers

Flink Table API day year month extraction

How can I extract day, hour, and time with the Flink Table API? final Table select = transactions .window(Tumble.over(lit(1).hour()).on($("transaction_time")).as("log_ts")) …
msd
  • 93
  • 6
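
A hedged way to pull calendar fields out of the timestamp is plain SQL EXTRACT over the same data (the view name is invented; `transactions` and `transaction_time` come from the excerpt):

    // Assumes the `transactions` Table and a StreamTableEnvironment `tEnv` from the excerpt.
    tEnv.createTemporaryView("transactions_view", transactions);

    Table parts = tEnv.sqlQuery(
        "SELECT " +
        "  EXTRACT(YEAR  FROM transaction_time) AS y, " +
        "  EXTRACT(MONTH FROM transaction_time) AS m, " +
        "  EXTRACT(DAY   FROM transaction_time) AS d, " +
        "  EXTRACT(HOUR  FROM transaction_time) AS h " +
        "FROM transactions_view");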
0
votes
1 answer

Exception in Flink TableAPI

I am trying to run the simple Flink job below to count words using the Table API. I used the DataStream API to read a stream of data and the StreamTableEnvironment API to create a table environment. I am getting the exception below. Can someone please help me understand what is…
Shailendra
  • 347
  • 6
  • 21
0
votes
1 answer

Flink Create View or Table as Select

I was reading the Flink SQL docs, and in the section on CREATE I could not find anything that resembles CREATE VIEW AS SELECT or CTAS. I looked a bit further and found the following: Flink SQL allows you to…
Dennis Jaheruddin
  • 21,208
  • 8
  • 66
  • 122
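
For what it's worth, CREATE VIEW ... AS SELECT is supported in Flink SQL (CTAS support depends on the Flink version); a minimal hedged sketch, assuming a registered table `orders(order_id, amount)`:

    // Assumes a TableEnvironment `tEnv` with a table `orders(order_id, amount)` registered.
    tEnv.executeSql(
        "CREATE TEMPORARY VIEW big_orders AS " +
        "SELECT order_id, amount FROM orders WHERE amount > 100");

    // The view can then be queried like any other table.
    tEnv.sqlQuery("SELECT COUNT(*) AS cnt FROM big_orders").execute().print();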