Questions tagged [flink-sql]

Apache Flink features two relational APIs:

  1. SQL (via Apache Calcite)
  2. Table API, a language-integrated query (LINQ) interface

Both are unified APIs for stream and batch processing: a query returns the same result whether it is applied to a static data set or to a data stream. Queries from both APIs are parsed and optimized by Apache Calcite.

Both APIs are tightly integrated with Flink's DataStream and DataSet APIs.
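
As a minimal illustration of this unification (the table and column names here are hypothetical), a query like the following is valid Flink SQL and yields the same result whether `Orders` is a bounded table or an unbounded stream:

```sql
-- Hypothetical Orders table: the same query works in batch and streaming mode.
-- In streaming mode the result updates continuously; in batch mode it is final.
SELECT user_id, COUNT(*) AS order_cnt
FROM Orders
GROUP BY user_id
```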

667 questions
0 votes, 1 answer

Flink SQL Client environment configuration to read CSV file as source streaming table

I want to try out the MATCH_RECOGNIZE operator in Flink SQL from the SQL Client. For this, I have set up the source table as follows: # A typical table source definition looks like: - name: TaxiRides type: source update-mode:…
Ahmed Awad
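
For readers attempting the same setup, a rough sketch of such a source definition in the SQL Client environment file might look like the following (field names and path are illustrative, and the exact YAML schema depends on the Flink version):

```yaml
tables:
  - name: TaxiRides
    type: source
    update-mode: append
    connector:
      type: filesystem
      path: "/path/to/TaxiRides.csv"   # hypothetical path
    format:
      type: csv
      fields:
        - name: rideId
          type: LONG
        - name: eventTime
          type: TIMESTAMP
    schema:
      - name: rideId
        type: LONG
      - name: eventTime
        type: TIMESTAMP
```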
0 votes, 1 answer

Flink data type does not match when adding time attributes via a table source

I tried to add a table source with an event time attribute according to the Flink docs. My code looks like: class SISSourceTable extends StreamTableSource[Row] with DefinedRowtimeAttributes with FlinkCal with FlinkTypeTags { private[this] val…
K F
0 votes, 0 answers

Flink JobGraph Submission

I am trying to compile a given SQL query into Flink's JobGraph and submit it to YARN. JobGraph jobGraph = streamExecutionEnv.getStreamGraph().getJobGraph(); new YarnDeployer().deployJob(jobGraph); YarnDeployer is a custom class which uses…
user1261215
0 votes, 0 answers

Flink SQL "Result field does not match requested type" error on LocalDateTime

When I group the select below, I get a type-matching error. I have already tried to CAST AS TIMESTAMP and tried to change the POJO's LocalDateTime type. Most of the sample code converts to Row.class; I could not find any custom-class example. SELECT name,…
C.T
0 votes, 1 answer

Optimized Top-N query using Flink SQL

I'm trying to run a streaming top-n query using Flink SQL but can't get the "optimized version" outlined in the Flink docs working. The setting is as follows: I've got a Kafka topic where each record contains a tuple (GUID, reached score, maximum…
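
For context, the "optimized version" in the Flink docs refers to a `ROW_NUMBER()` query whose outer `SELECT` filters on the row number but does not emit it, which lets the planner use a more efficient state layout. A sketch with hypothetical table and column names:

```sql
-- Top 3 scores per guid; omitting row_num from the outer SELECT
-- is what enables Flink's state-optimized Top-N execution.
SELECT guid, score
FROM (
  SELECT guid, score,
         ROW_NUMBER() OVER (PARTITION BY guid ORDER BY score DESC) AS row_num
  FROM scores
)
WHERE row_num <= 3
```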
0 votes, 1 answer

How does the Flink SQL Client distinguish batch mode and stream mode?

As we all know, Flink has two core APIs (DataStream/DataSet), but when I use the Flink SQL Client to submit a job, I do not need to choose stream or batch mode. So how does the Flink SQL Client decide between batch mode and stream mode? I did not find the…
Hensom
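
In Flink versions of that era, the SQL Client reads the mode from its environment file rather than inferring it from the query; a sketch (exact keys may vary by version):

```yaml
# Fragment of the SQL Client environment YAML (e.g. sql-client-defaults.yaml).
execution:
  type: streaming   # or: batch
```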
0 votes, 1 answer

Unable to register Flink TableSource with external connector

I am trying to register a Flink source with the below code snippet, but it fails with an exception: Exception in thread "main" org.apache.flink.table.api.TableException: findAndCreateTableSource failed. at…
0 votes, 1 answer

Flink SQL : Outer Join with Group By gives unexpected output

I have two Flink dynamic tables, Event and Configuration. Event has the structure [id, myTimestamp] and Configuration has the structure [id, myValue, myTimestamp]. I am trying to write a Flink SQL query that returns Event.id, Configuration.myValue, or…
Nakeuh
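
Based on the schemas given in the question, one way such a query might be written (the aggregate choice is illustrative):

```sql
-- Left outer join keeps Events without a matching Configuration;
-- myValue is NULL for those rows and becomes its own group.
SELECT e.id, c.myValue, MAX(e.myTimestamp) AS lastEventTime
FROM Event e
LEFT OUTER JOIN Configuration c ON e.id = c.id
GROUP BY e.id, c.myValue
```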
0 votes, 1 answer

Flink SQL CURRENT_TIMESTAMP always returns the same value

I am using the Flink SQL API in Flink 1.8. I have two stream tables, Table1 and Table2. If we define receivedTime as the time at which the data was received in a table, I want to join Table1 and Table2 (on some id) and keep only the rows where…
Nakeuh
0 votes, 1 answer

Flink SQL : Joining Tables with timestamp in pure SQL syntax

I am having trouble using Flink's SQL syntax to join multiple tables when at least one of them has a time attribute column. I have a table Table1 with the schema (id, value1, rowtime), where rowtime is used as a Flink rowtime. I want to…
Nakeuh
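
When a table has a rowtime attribute, Flink requires a time-windowed (interval) join for the attribute to survive the join; a sketch using the question's Table1 schema plus a hypothetical Table2(id, value2, rowtime):

```sql
-- Interval join: the BETWEEN predicate on the two rowtime attributes
-- bounds the state Flink must keep and preserves the time attribute.
SELECT t1.id, t1.value1, t2.value2
FROM Table1 t1, Table2 t2
WHERE t1.id = t2.id
  AND t1.rowtime BETWEEN t2.rowtime - INTERVAL '5' MINUTE AND t2.rowtime
```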
0 votes, 1 answer

Applying window based rules in Apache Flink Broadcast stream

I have a set of rules in my BroadcastStream in Apache Flink. I am able to apply new rules as they arrive to my stream of events. But I cannot figure out how to implement rules like: rule 1> alert when the count of event a is greater…
sky
0 votes, 1 answer

Does Flink have a "distribute by" command (similar to Spark)?

I want to shuffle the data by key and write it to a sink. I'm not using GROUP BY because I don't need any aggregation. So I wonder whether Flink SQL has a keyword like "distribute by" in Spark.
yunfan
0 votes, 1 answer

Flink broadcast state with more than 1 parallelism

Let me just put it out there: I am a complete beginner with Flink and am trying to grasp the concepts as much as possible. Let's say I have a Flink cluster with 10 task managers, with a Flink job running on each. The job uses broadcast state as well. This broadcast…
Gaurav Kumar
0 votes, 1 answer

Broadcast "JOIN" in Flink

Is there any way I can use a broadcast join in Flink the same way I do in Spark? I'm working with joins, but the data is large, so I would need a broadcast join. Thank you.
ASK5
0 votes, 0 answers

MATCH_RECOGNIZE example in Scala

I'm trying to write CEP code using Flink SQL and would like to get started with this. It would be great if you could share a snippet of the code with me. MATCH_RECOGNIZE is what I want to use. Example on:…
ASK5
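
For anyone landing here, a minimal MATCH_RECOGNIZE sketch against a hypothetical Ticker(symbol, price, rowtime) table; in Scala the same query string can be passed to tableEnv.sqlQuery(...):

```sql
-- Match runs of strictly increasing prices per symbol.
SELECT *
FROM Ticker
MATCH_RECOGNIZE (
  PARTITION BY symbol
  ORDER BY rowtime
  MEASURES
    A.price       AS startPrice,
    LAST(B.price) AS peakPrice
  ONE ROW PER MATCH
  AFTER MATCH SKIP PAST LAST ROW
  PATTERN (A B+)
  DEFINE
    B AS B.price > A.price
) AS T
```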