Questions tagged [flink-sql]

Apache Flink features two relational APIs:

  1. SQL (via Apache Calcite)
  2. Table API, a language-integrated query (LINQ) interface

Both are unified APIs for stream and batch processing: a query returns the same result whether it is applied to a static data set or to a data stream. Queries from both APIs are parsed and optimized by Apache Calcite.

Both APIs are tightly integrated with Flink's DataStream and DataSet APIs.
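
As a minimal illustration of the SQL API (the table and column names here are hypothetical), the same query can run unchanged over a bounded batch table or an unbounded stream:

```sql
-- Hypothetical table `Orders`: in batch mode this returns a final
-- count per product; in streaming mode it continuously updates it.
SELECT product, COUNT(*) AS cnt
FROM Orders
GROUP BY product
```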

667 questions
0 votes, 1 answer

How to complete aggregation task with flink cep

I need to count the number of times in a day that A happens and, within 15 minutes, B happens. The stream may be A1, A2, B1, B2, A3, B3, B4, B5, A4, A5, A6, A7, B6. In my case the event results are A2,B1 A3,B3 A7,B6. And I need to receive the real-time result…
B.king
0 votes, 1 answer

Flink SQL Job runs out of heap space

I am running a query to join a stream and a table, as below. It is running out of heap space, even though the Flink cluster has enough heap space (60 GB × 3). Is an eviction strategy needed for this query? SELECT sourceKafka.* FROM…
0 votes, 1 answer

Flink: Convert a retracting SQL to an appending SQL, using only SQL, to feed a temporal table

I am providing a Flink SQL interface to users, so I can't really use the Table or Java/Scala interface. Everything needs to be specified in SQL. I can parse comments in the SQL files, though, and add specified ad hoc lower-level API instructions. How…
0 votes, 1 answer

How do we use query configurations while using SQL client in Flink SQL?

How do we use query configurations while using the SQL client in Flink SQL? In the same fashion as mentioned in the link below: https://ci.apache.org/projects/flink/flink-docs-release-1.9/dev/table/streaming/query_configuration.html Want to use Idle…
0 votes, 1 answer

How do we window join using SQL client in Flink SQL query?

How do we window join using the SQL client in a Flink SQL query? Windowing in the same fashion as mentioned in the link below: https://ci.apache.org/projects/flink/flink-docs-release-1.9/dev/stream/operators/joining.html Sample query that requires…
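
For context, Flink SQL can express a time-windowed join directly as an interval join; a hedged sketch, assuming two hypothetical tables `Orders` and `Shipments` that both have a rowtime attribute:

```sql
-- Join each order with shipments that occur within 10 minutes
-- after the order's event time.
SELECT o.id, o.product, s.status
FROM Orders o, Shipments s
WHERE o.id = s.order_id
  AND s.rowtime BETWEEN o.rowtime AND o.rowtime + INTERVAL '10' MINUTE
```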
0 votes, 0 answers

Flink SQL Match_Recognize giving incomplete results

I have the following data given to Flink as a stream:

ID   Val  eventTime.rowtime
266  25   9000
266  22   10000
266  19   11000
266  18   12000
266  16   13000
266  15   14000
266  14   15000
266  13   16000
266  14   17000
266  15   18000
266  17   19000
266  18   20000
266…
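
For reference, the general shape of a MATCH_RECOGNIZE query over such a stream looks like the sketch below; the pattern and measures are illustrative assumptions, not the asker's actual query:

```sql
SELECT *
FROM input_table
  MATCH_RECOGNIZE (
    PARTITION BY ID
    ORDER BY eventTime
    MEASURES
      A.Val AS startVal,
      LAST(B.Val) AS lastVal
    ONE ROW PER MATCH
    AFTER MATCH SKIP PAST LAST ROW
    PATTERN (A B+)
    DEFINE
      B AS B.Val < PREV(B.Val)  -- B rows form a strictly decreasing run
  ) AS MR
```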
0 votes, 1 answer

Flink Autojoin with rowtime column

I have a Flink table with the following structure: Id1, Id2, myTimestamp, value, where the rowtime is based on myTimestamp. I have the following processing that works well: Table processed = tableEnv.sqlQuery("SELECT " + "Id1, "…
Nakeuh
0 votes, 0 answers

Event processing by using Flink SQL API

My use case: collect events for a particular duration and then group them based on the key. Objective: after processing, the user can save data of a particular duration based on the key. How I am planning to do it: 1) Receive events from Kafka 2) Create data…
flinkuser
0 votes, 1 answer

Flink Scala NotInferedR in scala Type mismatch MapFunction[Tuple2[Boolean,Row],InferedR]

I am trying to convert Tuple2[Boolean,Row] to Row in a Flink MapFunction, which fails with an error. When I try to run it, I get another error. What I am trying to do: val data = kinesis.map(mapFunction) …
0 votes, 1 answer

How to query flink's queryable state

I am using flink 1.8.0 and I am trying to query my job state. val descriptor = new ValueStateDescriptor("myState", Types.CASE_CLASS[Foo]) descriptor.setQueryable("my-queryable-State") I used port 9067 which is the default port according to…
igx
0 votes, 1 answer

Dynamic SQL Query in Flink

I have a SQL query like this String ipdetailsSql = "select sid, _zpsbd6 as ip_address, ssresp, reason, " + "SUM(CASE WHEN botcode='r1' THEN 1 ELSE 0 END ) as icf_count, " + "SUM(CASE WHEN botcode='r2' THEN 1 ELSE 0 END ) as…
Ravi Shanker Reddy
0 votes, 1 answer

Create SQL Table from a DataStream in a Java/Scala program and query it from SQL Client CLI - Apache Flink

Is it possible to interact, via the Flink SQL Client CLI, with a table that was created within a Scala/Java program running on the cluster?
salc2
0 votes, 1 answer

How to check if a string is a number in flink sql

In Flink SQL, how can I check whether a string is a number? In select * from input where str like '\\d+', the regular expression seems not to work, and the SIMILAR TO operator doesn't work either. Any ideas?
lzh
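
One hedged approach, assuming a reasonably recent Flink version that provides the REGEXP built-in function (table and column names as in the question):

```sql
-- REGEXP(str, pattern) returns TRUE when str matches the Java
-- regular expression; the anchors make it match the whole string.
SELECT * FROM input WHERE REGEXP(str, '^[0-9]+$')
```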
0 votes, 1 answer

Why do my Flink SQL queries have very different checkpoint sizes?

When using Flink Table SQL in my project, I found that if there is any GROUP BY clause in my SQL, the size of the checkpoint increases vastly. For example, INSERT INTO COMPANY_POST_DAY SELECT sta_date, company_id, company_name FROM …
Ruoyu Dai
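
The growth is expected: a plain GROUP BY over an unbounded stream must keep state for every distinct key indefinitely unless idle state retention is configured. One sketch of a bounded alternative is to group by a time window, so Flink can drop window state after it fires; the source table name and its `rowtime` attribute below are assumptions, not taken from the excerpt:

```sql
INSERT INTO COMPANY_POST_DAY
SELECT sta_date, company_id, company_name
FROM source_table
GROUP BY TUMBLE(rowtime, INTERVAL '1' DAY), sta_date, company_id, company_name
```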
0 votes, 1 answer

use Flink to process kafka messages in the past 10 minutes?

We are considering using Flink SQL for ad hoc analytics on real-time Kafka data from the past 5-10 minutes. To achieve that, it seems we need to extend the Kafka connector to have it read only messages from a given period of time, and use that…
yuyang