Questions tagged [flink-table-api]

Apache Flink is an open source platform for scalable batch and stream data processing. Flink supports batch and streaming analytics in one system. Analytical programs can be written in concise and elegant APIs in Java and Scala. Apache Flink features the Table API (and SQL API) as unified APIs for stream and batch processing.

81 questions
0 votes, 1 answer

Java Flink 1.13 Exception trying to use Table API

This is the error I'm getting when trying to initialize a StreamTableEnvironment: private final StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env); Exception in thread "main" java.lang.NoSuchMethodError:…
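
A NoSuchMethodError at this call usually points at mismatched Flink artifact versions on the classpath rather than at the call itself. A minimal sketch of the expected Flink 1.13 usage, assuming matching flink-streaming-java and flink-table-api-java-bridge dependencies; the datagen smoke test is only illustrative:

```
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class TableEnvBootstrap {
    public static void main(String[] args) {
        // The table environment is created on top of a stream execution environment.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // Smoke test: a bounded datagen source printed to stdout.
        tableEnv.executeSql(
            "CREATE TABLE src (x INT) WITH ('connector' = 'datagen', 'number-of-rows' = '5')");
        tableEnv.executeSql("SELECT x FROM src").print();
    }
}
```
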
0 votes, 1 answer

Datatype extraction not working when registering UDF with .createTemporaryFunction()

I have a table "vertices" with a custom datatype "Properties", which implements a HashMap and is interpreted by Flink as RAW('org...impl.properties.Properties', '...') datatype. PropertyValue is also a custom datatype.…
Max
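
When the type extractor cannot map a custom class such as Properties, one common workaround is to widen the UDF argument with a DataTypeHint so extraction is skipped for it. A minimal sketch, assuming a hypothetical DescribeProperties function and an existing tableEnv (class and registration name are illustrative):

```
import org.apache.flink.table.annotation.DataTypeHint;
import org.apache.flink.table.annotation.InputGroup;
import org.apache.flink.table.functions.ScalarFunction;

// Accepts any argument type (including columns that Flink models as RAW)
// and avoids the automatic data type extraction that fails for custom classes.
public class DescribeProperties extends ScalarFunction {
    public String eval(@DataTypeHint(inputGroup = InputGroup.ANY) Object properties) {
        return properties == null ? "<null>" : properties.toString();
    }
}

// Registration against an existing StreamTableEnvironment:
// tableEnv.createTemporaryFunction("describe_properties", DescribeProperties.class);
```
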
0 votes, 1 answer

Flink Table print connector not being called

I am using the Flink Table API to pull data from a Kinesis topic into a table. I want to periodically pull that data into a temporary table and run a custom scalar function on it. However, I notice that my scalar function is not being called at…
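
Worth noting: the print connector only produces output when rows are actually delivered to it by an executed INSERT, not when the sink table is merely defined. A minimal sketch against an existing tableEnv, with a hypothetical source_table and column names:

```
// Define a print sink; output appears in the TaskManager logs / stdout.
tableEnv.executeSql(
    "CREATE TABLE debug_sink (user_id STRING, cnt BIGINT) WITH ('connector' = 'print')");

// Any scalar function or expression in the query is evaluated only when this
// INSERT actually runs as part of a submitted job.
tableEnv.executeSql(
    "INSERT INTO debug_sink " +
    "SELECT user_id, COUNT(*) FROM source_table GROUP BY user_id");
```
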
0 votes, 1 answer

Flink failed to trigger checkpoint when using table API

My Flink streaming application (v1.14.4) contains a JDBC connector used for an initial fetch of data from a MySQL server. Logic: JDBC table source -> select.where() -> convert to datastream; Kafka datastream joined with the JDBC table -> further computation. When I run…
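
If the JDBC source is bounded, its tasks finish early, and checkpoints could not be taken after some tasks had finished until Flink 1.14 added an opt-in flag for exactly that. A minimal sketch, assuming this setting is the relevant one here:

```
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

Configuration conf = new Configuration();
// Opt-in (Flink 1.14): allow checkpoints to continue after some tasks have finished,
// e.g. after a bounded JDBC source has emitted all of its rows.
conf.setString("execution.checkpointing.checkpoints-after-tasks-finish.enabled", "true");

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment(conf);
env.enableCheckpointing(60_000L);
```
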
0 votes, 1 answer

Why doesn't the Flink Table/SQL API upsert-kafka sink connector create a log-compacted topic?

I'm trying to replicate Flink's upsert-kafka connector example. Using the following input: event_id,user_id,page_id,user_region,viewtime e0,1,11,TR,2022-01-01T13:26:41.298Z e1,1,22,TR,2022-01-02T13:26:41.298Z e2,2,11,AU,2022-02-01T13:26:41.298Z and…
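
For reference, the upsert-kafka connector requires a declared primary key, and log compaction is a property of the Kafka topic itself (cleanup.policy=compact), not something the connector configures when the topic is auto-created. A minimal sink sketch against an existing tableEnv, with hypothetical topic name and broker address:

```
tableEnv.executeSql(
    "CREATE TABLE pageviews_per_region (" +
    "  user_region STRING," +
    "  pv BIGINT," +
    "  PRIMARY KEY (user_region) NOT ENFORCED" +   // upsert key -> Kafka record key
    ") WITH (" +
    "  'connector' = 'upsert-kafka'," +
    "  'topic' = 'pageviews_per_region'," +
    "  'properties.bootstrap.servers' = 'localhost:9092'," +
    "  'key.format' = 'json'," +
    "  'value.format' = 'json'" +
    ")");
```
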
0 votes, 2 answers

How to create a DataStreamSource from a Mysql Database?

I have a problem running a Flink job that basically runs a query against a MySQL database and then tries to create a temporary view that must be accessed from a different job. public static void main(String[] args) throws Exception { …
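
A sketch of one way to get MySQL rows into a DataStream through the Table API, assuming hypothetical connection details and schema. Note that temporary views live only inside the TableEnvironment that created them, so a second, separately submitted job cannot see them:

```
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.table.api.Table;
import org.apache.flink.types.Row;

// JDBC table backed by MySQL (bounded scan source).
tableEnv.executeSql(
    "CREATE TABLE users (" +
    "  id BIGINT," +
    "  name STRING," +
    "  PRIMARY KEY (id) NOT ENFORCED" +
    ") WITH (" +
    "  'connector' = 'jdbc'," +
    "  'url' = 'jdbc:mysql://localhost:3306/mydb'," +
    "  'table-name' = 'users'," +
    "  'username' = 'user'," +
    "  'password' = 'secret'" +
    ")");

Table result = tableEnv.sqlQuery("SELECT id, name FROM users");
// Bridge back to the DataStream API (Flink 1.13+).
DataStream<Row> stream = tableEnv.toDataStream(result);
```
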
0 votes, 1 answer

Flink fails to load class from JAR added via PipelineOptions

I am developing a Java application which uses UDFs on Flink 1.14. I am using the PipelineOptions.JARS config to add jar files containing UDF classes dynamically in the application code. However, the application fails to load the UDF class from the configured jar…
Pouria
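
A sketch of how PipelineOptions.JARS is usually set before the environment is created, with a hypothetical jar path; the classes in that jar still have to be resolvable by the user-code classloader at the point where the UDF is registered:

```
import java.util.Collections;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.configuration.PipelineOptions;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

Configuration conf = new Configuration();
// Ship an extra jar with the job graph; the path below is illustrative.
conf.set(PipelineOptions.JARS, Collections.singletonList("file:///opt/udfs/my-udfs.jar"));

StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment(conf);
```
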
0 votes, 0 answers

No results when applying a Flink tumble window in the Flink Table API

I hit the same problem while working through https://nightlies.apache.org/flink/flink-docs-release-1.14/zh/docs/try-flink/table_api/. When I use a tumble window, I get nothing in the MySQL sink. return transactions …
王京东
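
For comparison, this is the shape of the tumbling-window aggregation in that walkthrough, on a hypothetical transactions table with the walkthrough's column names. With event time, nothing reaches the sink until watermarks pass the end of a window, which is a frequent reason for an apparently empty MySQL table:

```
import static org.apache.flink.table.api.Expressions.$;
import static org.apache.flink.table.api.Expressions.lit;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.Tumble;

Table report = transactions
    // one-hour tumbling windows over the event-time attribute transaction_time
    .window(Tumble.over(lit(1).hours()).on($("transaction_time")).as("log_ts"))
    .groupBy($("account_id"), $("log_ts"))
    .select(
        $("account_id"),
        $("log_ts").start().as("log_ts"),
        $("amount").sum().as("amount"));
```
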
0 votes, 2 answers

No results in kafka topic sink when applying tumble window aggregation in Flink Table API

I am using Flink 1.14 deployed by the Lyft Flink operator. I am trying to compute a tumbling-window aggregate with the Table API, reading from the transactions table source and putting the per-window aggregate results into a new Kafka topic. My source is a Kafka topic…
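
In Flink 1.13+ the same aggregation can also be written with the TUMBLE window TVF in SQL; windowed aggregates are append-only, so a plain kafka sink table can receive them. A sketch against an existing tableEnv, with hypothetical table names and a hypothetical event-time column ts:

```
tableEnv.executeSql(
    "INSERT INTO transactions_agg " +                 // kafka sink table, defined elsewhere
    "SELECT window_start, window_end, SUM(amount) AS amount " +
    "FROM TABLE(" +
    "  TUMBLE(TABLE transactions, DESCRIPTOR(ts), INTERVAL '10' MINUTES)) " +
    "GROUP BY window_start, window_end");
```
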
0 votes, 2 answers

Could not find any factory for identifier 'avro-confluent' that implements 'org.apache.flink.table.factories.DeserializationFormatFactory'

I have a Flink job that runs well locally but fails when I try to flink run the job on a cluster. The error happens when trying to load data from Kafka via 'connector' = 'kafka'. I am using the Flink Table API and the avro-confluent format for reading data…
Kiran Ashraf
0 votes, 0 answers

Apache Flink version 1.13: Convert Table to DataSet?

I am converting some legacy Java code written for Flink version 1.5 to Flink version 1.13.1. Specifically, I'm working with the Table API. I have to read data from a CSV file, perform some basic SQL, and then write the results back to a file. For Flink version…
F Baig
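
In 1.13 the DataSet bridge is legacy; the read-CSV, run-SQL, write-results use case can stay entirely inside a batch-mode TableEnvironment. A minimal sketch, with hypothetical paths and schema:

```
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class CsvBatchJob {
    public static void main(String[] args) {
        // Unified TableEnvironment in batch execution mode (no DataSet API involved).
        TableEnvironment tableEnv =
            TableEnvironment.create(EnvironmentSettings.newInstance().inBatchMode().build());

        tableEnv.executeSql(
            "CREATE TABLE input (id BIGINT, amount DOUBLE) WITH (" +
            " 'connector' = 'filesystem', 'path' = '/tmp/input.csv', 'format' = 'csv')");
        tableEnv.executeSql(
            "CREATE TABLE output (id BIGINT, total DOUBLE) WITH (" +
            " 'connector' = 'filesystem', 'path' = '/tmp/output', 'format' = 'csv')");

        // Runs the batch job and writes the aggregated result.
        tableEnv.executeSql("INSERT INTO output SELECT id, SUM(amount) FROM input GROUP BY id");
    }
}
```
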
0 votes, 1 answer

Flink Table: creating a table with an Array type fails with "ValidationException"

I created a Flink table that contains array-type fields and got an error that the types do not match. I want to know how to create a temporary table containing an array type with the Flink Table API. public class FlinkConnectorClickhouse { public static void…
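
For reference, array columns are declared with ARRAY<...> in DDL. A minimal temporary-table sketch against an existing tableEnv, with a hypothetical schema and a print connector so any row shape is accepted:

```
tableEnv.executeSql(
    "CREATE TEMPORARY TABLE events (" +
    "  id STRING," +
    "  tags ARRAY<STRING>," +        // array of strings
    "  scores ARRAY<DOUBLE>" +       // array of doubles
    ") WITH ('connector' = 'print')");
```
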
0 votes, 2 answers

Flink table exception : Window aggregate can only be defined over a time attribute column, but TIMESTAMP(6) encountered

I am using Flink 1.12.0. I am trying to convert a data stream into a table A and run a SQL query on tableA to aggregate over a window, as below. I am using the f2 column as it is a timestamp field. EnvironmentSettings fsSettings =…
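
The exception means f2 is a plain TIMESTAMP column rather than a time attribute; in 1.12 the fromDataStream variant with expressions can declare a rowtime attribute, provided the stream already carries timestamps and watermarks. A sketch, assuming a hypothetical stream of Tuple3 values with epoch millis in f2 and an existing tableEnv:

```
import static org.apache.flink.table.api.Expressions.$;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.java.tuple.Tuple3;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.table.api.Table;

// Assign timestamps/watermarks first; rowtime attributes are derived from them.
DataStream<Tuple3<String, Double, Long>> withTs = stream.assignTimestampsAndWatermarks(
    WatermarkStrategy.<Tuple3<String, Double, Long>>forMonotonousTimestamps()
        .withTimestampAssigner((event, ts) -> event.f2));

// Declare f2 as the event-time (rowtime) attribute; window aggregates can now use it.
Table tableA = tableEnv.fromDataStream(withTs, $("f0"), $("f1"), $("f2").rowtime());
```
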
0 votes, 1 answer

Converting Flink dynamic table into a Pandas dataframe

I'm using the PyFlink Table API to read data from Kafka. Now I want to convert the resultant table into a Pandas dataframe. Here is my code: exec_env = StreamExecutionEnvironment.get_execution_environment() exec_env.set_parallelism(1) t_config =…
0 votes, 2 answers

PyFlink JDBC PostgreSQL catalog throwing errors for data type UUID. How to handle the UUID datatype in the Flink Table API?

Apache Flink 1.11.0, Python Table API, catalog: PostgreSQL. Reading and writing data from PostgreSQL catalog tables which contain UUID data type columns through the Table API throws an unsupportedOperatorException for the UUID data type. How to handle the UUID…
bharath