
I am currently working on a project to extend the functionalities of SQL to support more stream computing features based on Apache Flink.

After doing extensive research, I found that Calcite is a great tool to help me parse, validate, and optimize those SQL queries, but its streaming support is still immature, so I have to improve it to suit my needs.

Hence, I would like to know if there is a way to add custom clauses like

CREATE TABLE my_table (
    id   bigint,
    user varchar(20)
) PARAMS (
    connector 'kafka',
    topic     'my_topic'
)

which uses PARAMS to define how to receive data from a Kafka connector, so that the result can be treated as a dynamic table serving as a data source for Flink.

Since there is so little information about this, I would greatly appreciate it if one of you could provide some hints.

Thank you : )

Kyle Dong

1 Answer


Before the latest release (1.15.0, released 11 Dec. 2017), Apache Calcite did not support DDL statements such as CREATE TABLE or DROP TABLE. The reason was that

SELECT and DML are standardized, but DDL tends to be database-specific, so our policy is that you make DDL extensions outside of Calcite.

(see Calcite dev mailing list).

With Calcite 1.15.0, the community added basic support for DDL statements. The feature is implemented as an optional module and demonstrates how to customize DDL statements (see the documentation). So it is still expected that systems built on Calcite customize the parser and DDL syntax to their own needs.
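For anyone attempting this: Calcite's parser is generated with JavaCC from FreeMarker templates, and the extension mechanism lets you register extra keywords and statement productions. Below is a rough sketch of what a PARAMS extension could look like. The file names (config.fmpp, parserImpls.ftl) follow Calcite's conventions, but the PARAMS keyword, the production name, the SqlCreateTableWithParams node class, and the ParenthesizedKeyValueList() helper are hypothetical illustrations, not actual Calcite code:

```
# config.fmpp -- register the custom keyword and statement production
# (structure follows Calcite's parser templates; entries here are assumptions)
data: {
  parser: {
    # "PARAMS" becomes a keyword recognized by the generated parser
    keywords: [ "PARAMS" ]
    # Entry point for statements that do not start with SELECT/INSERT/etc.
    statementParserMethods: [ "SqlCreateTableWithParams()" ]
    # Grammar fragments to splice into the generated parser
    implementationFiles: [ "parserImpls.ftl" ]
  }
}

// parserImpls.ftl -- illustrative JavaCC production (hypothetical)
SqlNode SqlCreateTableWithParams() :
{
    final SqlParserPos pos;
    final SqlIdentifier name;
    final SqlNodeList columns;
    SqlNodeList params = null;
}
{
    <CREATE> { pos = getPos(); } <TABLE>
    name = CompoundIdentifier()
    columns = ParenthesizedSimpleIdentifierList()
    [ <PARAMS> params = ParenthesizedKeyValueList() ]
    {
        // SqlCreateTableWithParams would be your own SqlCall subclass that
        // carries the connector properties alongside the column definitions.
        return new SqlCreateTableWithParams(pos, name, columns, params);
    }
}
```

As a side note, Flink's own SQL DDL later took a very similar shape with its WITH ('connector' = 'kafka', 'topic' = '...') clause.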

Fabian Hueske
  • I looked at the calcite-server code and was not able to figure out if there is a way to get the SQL DDL Query from a Schema. Just to give you more context, I have created a schema and tables using some configuration file. Now, I was wondering if I can get the DDL for SQL table creation somehow. – AKG Jun 21 '18 at 18:50