
I want to execute a query on a Flink SQL table backed by a Kafka topic in a secured Kafka cluster. I'm able to execute the query programmatically, but unable to do the same through the Flink SQL client. I'm not sure how to pass the JAAS config (java.security.auth.login.config) and other system properties through the Flink SQL client.

Flink SQL query programmatically

    private static void simpleExec_auth() {

        // Create the execution environment.
        final EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .inStreamingMode()
                .withBuiltInCatalogName("default_catalog")
                .withBuiltInDatabaseName("default_database")
                .build();

        // Kerberos/JAAS system properties; these must be set before the Kafka client is created.
        System.setProperty("java.security.auth.login.config", "client_jaas.conf");
        System.setProperty("sun.security.jgss.native", "true");
        System.setProperty("sun.security.jgss.lib", "/usr/libexec/libgsswrap.so");
        System.setProperty("javax.security.auth.useSubjectCredsOnly", "false");

        TableEnvironment tableEnvironment = TableEnvironment.create(settings);
        String createQuery = "CREATE TABLE test_flink11 ( "
                + "`keyid` STRING, "
                + "`id` STRING, "
                + "`name` STRING, "
                + "`age` INT, "
                + "`color` STRING, "
                + "`rowtime` TIMESTAMP(3) METADATA FROM 'timestamp', "
                + "`proctime` AS PROCTIME(), "
                + "`address` STRING) "
                + "WITH ( "
                + "'connector' = 'kafka', "
                + "'topic' = 'test_flink10', "
                + "'scan.startup.mode' = 'latest-offset', "
                + "'properties.bootstrap.servers' = 'kafka01.nyc.com:9092', "
                + "'value.format' = 'avro-confluent', "
                + "'key.format' = 'avro-confluent', "
                + "'key.fields' = 'keyid', "
                + "'value.fields-include' = 'EXCEPT_KEY', "
                + "'properties.security.protocol' = 'SASL_PLAINTEXT', "
                + "'properties.sasl.kerberos.service.name' = 'kafka', "
                + "'properties.sasl.kerberos.kinit.cmd' = '/usr/local/bin/skinit --quiet', "
                + "'properties.sasl.mechanism' = 'GSSAPI', "
                + "'key.avro-confluent.schema-registry.url' = 'http://kafka-schema-registry:5037', "
                + "'key.avro-confluent.schema-registry.subject' = 'test_flink6', "
                + "'value.avro-confluent.schema-registry.url' = 'http://kafka-schema-registry:5037', "
                + "'value.avro-confluent.schema-registry.subject' = 'test_flink4')";
        System.out.println(createQuery);
        tableEnvironment.executeSql(createQuery);

        TableResult result = tableEnvironment
                .executeSql("SELECT name, rowtime FROM test_flink11");
        result.print();
    }

This is working fine.

Flink SQL query through SQL client

Running this in the SQL client gives the following error:

Flink SQL> CREATE TABLE test_flink11 (
    `keyid` STRING,
    `id` STRING,
    `name` STRING,
    `address` STRING,
    `age` INT,
    `color` STRING
) WITH (
    'connector' = 'kafka',
    'topic' = 'test_flink10',
    'scan.startup.mode' = 'earliest-offset',
    'properties.bootstrap.servers' = 'kafka01.nyc.com:9092',
    'value.format' = 'avro-confluent',
    'key.format' = 'avro-confluent',
    'key.fields' = 'keyid',
    'value.avro-confluent.schema-registry.url' = 'http://kafka-schema-registry:5037',
    'value.avro-confluent.schema-registry.subject' = 'test_flink4',
    'value.fields-include' = 'EXCEPT_KEY',
    'key.avro-confluent.schema-registry.url' = 'http://kafka-schema-registry:5037',
    'key.avro-confluent.schema-registry.subject' = 'test_flink6',
    'properties.security.protocol' = 'SASL_PLAINTEXT',
    'properties.sasl.kerberos.service.name' = 'kafka',
    'properties.sasl.kerberos.kinit.cmd' = '/usr/local/bin/skinit --quiet',
    'properties.sasl.mechanism' = 'GSSAPI'
);

Flink SQL> select * from test_flink11;
[ERROR] Could not execute SQL statement. Reason:
java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config' is /tmp/jaas-6309821891889949793.conf

There is nothing in /tmp/jaas-6309821891889949793.conf except the following comment:

# We are using this file as an workaround for the Kafka and ZK SASL implementation
# since they explicitly look for java.security.auth.login.config property
# Please do not edit/delete this file - See FLINK-3929
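
For comparison, the client_jaas.conf that the working Java snippet points at would normally contain a KafkaClient entry along these lines (a sketch assuming ticket-cache login; the exact login-module options depend on the environment):

KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useTicketCache=true
    serviceName="kafka";
};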

SQL client run command

bin/sql-client.sh embedded --jar  flink-sql-connector-kafka_2.11-1.12.0.jar  --jar flink-sql-avro-confluent-registry-1.12.0.jar

Flink cluster command

bin/start-cluster.sh

How do I pass java.security.auth.login.config and the other system properties (that I set in the Java snippet above) to the SQL client?


1 Answer


flink-conf.yaml

# Option 1: authenticate from the Kerberos ticket cache (requires a valid ticket, e.g. from kinit):
security.kerberos.login.use-ticket-cache: true
security.kerberos.login.principal: XXXXX@HADOOP.COM

# Option 2: authenticate with a keytab instead:
security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: /path/to/kafka.keytab
security.kerberos.login.principal: XXXX@HADOOP.COM

# In either case, make the login available under the JAAS contexts that Kafka and ZooKeeper look up:
security.kerberos.login.contexts: Client,KafkaClient
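
If you also need the other system properties from the Java snippet (sun.security.jgss.native, sun.security.jgss.lib, javax.security.auth.useSubjectCredsOnly), env.java.opts in flink-conf.yaml is one way to pass JVM options to Flink processes. This is an unverified sketch, and /path/to/client_jaas.conf is a placeholder; the file must also be readable on the task managers, where the Kafka consumers actually run:

env.java.opts: -Djava.security.auth.login.config=/path/to/client_jaas.conf -Dsun.security.jgss.native=true -Dsun.security.jgss.lib=/usr/libexec/libgsswrap.so -Djavax.security.auth.useSubjectCredsOnly=false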

I haven't tested whether this solution works; give it a try, and I hope it helps.
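
Another avenue worth trying: the Kafka connector forwards every 'properties.*' key in the WITH clause straight to the Kafka client, and Kafka clients accept an inline JAAS definition via sasl.jaas.config, which avoids the external JAAS file entirely. A sketch of the extra table option (the login-module settings are assumptions; adjust them to your setup):

'properties.sasl.jaas.config' = 'com.sun.security.auth.module.Krb5LoginModule required useTicketCache=true serviceName="kafka";'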
