
This question/answer is the closest to mine, but the solution there does not work for me.

My Flink job runs fine when built and run with sbt against the Flink dependencies, but fails when I submit an assembled fat JAR of the same job to an already running Flink cluster (I must eventually submit this way, remotely). The error happens when trying to write data via the Table API with `'connector' = 'jdbc'`, `'url' = 'jdbc:postgresql://...'`, and/or `'driver' = 'org.postgresql.Driver'`.

(I also tried various shading combinations, as suggested in other SO posts.)

Why can't the complaining calling class in flink.connector.jdbc find the implementation right there in the same package?

    package org.apache.flink.connector.jdbc.dialect.psql;

    /** Factory for {@link PostgresDialect}. */
    @Internal
    public class PostgresDialectFactory implements JdbcDialectFactory { ...
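For background on why "same package" is not enough: `JdbcDialectLoader` discovers factories via `java.util.ServiceLoader`, which only sees implementations listed in a `META-INF/services/<fully.qualified.Interface>` resource on the classpath; a class merely sitting in the same package is invisible to it. A minimal, self-contained sketch of that mechanism (toy names, nothing Flink-specific):

```scala
import java.util.ServiceLoader

// Toy service interface, purely illustrative (not a Flink type).
trait Dialect { def name: String }

// An implementation in the "same package" that is nevertheless NOT found,
// because no META-INF/services/Dialect registration file lists it.
class ToyPostgresDialect extends Dialect { def name = "postgres" }

object ServiceLoaderDemo extends App {
  // ServiceLoader only returns implementations named in a registration file;
  // without one, the iterator is empty even though ToyPostgresDialect exists.
  val it = ServiceLoader.load(classOf[Dialect]).iterator()
  println(s"found any dialects: ${it.hasNext}") // found any dialects: false
}
```

This suggests the fat-jar assembly is dropping or mangling the connector's service registration file, rather than the class itself being missing.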

It happens regardless of whether I use only Java or Scala (with the corresponding Table API library), built with sbt.

Here is the error trace, which also shows the relevant options fed into the Table API; further below is the sbt file.

ERROR org.apache.flink.runtime.webmonitor.handlers.JarRunHandler - Exception occurred in REST handler: Could not execute application.
...
**Caused by:** org.apache.flink.table.api.ValidationException: Unable to create a sink for writing table 'default_catalog.default_database.output_table'.
Table options are:
'connector'='jdbc'
'driver'='org.postgresql.Driver'
'password'='******'
'table-name'='spend_report'
'url'='jdbc:postgresql://...'
'username'='qwerty'
    at org.apache.flink.table.factories.FactoryUtil.createDynamicTableSink(FactoryUtil.java:262) ~[flink-table-api-java-uber-1.15.2.jar:1.15.2]
...
**Caused by:** java.lang.IllegalStateException: Could not find any jdbc dialect factories that implement 'org.apache.flink.connector.jdbc.dialect.JdbcDialectFactory' in the classpath.
    at org.apache.flink.connector.jdbc.dialect.JdbcDialectLoader.load(JdbcDialectLoader.java:54) ~[?:?]
    at org.apache.flink.connector.jdbc.table.JdbcDynamicTableFactory.validateConfigOptions(JdbcDynamicTableFactory.java:244) ~[?:?]
    at org.apache.flink.connector.jdbc.table.JdbcDynamicTableFactory.createDynamicTableSink(JdbcDynamicTableFactory.java:85) ~[?:?]
    at org.apache.flink.table.factories.FactoryUtil.createDynamicTableSink(FactoryUtil.java:259) ~[flink-table-api-java-uber-1.15.2.jar:1.15.2]
    at org.apache.flink.table.planner.delegation.PlannerBase.getTableSink(PlannerBase.scala:434) ~[?:?]
    at org.apache.flink.table.planner.delegation.PlannerBase.translateToRel(PlannerBase.scala:227) ~[?:?]
    at org.apache.flink.table.planner.delegation.PlannerBase.$anonfun$translate$1(PlannerBase.scala:185) ~[?:?]
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:233) ~[flink-scala_2.12-1.15.2.jar:1.15.2]
    at scala.collection.Iterator.foreach(Iterator.scala:937) ~[flink-scala_2.12-1.15.2.jar:1.15.2]
    at scala.collection.Iterator.foreach$(Iterator.scala:937) ~[flink-scala_2.12-1.15.2.jar:1.15.2]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1425) ~[flink-scala_2.12-1.15.2.jar:1.15.2]
    at scala.collection.IterableLike.foreach(IterableLike.scala:70) ~[flink-scala_2.12-1.15.2.jar:1.15.2]
    at scala.collection.IterableLike.foreach$(IterableLike.scala:69) ~[flink-scala_2.12-1.15.2.jar:1.15.2]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-scala_2.12-1.15.2.jar:1.15.2]
    at scala.collection.TraversableLike.map(TraversableLike.scala:233) ~[flink-scala_2.12-1.15.2.jar:1.15.2]
    at scala.collection.TraversableLike.map$(TraversableLike.scala:226) ~[flink-scala_2.12-1.15.2.jar:1.15.2]
    at scala.collection.AbstractTraversable.map(Traversable.scala:104) ~[flink-scala_2.12-1.15.2.jar:1.15.2]
    at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:185) ~[?:?]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1656) ~[flink-table-api-java-uber-1.15.2.jar:1.15.2]

build.sbt:

    javacOptions in (Compile, compile) ++= Seq("-source", "11", "-target", "11", "-g:lines")
    
    crossPaths := false // drop off Scala suffix from artifact names.
    autoScalaLibrary := false // exclude scala-library from dependencies
    
    libraryDependencies += "org.apache.flink" % "flink-table-api-java" % "1.15.2" % "provided"
    libraryDependencies += "org.apache.flink" % "flink-table-planner-loader" % "1.15.2" % "provided"
    libraryDependencies += "org.apache.flink" % "flink-connector-jdbc" % "1.15.2"
    
    // libraryDependencies += "org.apache.flink" % "flink-runtime" % "1.15.2" % "provided" // for sbt run, which works
    // libraryDependencies += "org.apache.flink" % "flink-table-runtime" % "1.15.2" % "provided" // for sbt run
    
    // this works for csv
    libraryDependencies += "org.apache.flink" % "flink-connector-files" % "1.15.2"
    libraryDependencies += "org.apache.flink" % "flink-csv" % "1.15.2"
    libraryDependencies += "org.apache.flink" % "flink-clients" % "1.15.2"
    
    // JDBC driver for PostgreSQL
    libraryDependencies += "org.postgresql" % "postgresql" % "42.6.0"
    
    assembly / assemblyMergeStrategy := {
      case "META-INF/services/org.apache.flink.table.factories.Factory" => MergeStrategy.concat
      case "META-INF/services/org.apache.flink.table.factories.TableFactory" => MergeStrategy.concat
      case PathList("META-INF", xs @ _*) => MergeStrategy.discard
      case x => MergeStrategy.first
    }
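One observation on the merge strategy above, reconstructed from the error message rather than confirmed: the dialects are looked up through `org.apache.flink.connector.jdbc.dialect.JdbcDialectFactory`, so the connector's ServiceLoader registration file `META-INF/services/org.apache.flink.connector.jdbc.dialect.JdbcDialectFactory` has to survive assembly, yet the blanket `PathList("META-INF", xs @ _*) => MergeStrategy.discard` case throws away every `META-INF/services` entry except the two table-factory files listed explicitly. A sketch of a merge strategy that keeps all ServiceLoader registrations instead (assuming sbt-assembly):

```scala
assembly / assemblyMergeStrategy := {
  // Concatenate EVERY ServiceLoader registration file, including
  // META-INF/services/org.apache.flink.connector.jdbc.dialect.JdbcDialectFactory,
  // which JdbcDialectLoader uses to discover dialect implementations.
  case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
  case PathList("META-INF", xs @ _*)             => MergeStrategy.discard
  case x                                         => MergeStrategy.first
}
```

The order matters: the `services` case must precede the general `META-INF` discard, since the first matching case wins.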

Contents from Flink Docker image which is the running cluster:

@jobmanager:/opt/flink/lib# ls -alh
2022 flink-cep-1.15.2.jar
-rw-r--r-- 1 flink flink 475K Aug 17  2022 flink-connector-files-1.15.2.jar
-rw-r--r-- 1 root  root  386K Mar 31 13:04 flink-connector-kafka_2.12-1.15.2.jar
-rw-r--r-- 1 flink flink  93K Aug 17  2022 flink-csv-1.15.2.jar
-rw-r--r-- 1 flink flink 111M Aug 17  2022 flink-dist-1.15.2.jar
-rw-r--r-- 1 flink flink 172K Aug 17  2022 flink-json-1.15.2.jar
-rw-r--r-- 1 flink flink  21M Aug 17  2022 flink-scala_2.12-1.15.2.jar
-rw-r--r-- 1 flink flink  11M Feb  8  2022 flink-shaded-zookeeper-3.5.9.jar
-rw-r--r-- 1 root  root  208K Mar 31 13:04 flink-streaming-scala_2.12-1.15.2.jar
-rw-r--r-- 1 flink flink  15M Aug 17  2022 flink-table-api-java-uber-1.15.2.jar
-rw-r--r-- 1 flink flink  35M Aug 17  2022 flink-table-planner-loader-1.15.2.jar
-rw-r--r-- 1 flink flink 2.9M Aug 17  2022 flink-table-runtime-1.15.2.jar
-rw-r--r-- 1 root  root  4.5M Mar 31 13:05 kafka-clients-2.8.2.jar
-rw-r--r-- 1 flink flink 204K Jan  4  2022 log4j-1.2-api-2.17.1.jar
-rw-r--r-- 1 flink flink 295K Jan  4  2022 log4j-api-2.17.1.jar
-rw-r--r-- 1 flink flink 1.8M Jan  4  2022 log4j-core-2.17.1.jar
-rw-r--r-- 1 flink flink  24K Jan  4  2022 log4j-slf4j-impl-2.17.1.jar
dcl04

1 Answer


The content of your lib folder doesn't show the Postgres JDBC driver. You need to make sure that the JDBC driver also ends up in the lib folder.

Martijn Visser
  • Hi, I don't have access to the `lib` folder (on the remote cluster). How can I make sure it gets there? Am I missing a step referred to in the [classloading docs](https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/ops/debugging/debugging_classloading/) ? – dcl04 Apr 20 '23 at 11:05