I'm trying to map the Map[String, String] output of my Scala UDF (scala.collection.immutable.Map) to a valid Table API data type, namely via the Java type java.util.Map, as recommended here: Flink Table API & SQL and map types (Scala). However, I get the error below.
Any idea of the right way to proceed? If yes, is there a way to generalize the conversion to a (nested) Scala object of type Map[String, Any]?
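For reference, the kind of generalized conversion I have in mind is a recursive helper along these lines (an untested sketch; toJavaDeep is just a name I made up, not an existing API):

import scala.collection.JavaConverters._

// Sketch: recursively convert nested Scala collections into their
// java.util counterparts, leaving leaf values untouched.
def toJavaDeep(value: Any): Any = value match {
  case m: Map[_, _] => m.map { case (k, v) => (k, toJavaDeep(v)) }.asJava
  case s: Seq[_]    => s.map(toJavaDeep).asJava
  case other        => other
}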
Code
Scala UDF
import org.apache.flink.table.functions.ScalarFunction
import scala.collection.JavaConverters._

class dummyMap() extends ScalarFunction {
  def eval(): java.util.Map[String, String] = {
    val whatevermap = Map("key1" -> "val1", "key2" -> "val2")
    // Convert to java.util.Map; a plain asInstanceOf cast would fail at runtime,
    // since scala.collection.immutable.Map does not implement java.util.Map.
    whatevermap.asJava
  }
}
Sink
my_sink_ddl = f"""
    create table mySink (
        output_of_dummyMap_udf MAP<STRING,STRING>
    ) with (
        ...
    )
"""
Error
Py4JJavaError: An error occurred while calling o430.execute.
: org.apache.flink.table.api.ValidationException: Field types of query result and registered TableSink `default_catalog`.`default_database`.`mySink` do not match.
Query result schema: [output_of_my_scala_udf: GenericType<java.util.Map>]
TableSink schema: [output_of_my_scala_udf: Map<String, String>]
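If I read the error correctly, the planner extracts my UDF's result as GenericType<java.util.Map> rather than MAP<STRING, STRING>. Would overriding getResultType (legacy type system) be the right way to declare the concrete map type? A minimal, untested sketch of what I mean:

import org.apache.flink.api.common.typeinfo.{TypeInformation, Types}
import org.apache.flink.table.functions.ScalarFunction
import scala.collection.JavaConverters._

class dummyMap() extends ScalarFunction {
  def eval(): java.util.Map[String, String] =
    Map("key1" -> "val1", "key2" -> "val2").asJava

  // Declare the result as MAP<STRING, STRING> so it is not treated as a GenericType.
  override def getResultType(signature: Array[Class[_]]): TypeInformation[_] =
    Types.MAP(Types.STRING, Types.STRING)
}

Or, if this requires the newer type inference instead, would a @DataTypeHint("MAP<STRING, STRING>") annotation on eval be the preferred route?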
Thanks!