I am working with PySpark and want to insert a column of array-of-strings values into a database through its JDBC driver, but I am getting the following error:
IllegalArgumentException: Can't get JDBC type for array<string>
The error occurs because the column is produced by a UDF with return type ArrayType(StringType()). When I try to override the column type with:
.option("createTableColumnTypes", "col1 ARRAY, col2 ARRAY, col3 ARRAY, col4 ARRAY")
I get:
DataType array is not supported.(line 1, pos 18)
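For context, here is a minimal reproduction of what I am doing. The UDF body, table name, and connection URL are simplified placeholders standing in for my real code, not exact values:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import ArrayType, StringType

spark = SparkSession.builder.getOrCreate()

# UDF returning an array of strings (placeholder logic standing in for mine)
split_tags = udf(lambda s: s.split(","), ArrayType(StringType()))

df = spark.createDataFrame([("a,b,c",)], ["raw"]).select(split_tags("raw").alias("col1"))

# This write fails with: IllegalArgumentException: Can't get JDBC type for array<string>
(df.write
   .format("jdbc")
   .option("url", "jdbc:crate://localhost:5432/")          # placeholder connection URL
   .option("driver", "io.crate.client.jdbc.CrateDriver")   # CrateDB JDBC driver class
   .option("dbtable", "my_table")                          # placeholder table name
   .mode("overwrite")
   .save())
```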
This makes me wonder whether the problem is in Spark 3.1.2 itself, which may simply have no JDBC type mapping for arrays (so that I would have to convert the array into a string myself), or whether it comes from the driver I am using.
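If converting to a string is indeed the way out, this is the kind of workaround I have in mind (just a sketch; df and col1 refer to the reproduction above, and the resulting string would still need to be parsed back on the database side):

```python
from pyspark.sql.functions import concat_ws, to_json

# Option 1: serialize the array as a JSON string, e.g. ["a","b","c"]
df_json = df.withColumn("col1", to_json("col1"))

# Option 2: join the elements into a delimited string, e.g. a,b,c
df_flat = df.withColumn("col1", concat_ws(",", "col1"))
```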
For reference, I am using CrateDB as the database; its JDBC driver is documented here: crate.io/docs/jdbc/en/latest