Is it possible to inject a parameter into the Spark JDBC insert statement?
I'm using
spark.sql("select * from my_table").write.mode(SaveMode.Append).jdbc
to save a bulk DataFrame to my DB.
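For reference, the full call looks roughly like this (a minimal sketch; the URL, table name, and connection properties below are placeholders, not my actual values):

    import java.util.Properties
    import org.apache.spark.sql.SaveMode

    // Placeholder connection details -- not the real configuration.
    val url = "jdbc:postgresql://db-host:5432/mydb"
    val connectionProperties = new Properties()
    connectionProperties.setProperty("user", "me")
    connectionProperties.setProperty("password", "secret")

    // Read the source table and append it to the target table over JDBC.
    spark.sql("select * from my_table")
      .write
      .mode(SaveMode.Append)
      .jdbc(url, "target_table", connectionProperties)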
In JdbcUtils, the insertStatement is created.
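As far as I can tell from the source, the statement built there is a plain parameterized INSERT, roughly of this shape (my paraphrase, not the exact JdbcUtils code):

    // Rough approximation of what insertStatement produces for a
    // three-column DataFrame written to "target_table" -- I don't see a
    // hook for appending extra clauses such as "abort on error".
    val insertSql = "INSERT INTO target_table VALUES (?, ?, ?)"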
Is it possible (without creating my own JDBC connections and statements) to add a parameter like "abort on error" to the statement?