The default
spark-shell --conf spark.hadoop.metastore.catalog.default=hive
val df: DataFrame = ...
df.write.saveAsTable("db.table")
fails, as it tries to write an internal / managed / transactional table (see How to write a table to hive from spark without using the warehouse connector in HDP 3.1).
How can I tell Spark not to create a managed table, but rather an external one?
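For reference, the direction I am considering is a plain Spark SQL write with an explicit path, on the assumption that supplying a "path" option makes saveAsTable register the table as external/unmanaged rather than managed. This is only a sketch; the database, table name, and location below are placeholders:

import org.apache.spark.sql.{DataFrame, SaveMode, SparkSession}

val spark = SparkSession.builder()
  .appName("external-table-sketch")
  .enableHiveSupport()
  .getOrCreate()

// Placeholder DataFrame standing in for the real data.
val df: DataFrame = spark.range(10).toDF("id")

// Assumption: passing an explicit "path" makes Spark create the table
// as external (unmanaged) at that location instead of a managed table
// in the Hive warehouse.
df.write
  .mode(SaveMode.Overwrite)
  .option("path", "/some/external/location/db.db/table")  // hypothetical path
  .saveAsTable("db.table")

I am unsure whether this is enough on HDP 3.1, or whether the Hive 3 ACID defaults still force the table to be created as managed/transactional.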