I'm facing this exception with Spark SQL over Hive tables. It happens specifically when my query contains both a WITH clause and an INSERT clause; the query works if I remove the WITH clause, or if I replace the INSERT with a DataFrame write:
sql("WITH... SELECT...").write.parquet("/test/")
I'm using Spark 1.6.0 on CDH 5.7 and Spark 1.6.1 on HDInsight.
Any ideas?
The same exception has been reported here, but for different reasons. There is also a ticket related to this exception here.