
I'm facing this exception with Spark SQL over Hive tables. It happens specifically when my query has both WITH and INSERT clauses, i.e., it works if I remove the WITH clause or if I replace the INSERT with a direct write:

sql("WITH... SELECT...").write.parquet("/test/")

I'm using Spark 1.6.0 on CDH 5.7 and Spark 1.6.1 on HDInsight.

Any ideas?

The same exception has been reported here, but for other reasons. A ticket related to this exception also exists here.

Fernando Lemos

1 Answer


Spark 1.6.0 does not support common table expressions (CTEs) when you invoke them through sqlContext (which here is actually a HiveContext). This is why you are getting this error. Please rewrite your CTE as a regular query.
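For illustration, a minimal sketch of such a rewrite, inlining the CTE as a subquery in the FROM clause (the table and column names here are hypothetical):

sql("""
  INSERT INTO TABLE target_table
  SELECT id, value
  FROM (SELECT id, value FROM source_table) tmp
""")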

vigilant