
I noticed in the current Spark SQL manual that inserting into a dynamic partition is not supported:

Major Hive Features

Spark SQL does not currently support inserting to tables using dynamic partitioning.

However, is insert/overwriting into static partitions supported?

– JeffLL

2 Answers


Spark SQL does not support inserting into tables using dynamic partitioning as of Spark 1.1.

Static partitioning is supported; you need to write the data into the Hive table's partition location.
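A minimal sketch of a static-partition insert, assuming a hypothetical Hive table `logs` partitioned by `dt` and a source table `staging_logs` (both names are illustrative, not from the question):

```sql
-- Static partitioning: the partition value is spelled out in the statement,
-- so Spark SQL knows the target partition at plan time.
INSERT OVERWRITE TABLE logs PARTITION (dt = '2014-11-05')
SELECT user_id, action
FROM staging_logs
WHERE dt = '2014-11-05';
```

Because the partition value is a literal, this works even on versions that lack dynamic-partition support.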

– Gabe
  • So an existing Hive query using 'insert overwrite table tablename partition...' would work, right? Sorry, I don't yet have an environment set up to test. – JeffLL Nov 05 '14 at 19:16

According to the release notes, Spark 1.2.0 supports dynamically partitioned inserts. Refer to SPARK-3007.
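For comparison, a dynamic-partition insert leaves the partition column unvalued and takes its values from the query itself. A sketch under the same assumed `logs`/`staging_logs` table names, using the standard Hive settings that dynamic inserts typically require:

```sql
-- Dynamic partitioning (Spark 1.2+): the value of dt comes from each row
-- of the SELECT, so one statement can populate many partitions.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

INSERT OVERWRITE TABLE logs PARTITION (dt)
SELECT user_id, action, dt   -- the partition column must come last
FROM staging_logs;
```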

– dnlbrky