
How can I build Spark with the current (Hive 2.1) bindings instead of 1.2? http://spark.apache.org/docs/latest/building-spark.html#building-with-hive-and-jdbc-support does not mention how this works. Does Spark work well with Hive 2.x?

Georg Heiler

1 Answer


I had the same question, and this is what I've found so far. You can try to build Spark with the newer version of Hive:

mvn -Dhive.group=org.apache.hive -Dhive.version=2.1.0 clean package

This runs for a long time and fails in the unit tests. If you skip the tests, you get a bit farther but then run into compilation errors. In short, Spark does not work well with Hive 2.x!
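For reference, the skip-tests variant I tried looked roughly like the following. The -Phive and -Phive-thriftserver profiles come from the linked build docs; whether they are needed alongside the hive.version override depends on your Spark version, so treat this as a sketch rather than a known-good command:

mvn -Phive -Phive-thriftserver -Dhive.group=org.apache.hive -Dhive.version=2.1.0 -DskipTests clean package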

I also searched through the ASF JIRA for Spark and Hive and haven't found any mention of an upgrade. This is the closest ticket I was able to find: https://issues.apache.org/jira/browse/SPARK-15691

Dave Handy