I deployed a Hadoop cluster with bdutil on Google Compute Engine.
My configuration:
- OS: Ubuntu 14
- Spark: 1.5
- Hive: 0.12
- 1 master node and 2 workers
I copied hive-site.xml from Hive's configuration directory to $SPARK_HOME/conf/hive-site.xml (on the master node only).
When I try to use HiveContext in the PySpark shell, I get this error message:
...
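For completeness, this is roughly what I ran in the shell (`sc` is the SparkContext that the pyspark shell creates at startup; the query is just an example):

```python
# In the pyspark shell, `sc` already exists as the SparkContext
from pyspark.sql import HiveContext

# Creating the HiveContext is where the error is raised
sqlContext = HiveContext(sc)
sqlContext.sql("SHOW TABLES").show()
```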
Does anyone know what is wrong?
Thank you in advance.