
I have the following problem; my main method is:

public static void main(String[] args) {
    SparkConf conf = new SparkConf().setAppName("TestHive");
    SparkContext sc = new org.apache.spark.SparkContext(conf);
    HiveContext hiveContext = new org.apache.spark.sql.hive.HiveContext(sc);
}

I build it with `mvn package` and then submit my code, but I get the following exception. I have no idea what's wrong:

sh spark-submit --class "TestHive" --master local[4] ~/target/test-1.0-SNAPSHOT-jar-with-dependencies.jar 

Exception in thread "main" java.lang.NoSuchMethodException: org.apache.hadoop.hive.conf.HiveConf.getTimeVar(org.apache.hadoop.hive.conf.HiveConf$ConfVars, java.util.concurrent.TimeUnit)

Please tell me where I am wrong.

PS I built my spark with hive and thriftServer.

Spark 1.5.2 built for Hadoop 2.4.0
Build flags: -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
  • when you are doing `mvn package` do you have a "fat jar"? which means a jar that contains all the dependencies?? – user1314742 Mar 13 '16 at 15:36
  • I couldn't reproduce the error using exactly the same code as yours. I think it might be the version of spark-hive. Would you please list the maven dependency `spark-hive_2.10`? Which version are you using? – user1314742 Mar 19 '16 at 11:07
  • Yes, you are right, it was about the hive version. –  Mar 19 '16 at 12:13

1 Answer


It seems to be a version conflict between the Spark components (spark-core, spark-sql, and spark-hive).

To avoid this conflict, all versions of those components should be the same. You can enforce that in your pom.xml by defining a property called `spark.version` and referencing it in every Spark dependency, for example (use the version matching the Spark build you submit to, 1.5.2 in your case):

<properties>
    <spark.version>1.6.0</spark.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
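Since the jar passed to `spark-submit` is a `jar-with-dependencies`, the fat jar mentioned in the comments also has to be built consistently from these aligned dependencies. A minimal sketch with the maven-assembly-plugin could look like this (the `<mainClass>` value is taken from the question's `--class "TestHive"`; the rest of the configuration is a standard assumption, not from the original answer):

```xml
<build>
    <plugins>
        <plugin>
            <artifactId>maven-assembly-plugin</artifactId>
            <configuration>
                <!-- produces target/test-1.0-SNAPSHOT-jar-with-dependencies.jar -->
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
                <archive>
                    <manifest>
                        <mainClass>TestHive</mainClass>
                    </manifest>
                </archive>
            </configuration>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```

With this in place, `mvn package` bundles all the (same-version) Spark and Hive classes into one jar, so the cluster does not mix classes from different Spark releases.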
user1314742