17

I have started testing Spark. I installed Spark on my local machine and am running a local cluster with a single worker. When I try to execute my job from my IDE, setting the SparkConf as follows:

final SparkConf conf = new SparkConf().setAppName("testSparkfromJava").setMaster("spark://XXXXXXXXXX:7077");
final JavaSparkContext sc = new JavaSparkContext(conf);
final JavaRDD<String> distFile = sc.textFile(Paths.get("").toAbsolutePath().toString() + "dataSpark/datastores.json");

I got this exception:

java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -5447855329526097695, local class serialVersionUID = -2221986757032131007
KompjoeFriek

5 Answers

6

It can be caused by multiple incompatibilities, such as:

  • Hadoop version;
  • Spark version;
  • Scala version;
  • ...

For me it was the Scala version: I was using 2.11.x in my IDE, but the official docs say:

Spark runs on Java 7+, Python 2.6+ and R 3.1+. For the Scala API, Spark 1.6.1 uses Scala 2.10. You will need to use a compatible Scala version (2.10.x).

and the x in that doc cannot be smaller than 3 if you are using the latest Java (1.8), which is what caused this for me. Hope it helps!
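
To see which Scala version your IDE classpath actually resolves, something like the sketch below should work (the class name ScalaVersionCheck is just illustrative; it only relies on the scala-library jar that spark-core already pulls in):

public class ScalaVersionCheck {
    public static void main(String[] args) {
        // Scala version reported by the scala-library on the classpath, e.g. "version 2.10.6"
        System.out.println(scala.util.Properties$.MODULE$.versionString());
        // Location of that scala-library jar (the file name usually contains the version too)
        System.out.println(scala.Predef.class.getProtectionDomain().getCodeSource().getLocation());
    }
}

If this prints a 2.11.x version while your installed Spark was built against Scala 2.10, that mismatch alone can produce incompatible-class errors like the one above.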

Kun
  • Can confirm! In my case I had to update my Hadoop version to 3.2.2 and Spark to 3.1.1, and everything works correctly! – Genarito Mar 03 '21 at 17:40
6

Got it all working with the following combination of versions:

Installed Spark 1.6.2

verify with bin/spark-submit --version

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.2</version>
</dependency>

and

Scala 2.10.6 and Java 8.

Note that it did NOT work, and I hit a similar class-incompatibility issue, with the versions below:

Scala 2.11.8 and Java 8

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.6.2</version>
</dependency>
user1733158
5

It looks like your installed Spark version is not the same as the Spark version used in your IDE.

If you are using Maven, just compare the version of the dependency declared in pom.xml with the output of bin/spark-submit --version and make sure they are the same.
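
If you prefer to check from code rather than from pom.xml, a sketch like this (the class name is illustrative) prints the Spark version your project classpath provides; it runs with a local master so it does not need the standalone cluster, and you can compare its output with bin/spark-submit --version:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkVersionCheck {
    public static void main(String[] args) {
        // local[*] runs an in-process master, so no connection to the standalone cluster is needed
        SparkConf conf = new SparkConf().setAppName("sparkVersionCheck").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // Version of the spark-core jar the IDE/Maven put on the classpath, e.g. "1.6.0"
        System.out.println("spark-core on classpath: " + sc.version());
        sc.stop();
    }
}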

zsxwing
  • Yes, I have already checked the version: 2.6.0 for both. The problem may lie in the Scala version; the prebuilt binary on the Spark download page is built with Scala 2.10. I am investigating ... – Nesrine Ben mustapha Feb 19 '16 at 11:38
  • 1
    `2.6.0` or `1.6.0`? So you are using Scala 2.11? Switch to 2.10 is the easiest way if you don't want to build Spark by yourself. – zsxwing Feb 19 '16 at 18:49
  • Sorry, yes, I meant 1.6.0. Yes, I switched to 2.10 and it is definitely better. Thanks. – Nesrine Ben mustapha Feb 23 '16 at 09:30
0

I faced this issue because my Spark jar dependency was 2.1.0 while the installed Spark engine version was 2.0.0. Because of that version mismatch, it throws this exception.

The root cause of this problem is a version mismatch between the Spark jar dependency in the project and the installed Spark engine on which the Spark job runs.

Hence verify both versions and make them identical.

For example: if the spark-core jar version is 2.1.0, the Spark computation engine version must be 2.1.0; if the spark-core jar version is 2.0.0, the engine version must be 2.0.0.

It works perfectly for me.
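
If you want to enforce that rule automatically, a small sketch like the one below can fail fast when the spark-core jar on the classpath differs from the installed engine (the class name and the "2.1.0" value are placeholders; put whatever bin/spark-submit --version reports on your machine):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkVersionGuard {
    public static void main(String[] args) {
        // Placeholder: the version reported by the installed Spark engine (bin/spark-submit --version)
        String engineVersion = "2.1.0";

        // local[*] so the check itself cannot fail because of a cluster-side mismatch
        SparkConf conf = new SparkConf().setAppName("sparkVersionGuard").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        String jarVersion = sc.version(); // version of spark-core on the project classpath
        sc.stop();

        if (!jarVersion.equals(engineVersion)) {
            throw new IllegalStateException("spark-core jar is " + jarVersion
                    + " but the installed Spark engine is " + engineVersion);
        }
        System.out.println("Versions match: " + jarVersion);
    }
}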

Rajeev Rathor
0

I had this problem.

When I run the code with spark-submit (instead of running it from the IDE), it works:

 ./bin/spark-submit --master spark://HOST:PORT target/APP-NAME.jar