
I'm trying to submit a Spark job.

It starts this way:

import javax.xml.parsers.{SAXParser, SAXParserFactory}

import org.apache.spark
import org.apache.spark.graphx.{Graph, Edge, VertexId}
import org.apache.spark.rdd.{PairRDDFunctions, RDD}
import org.apache.spark.storage.StorageLevel
import org.apache.spark.{SparkContext, SparkConf}
import scala.util.Try
import org.apache.log4j.{Level, Logger}


object MyApp {

  def main(args: Array[String]) {

    val sparkConf = new SparkConf().setAppName("MyApp")
    val sc = new SparkContext(sparkConf)

And when I launch it I get the following error:

App > Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.VolatileObjectRef.zero()Lscala/runtime/VolatileObjectRef;
App > at MyApp$.main(MyApp.scala)
App > at MyApp.main(MyApp.scala)
App > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
App > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
App > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
App > at java.lang.reflect.Method.invoke(Method.java:606)
App > at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
App > at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
App > at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

What am I doing wrong?

EDIT: Included full stack trace. Using Scala 2.10 and Spark 1.2.0. What's weird is that my jar contains two classes. When I spark-submit one of them, it works (it's a 4-line dummy job), but when I run the longer one (about 40 lines), it fails with the error above.

  • I don't know for this specific case, but for other similar questions, it's been a Spark version conflict. Did you update everything and recompile all your code? – The Archetypal Paul Jan 25 '15 at 19:17
  • Can you edit your question to include the full stacktrace? Which Spark version are you using and how are you submitting your job (are you using `spark-submit`?)? – Josh Rosen Jan 25 '15 at 19:32
  • Hi, I edited my question. I am using Spark 1.2.0 and Scala 2.10, and it's the same on the server side. I tried cleaning and recompiling, no luck. – Stephane Maarek Jan 25 '15 at 19:59
  • Actually, you must be right; I think it's 1.1.0 on the server side. I'll double-check that and confirm. Thanks for pointing this out! It's weird that one part of the code works and the other doesn't. – Stephane Maarek Jan 25 '15 at 20:06
  • Did you find the offending library? – maasg Jan 26 '15 at 16:04
  • So you were right (that's why I accepted the answer). In my IntelliJ build.sbt it was 2.10, but somehow it wasn't taken into account, as there were some leftover 2.11 library files. So I cleaned the project, removed the dependencies, re-imported them and rebuilt, and everything worked afterwards. – Stephane Maarek Jan 26 '15 at 17:15

2 Answers


zero() on scala.runtime.VolatileObjectRef was introduced in Scala 2.11. You probably have a library that was compiled against Scala 2.11 but is running on a Scala 2.10 runtime.
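If the project is built with sbt (the comments mention an IntelliJ build.sbt), one way to avoid this kind of mismatch is to pin scalaVersion to 2.10 and declare the Spark dependencies with %%, so every artifact resolves to the matching _2.10 binary. A minimal sketch of such a build.sbt for Spark 1.2.0; the exact patch versions are illustrative assumptions:

// build.sbt (minimal sketch; versions are illustrative)
name := "MyApp"

scalaVersion := "2.10.4"  // must match the Scala version your Spark distribution was built for

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix (_2.10), keeping all artifacts consistent
  "org.apache.spark" %% "spark-core"   % "1.2.0" % "provided",
  "org.apache.spark" %% "spark-graphx" % "1.2.0" % "provided"
)

With scalaVersion and the dependency suffixes in agreement, spark-submit against a Scala 2.10 / Spark 1.2.0 cluster should no longer hit the missing zero() method.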


maasg
  • Is that not the other way around? Compiled on 2.10 and running against 2.11, so it can't find the zero method? (unsure) – Stephane Maarek Jan 26 '15 at 07:16
  • No, the `zero()` method did not exist in 2.10, so anything compiled against 2.10 will not know about it, and it wouldn't cause a problem on 2.11. – maasg Jan 26 '15 at 07:56

Check the artifacts and versions of your Maven dependencies. I got the same error when I used

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>2.0.1</version>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.1</version>
</dependency>

I had to change spark-core_2.10 to spark-core_2.11 and the exception disappeared.
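For reference, the consistent pair, with both artifacts on the same _2.11 binary suffix (versions as in the snippet above):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.1</version>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.1</version>
</dependency>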

Michael Lihs