
I used Spark 1.6.2 and Scala 2.11.8 to compile my project. The generated uber jar with dependencies is placed inside Spark Job Server, which seems to use Scala 2.10.4 (SCALA_VERSION=2.10.4 is specified in the .sh file).

There is no problem starting the server or uploading the context/app jars. But at runtime, the following error occurs:

java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror
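
One way to confirm the mismatch (a minimal sketch; the object name VersionCheck is purely illustrative, not from my project) is to print the Scala library version the server actually loads at runtime:

    // Prints the Scala library version present on the runtime classpath.
    // scala.util.Properties.versionString is part of the standard library.
    object VersionCheck {
      def main(args: Array[String]): Unit = {
        // If this prints "version 2.10.4" while the jar was compiled with
        // 2.11.8, binary-incompatible reflection calls such as
        // scala.reflect.api.JavaUniverse.runtimeMirror will fail.
        println(scala.util.Properties.versionString)
      }
    }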

The question "Why do Scala 2.11 and Spark with scallop lead to 'java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror'?" suggests compiling the sources with Scala 2.10. Is that correct?

Any suggestions please...

user1384205

1 Answer


Use Scala 2.10.4 to compile your project. Otherwise you would need to compile Spark with 2.11 as well.
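
For example, a minimal build definition along these lines (a sketch, assuming an sbt build; Spark Job Server runs on Scala 2.10, so the project and every Spark artifact must share that binary version):

    // build.sbt -- pin the project to the Scala version the server runs on.
    scalaVersion := "2.10.4"

    // %% appends the Scala binary suffix, so this resolves spark-core_2.10.
    // "provided" keeps Spark out of the uber jar, since the server supplies it.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
      "org.apache.spark" %% "spark-sql"  % "1.6.2" % "provided"
    )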

noorul
  • That indeed works, thanks. Can you please explain why? Most folks seem to agree on using 2.11.* – user1384205 Oct 03 '16 at 16:48
  • You have to compile Spark with Scala 2.11 too. – noorul Oct 03 '16 at 16:59
  • OK. I use a Maven project to download the Spark dependencies and Eclipse to compile my project along with the Maven dependencies; at that point the SJS jar is also added to the Maven repo. Would changing the Scala compiler version from 2.10 to 2.11 suffice? That's precisely what I was doing. Am I missing something? – user1384205 Oct 04 '16 at 02:35
  • No, I meant the Spark binary. In your deployment environment there will be a Spark binary; it should be compiled with Scala 2.11. – noorul Oct 04 '16 at 04:02
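
To illustrate the last comment: Spark artifacts encode the Scala binary version in their names, so staying on 2.11 means every piece of the deployment, including the Spark binaries and Spark Job Server itself, must be a 2.11 build (a sketch with the artifact suffix written out explicitly; versions as in the question):

    // Alternative route: keep Scala 2.11, but then the deployed Spark binaries
    // (and Spark Job Server itself) must also be built against Scala 2.11.
    scalaVersion := "2.11.8"

    libraryDependencies ++= Seq(
      // Explicit _2.11 suffix instead of %% to make the binary version visible.
      "org.apache.spark" % "spark-core_2.11" % "1.6.2" % "provided"
    )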