
Any reason why I get this error? Initially, the Scala version in the IDE plugin was 2.12.3, but since I'm working with Spark 2.2.0, I manually changed it to Scala 2.11.11.

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/09/19 12:08:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
    at scala.xml.Null$.<init>(Null.scala:23)
    at scala.xml.Null$.<clinit>(Null.scala)
    at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:39)
    at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:38)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:67)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:84)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:221)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:163)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
    at sparkEnvironment$.<init>(Ticket.scala:33)
    at sparkEnvironment$.<clinit>(Ticket.scala)
    at Ticket$.main(Ticket.scala:39)
    at Ticket.main(Ticket.scala)
TheShark

3 Answers


Make sure your Spark version is compatible with the corresponding Scala version.

This error is common when using the Scala 2.12 series with a version of Spark that was built for Scala 2.11.

You can try using the 2.11 series of Scala with Spark, i.e.

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"

As you can see, in this dependency spark-core_2.11 is tied to Scala version 2.11.

That's why it's safer (more compatible) to use %% and avoid hardcoding the Scala version in Spark dependencies. Let the build tool resolve the required Scala version for you automatically, as follows:

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"

The above declaration will automatically pick the artifact that matches your project's Scala version.
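
Putting it together, a minimal build.sbt sketch for this setup (the project name is illustrative) keeps the Scala version and the Spark artifact in sync:

// Minimal build.sbt sketch, assuming sbt with Spark 2.2.0 on the Scala 2.11 series
name := "spark-demo"

scalaVersion := "2.11.11"

// %% appends the Scala binary suffix (_2.11 here), so the Spark artifact
// always matches the scalaVersion declared above.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"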

Jacek Laskowski
Akash Sethi
  • Thanks for the tip! Yes, I did make the changes and it initially didn't work out either. It was later identified that the Scala 2.12 libraries were conflicting in priority with the Scala 2.11 ones, so I had to start a new project with only Scala 2.11, having removed all the Scala 2.12 stuff. – TheShark Sep 19 '17 at 14:43
  • Thanks. I changed to 2.11.7 to fix this issue, but why can't 2.12 work with Spark? Thanks – liam xu May 21 '19 at 13:58
  • Because Spark 2.2.0 was built using Scala 2.11.x, plus some internal goof-ups. – Akash Sethi May 21 '19 at 14:01
  • I would like to point out that not only must the Spark version used for compilation be compatible, but also the version of the execution binary. E.g. if you compile with Spark 3.0 and Scala 2.12, the spark-submit binary must also be version 3.0. – RaphaëlR Dec 03 '20 at 19:49
  • It works for me. I was trying to run Spark 3 with Scala 2.12, however my environment was built using Spark 2, so I changed the Spark version and executed again. – Sarang Jul 21 '22 at 10:54

Scala version 2.12 gave me a similar error; however, after removing extends App and adding a main method, everything worked fine.

object SparkDemo extends App {}

Just remove extends App and add a main function:

object SparkDemo {
  def main(args: Array[String]): Unit = {}
}

The Spark documentation itself suggests using a main function rather than extends App.

https://spark.apache.org/docs/2.4.0/quick-start.html#:~:text=Note%20that%20applications%20should%20define%20a%20main()%20method%20instead%20of%20extending%20scala.App.%20Subclasses%20of%20scala.App%20may%20not%20work%20correctly.
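
As a minimal sketch of that structure (the app name and local[*] master are placeholders, not from the answer), a main-method-based entry point for Spark 2.x could look like this:

import org.apache.spark.{SparkConf, SparkContext}

object SparkDemo {
  def main(args: Array[String]): Unit = {
    // App name and master are illustrative placeholders.
    val conf = new SparkConf().setAppName("SparkDemo").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Trivial job just to confirm the entry point runs.
    println(sc.parallelize(1 to 10).sum())

    sc.stop()
  }
}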

This problem could also be caused by a dependency: if a dependency was internally compiled with Scala 2.11, you will need to downgrade your Scala version accordingly.


It's a Spark dependency issue: Scala 2.12 is compatible with Spark 3, and Scala 2.11 is compatible with Spark 2.
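
For example (the version numbers below are illustrative, not taken from the answer), a Spark 3 project would pair the Scala 2.12 series with the _2.12 artifacts in build.sbt:

scalaVersion := "2.12.15"

// %% resolves to spark-core_2.12, matching the scalaVersion above.
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.0"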

Sarang