
The exact message is:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.SparkSession$implicits$.newSequenceEncoder(Lscala/reflect/api/TypeTags$TypeTag;)Lorg/apache/spark/sql/Encoder;
        at myMainCode$.main(myMainCode.scala:55)
        at myMainCode.main(myMainCode.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:751)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
18/03/13 09:57:41 INFO SparkContext: Invoking stop() from shutdown hook

It was obtained from code like this:

val df = spark.read.text("<filepath>")
  .map(x => UserDefinedJsonParser.parse(x.getString(0)))
  .map(x => x.chained.map(y => y))
  .flatMap(x => x)
  .toDF()

UserDefinedJsonParser is a custom class based on play-json that decodes a JSON string (x.getString(0)) and returns a case class in which one of the fields, "chained", is a List[CustomCaseClass].
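
For context, the parser and the case classes have roughly this shape (the names and fields here are made up for illustration; only chained: List[CustomCaseClass] matters for the error):

import play.api.libs.json.{Json, Reads}

// Hypothetical shapes, only for illustration; the real classes are not shown here.
case class CustomCaseClass(id: String, value: Long)
case class ParsedRecord(name: String, chained: List[CustomCaseClass])

object UserDefinedJsonParser {
  implicit val customReads: Reads[CustomCaseClass] = Json.reads[CustomCaseClass]
  implicit val recordReads: Reads[ParsedRecord]    = Json.reads[ParsedRecord]

  // Decodes one JSON line into the outer case class.
  def parse(raw: String): ParsedRecord = Json.parse(raw).as[ParsedRecord]
}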

The pipeline yields a Dataset[List[CustomCaseClass]] which, after the flatMap, becomes a Dataset[CustomCaseClass], and after toDF() a regular DataFrame.
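
To make the failing step explicit, here is the same pipeline with the type of each stage spelled out (using the hypothetical shapes sketched above):

import org.apache.spark.sql.{DataFrame, Dataset, Row}
import spark.implicits._

val lines:  Dataset[Row]                   = spark.read.text("<filepath>")
val parsed: Dataset[ParsedRecord]          = lines.map(x => UserDefinedJsonParser.parse(x.getString(0)))
val lists:  Dataset[List[CustomCaseClass]] = parsed.map(x => x.chained.map(y => y)) // this step needs newSequenceEncoder
val flat:   Dataset[CustomCaseClass]       = lists.flatMap(x => x)
val df:     DataFrame                      = flat.toDF()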

With these versions:

libraryDependencies += "org.scala-lang" % "scala-library" % "2.11.12"
libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.12"
libraryDependencies += "com.typesafe.play" %% "play-json" % "2.4.11"
libraryDependencies += "org.cvogt" %% "play-json-extensions" % "0.6.1"

I'm getting the error above, which seems to be related to the implicit conversions of the Spark session.

The same code works like a charm in spark-shell.

EDIT:

Found a similar but not identical issue in SPARK-17890. It's close but not the same; however, the fix is the same. Working through an RDD did the trick:

val df = spark.read.text("<filepath>")
  .map(x => UserDefinedJsonParser.parse(x.getString(0)))
  .rdd
  .map(x => x.chained.map(y => y))
  .flatMap(x => x)
  .toDF()
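
For what it's worth, here is a sketch of an alternative that stays in the Dataset API (untested on my side): flattening in a single flatMap never materializes a Dataset[List[CustomCaseClass]], so the sequence encoder whose method is missing at runtime should never be requested:

// Untested sketch: one flatMap goes straight from each input line to the
// flattened CustomCaseClass rows, so only a product encoder is needed and
// newSequenceEncoder is never resolved.
import spark.implicits._

val df = spark.read.text("<filepath>")
  .flatMap(x => UserDefinedJsonParser.parse(x.getString(0)).chained)
  .toDF()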
  • `NoSuchMethodError` usually indicates incompatible jars. Please add the complete stack trace for more help – Jens Mar 13 '18 at 09:47
  • Hi Jens, I've updated the question with the complete stack trace. The submitted code is a fat jar generated with sbt-assembly that includes all dependencies. – Luis Mar 13 '18 at 10:01
