
Here's my build.sbt file:

```scala
name := "CMDW-Security"

version := "19.11.25"

scalaVersion := "2.11.12"

assemblyJarName in assembly := "CMDW-Security.jar"
test in assembly := {}
fullClasspath in assembly := (fullClasspath in Compile).value
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}

libraryDependencies ++= Seq(
  "org.scalaj" %% "scalaj-http" % "2.4.2",
  "org.scalatest" %% "scalatest" % "3.0.5" % "test",
  "com.typesafe.play" %% "play" % "2.7.3",
  "org.passay" % "passay" % "1.5.0"
)
```

I built my JAR using `sbt clean compile assembly`.

While running my JAR using `spark-submit` (I have to run it that way even though it has no Spark dependency), I see the following exception thrown:

```
Exception in thread "main" java.lang.NoSuchMethodError: play.api.libs.json.JsLookup$.apply$extension1(Lplay/api/libs/json/JsLookupResult;Ljava/lang/String;)Lplay/api/libs/json/JsValue;
    at com.ldap.restapi.Authorization.getAuthToken(Authorization.scala:45)
    at com.ldap.executable.Test$.main(Test.scala:21)
    at com.ldap.executable.Test.main(Test.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:904)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
```

When I investigated the JAR by unzipping it with 7-Zip, I found that `play.api.libs.json.JsLookup$.apply$extension1` is indeed not present there. How do I make sure that all the class files I need end up in the JAR when I run `sbt assembly`?
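One plausible cause (an assumption, not confirmed by the question): another dependency pulls in an older `play-json` transitively, and with `case x => MergeStrategy.first` the assembly silently keeps whichever copy of each class file it meets first, so the 2.7.x classes the code compiled against may never make it into the fat JAR. A minimal sketch of pinning the version explicitly in `build.sbt`:

```scala
// Sketch, assuming the missing method comes from play-json 2.7.x.
// Declaring play-json directly (it is otherwise only a transitive
// dependency of "play") and overriding its version forces sbt to
// resolve the 2.7.3 artifact rather than an evicted older one.
libraryDependencies += "com.typesafe.play" %% "play-json" % "2.7.3"
dependencyOverrides += "com.typesafe.play" %% "play-json" % "2.7.3"
```

After rebuilding with `sbt clean assembly`, unzipping the JAR again would show whether the `play/api/libs/json` classes now match the expected version.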

  • Not an answer, but you might have a conflict between sub-dependencies. You'd have to `shade` your conflicting packages when using assembly. – baitmbarek Dec 16 '19 at 15:13
  • Have you tried running it just with `java -jar app.jar`? Maybe it is a problem with your `spark-submit`. Also, when one creates a **JAR** for being run with `spark` one usually has to exclude the Scala standard library, since **Spark** already includes one. – Luis Miguel Mejía Suárez Dec 16 '19 at 15:17
  • @LuisMiguelMejíaSuárez that did work fine, just `spark-submit` is not working. But that class is indeed missing in the JAR.... – Sparker0i Dec 16 '19 at 15:18
  • Try adding `play-json` separately: https://mvnrepository.com/artifact/com.typesafe.play/play-json_2.12/2.7.3 – I might be wrong, but this lib may not be part of the `play` distribution. Alternatively, I encourage you to try https://github.com/jrudolph/sbt-dependency-graph, a plugin to debug your dependency graph; maybe `play-json` was evicted by another transitive dependency with an older version. – Ivan Kurchenko Dec 23 '19 at 07:43
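The shading suggested in the first comment can be expressed with sbt-assembly's `ShadeRule` API. This is only a sketch: it assumes the sbt-assembly 0.14.x `in assembly` syntax already used in the question's `build.sbt`, and `shaded.playjson` is a hypothetical package name chosen for illustration:

```scala
// build.sbt — sketch, assumptions: sbt-assembly 0.14.x;
// "shaded.playjson" is a hypothetical rename target.
assemblyShadeRules in assembly := Seq(
  // Rename all play-json classes (and rewrite every reference to them)
  // inside the fat JAR, so the bundled copy cannot clash with another
  // play-json version present on the spark-submit classpath.
  ShadeRule.rename("play.api.libs.json.**" -> "shaded.playjson.@1").inAll
)
```

To check whether `play-json` is being evicted, the sbt-dependency-graph plugin mentioned in the last comment adds a `dependencyTree` task (`sbt dependencyTree`) that prints the resolved dependency tree, including evicted versions.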

0 Answers