
I need to exclude Spark and test dependencies from my final assembly jar. I tried to use the "provided" scope, but it was not working:

libraryDependencies ++= Seq("org.apache.spark" % "spark-core_2.11" % "2.0.1" % "provided")

and then ran sbt assembly.

Please help me resolve this issue.
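For reference, a minimal build.sbt along the lines described in the question might look like the sketch below. The `%%` operator appends the Scala binary-version suffix automatically; the scalatest dependency and the exact Scala version are assumptions added for illustration, not taken from the question.

```scala
// build.sbt — minimal sketch; the test dependency and Scala version are assumptions
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // "provided" keeps spark-core off the assembly jar's classpath
  "org.apache.spark" %% "spark-core" % "2.0.1" % "provided",
  // hypothetical test-only dependency, excluded from assembly via the "test" scope
  "org.scalatest"    %% "scalatest"  % "3.0.1" % "test"
)
```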

lospejos
John

2 Answers


Use the assemblyExcludedJars option of the assembly plugin, filtering either by exact file name or with contains:

assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp filter { f =>
    f.data.getName.contains("spark-core") ||
    f.data.getName == "spark-core_2.11-2.0.1.jar"
  }
}
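The exclusion predicate itself can be checked in plain Scala, outside sbt. The jar names below are made-up examples, not taken from a real classpath:

```scala
// Stand-alone check of the exclusion predicate used above.
// Jar names are illustrative examples only.
val jars = Seq(
  "spark-core_2.11-2.0.1.jar",
  "spark-sql_2.11-2.0.1.jar",
  "my-app_2.11-0.1.jar"
)

// Same condition as in the assemblyExcludedJars filter
val excluded = jars.filter { name =>
  name.contains("spark-core") || name == "spark-core_2.11-2.0.1.jar"
}

println(excluded)  // List(spark-core_2.11-2.0.1.jar)
```

Only the spark-core jar matches; the application jar and other Spark modules (here, spark-sql) pass through untouched, so widen the contains pattern if you need to drop all Spark jars.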
FaigB

I don't think || works. Instead use:

assemblyExcludedJars in assembly := {
  var cp = (fullClasspath in assembly).value
  cp = cp filter { f => f.data.getName.contains("spark-core") }
  cp = cp filter { f => f.data.getName.contains("spark-core_2.11-2.0.1.jar") }
  cp
}
pushkin