I am building an SBT multi-project build that has a common module and a logic module, with logic.dependsOn(common).

In common, Spark SQL 2.2.1 ("org.apache.spark" %% "spark-sql" % "2.2.1") is introduced as a dependency. Spark SQL is also used in logic, but there I get compilation errors saying "object spark is not a member of package org.apache".
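The errors show up on the Spark imports in logic's sources, for example (the exact import line here is just illustrative):

import org.apache.spark.sql.SparkSession  // error: object spark is not a member of package org.apache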
Now, if I add the Spark SQL dependency to logic as "org.apache.spark" %% "spark-sql" % "2.2.1", it works. However, if I add it as "org.apache.spark" %% "spark-sql" % "2.2.1" % Provided, I get the same error.
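Concretely, these are the two variants I tried in logic's build file; only the scope differs:

// this variant compiles
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1"

// this variant fails with: object spark is not a member of package org.apache
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.1" % Provided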
I don't get why this happens: why isn't the dependency transitive from common to logic?
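For debugging, logic's resolved compile classpath can be inspected from the SBT shell with the standard inspection task (nothing custom in my build):

show logic/compile:dependencyClasspath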
Here is the root SBT build file:
lazy val commonSettings = Seq(
  organization := "...",
  version := "0.1.0",
  scalaVersion := "2.11.12",
  resolvers ++= Seq(
    clojars,
    maven_local,
    novus,
    twitter,
    spark_packages,
    artima
  ),
  test in assembly := {},
  assemblyMergeStrategy in assembly := {...}
)
lazy val root = (project in file(".")).aggregate(common, logic)
lazy val common = (project in file("common")).settings(commonSettings:_*)
lazy val logic = (project in file("logic")).dependsOn(common).settings(commonSettings:_*)
And here is the logic module's SBT build file:
libraryDependencies ++= Seq(
  spark_sql.exclude("io.netty", "netty"),
  embedded_elasticsearch % "test",
  scalatest % "test"
)
dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-core" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5",
  "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.6.5",
  "com.fasterxml.jackson.core" % "jackson-annotations" % "2.6.5",
  "org.json4s" %% "json4s-jackson" % "3.2.11"
)
assemblyJarName in assembly := "***.jar"
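For completeness: spark_sql, embedded_elasticsearch, scalatest and the resolvers above are vals defined elsewhere in the build (not shown here). A hypothetical sketch of how spark_sql is defined, just so the listings above are self-contained:

// project/Dependencies.scala -- hypothetical reconstruction, the real file is not shown
import sbt._

object Dependencies {
  val spark_sql = "org.apache.spark" %% "spark-sql" % "2.2.1"
}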