
I'm new to Spark and downloaded the pre-compiled Spark binaries from Apache (Spark-2.1.0-bin-hadoop2.7).

When submitting my Scala (2.11.8) uber jar, the cluster throws an error:

java.lang.IllegalStateException: Library directory '/root/spark/assembly/target/scala-2.10/jars' does not exist; make sure Spark is built

I'm not running Scala 2.10, and as far as I know Spark isn't compiled with Scala 2.10.

Could it be that one of my dependencies is based on Scala 2.10?
Any suggestions as to what could be wrong?
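
For context on how the Scala version ends up in artifact names, here is a minimal build.sbt sketch (hypothetical project name; Spark pinned to 2.1.0 to match the downloaded binaries): sbt's `%%` operator appends the Scala binary suffix, so `scalaVersion := "2.11.8"` resolves `_2.11` artifacts rather than `_2.10` ones.

```scala
// build.sbt -- a hedged sketch, not the actual build file from this question
name := "spark-uber-app"          // hypothetical project name

scalaVersion := "2.11.8"          // %% below expands to the _2.11 artifact suffix

libraryDependencies ++= Seq(
  // "provided" keeps Spark itself out of the uber jar; the cluster supplies it
  "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.1.0" % "provided"
)
```

A dependency declared with a single `%` and a hard-coded `_2.10` suffix would pull in Scala 2.10 artifacts regardless of `scalaVersion`, so that is worth checking in the uber jar's dependency list.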

  • Can you share the list of dependencies that you are packaging in your uber jar? – himanshuIIITian Jul 25 '17 at 04:43
  • @himanshuIIITian `"org.scalatest" %% "scalatest" % "3.0.1", "org.scalaj" %% "scalaj-http" % "2.3.0", "org.apache.spark" %% "spark-core" % "2.2.0" % "provided", "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided", "org.apache.spark" %% "spark-yarn" % "2.2.0", "org.apache.hadoop" % "hadoop-client" % "2.8.1", "org.apache.hadoop" % "hadoop-yarn-client" % "2.8.1", "org.apache.hive" % "hive-jdbc" % "2.3.0"` – Y. Eliash Jul 25 '17 at 06:39

2 Answers


Not sure what is wrong with the pre-built spark-2.1.0, but I've just downloaded Spark 2.2.0 and it is working great.

– Y. Eliash

Try setting `SPARK_HOME` to the location of your Spark installation, either as a system environment variable or in your IDE's run configuration.
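
As a quick sanity check (a hedged sketch; the object name `CheckSparkHome` is made up), you can print the `SPARK_HOME` value the JVM actually sees, which helps when the variable is set in a shell profile but not picked up by the IDE:

```scala
// Hypothetical helper: verify that SPARK_HOME is visible to the running JVM.
object CheckSparkHome {
  def main(args: Array[String]): Unit = {
    sys.env.get("SPARK_HOME") match {
      case Some(path) => println(s"SPARK_HOME = $path")
      case None       => println("SPARK_HOME is not set in this environment")
    }
  }
}
```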

– wbmrcb