
I'm new to Scala and Spark and have started writing a simple Apache Spark program in Scala IDE (in Eclipse). I added the dependency jar files to my project as I usually do in my Java projects, but it can't recognize them and gives me the following error message: object apache is not a member of package org. How should I add the dependency jar files?

The jar files I'm adding are the ones that exist under the 'lib' directory where Spark is installed.

HHH
  • In an Eclipse Scala project, click on the project icon, select Properties at the bottom, select Java Build Path, select Add External JARs, select the JARs you want, click Open, then OK. The JARs will appear in your project's Referenced Libraries folder (created if necessary) and will be on your project's CLASSPATH. –  Jul 10 '15 at 20:00
  • that's actually what I did and got this error! – HHH Jul 10 '15 at 21:16
  • Spark has a problem with conflicting dependencies. I use Maven projects for it in Eclipse with some success, but overall it's too heavy for Eclipse on my laptop, so I decided to use sbt with Eclipse integration: I can use Eclipse's editors and then do sbt builds. –  Jul 10 '15 at 21:30
  • @TrisNefzger as I'm new to Spark, sbt and Scala, could you elaborate more on that? What are the exact steps? – HHH Jul 10 '15 at 23:58
  • That is too extensive a question for a complete reply. sbt is Scala's default (simple) build tool. It's written in Scala and is available at http://www.scala-sbt.org/. It can be integrated with Eclipse using a plugin that enables an sbt project to be imported into Eclipse. That plugin is available at https://github.com/typesafehub/sbteclipse. The demo Spark projects used in the Spark Summit 2014 training use sbt. Resources for that training are at https://spark-summit.org/2014/training. Eclipse is great for Java and OK for simple Scala projects, but not for complex ones. –  Jul 11 '15 at 03:28
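The sbt-with-Eclipse workflow mentioned above can be sketched roughly as follows. The plugin version below is only an example; check the sbteclipse README for the release matching your sbt version:

```scala
// project/plugins.sbt — register the sbteclipse plugin
// (version is illustrative; pick one compatible with your sbt release)
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
```

With the plugin in place, running `sbt eclipse` from the project root generates the `.project` and `.classpath` files Eclipse needs. Re-run it whenever the dependencies in `build.sbt` change, then refresh the project in Eclipse.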

1 Answer


For Scala, use sbt as your dependency manager and build tool.

More information on how to set it up here:

http://www.scala-sbt.org/release/tutorial/Setup.html

Your build file (build.sbt) will look something like this:

name := "Test"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.3.0"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"
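A note on the `%%` operator used above: it appends the project's Scala binary version to the artifact name, so the dependency must be published for that Scala version. Given `scalaVersion := "2.10.4"`, the two lines below resolve to the same artifact:

```scala
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"
// equivalent, with the Scala binary version written out explicitly:
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.0"
```

Using `%%` keeps the build file correct if you later change the Scala version, as long as the library publishes an artifact for it.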
ApolloFortyNine
  • Can I do that in Eclipse? – HHH Jul 10 '15 at 19:57
  • 1
    https://github.com/typesafehub/sbteclipse - You must practice your Google-fu. "Eclipse SBT", first link! – Seer Jul 10 '15 at 20:04
  • sbteclipse is a bit manual (you have to run a command every time you change your dependencies). I would recommend using Maven instead; Eclipse's Maven integration is more complete, and it works beautifully with Scala. – lmm Jul 13 '15 at 09:31
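If you follow the Maven route suggested in the last comment, the equivalent declarations in `pom.xml` would look something like the sketch below (coordinates mirror the sbt example in the answer; note that with plain Maven the Scala binary version must be written into the artifact id by hand):

```xml
<dependencies>
  <!-- Spark core; the artifact id carries the Scala binary version -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.3.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.3.0</version>
  </dependency>
</dependencies>
```

With Eclipse's m2e integration, dependency changes are picked up via Maven > Update Project, which is the convenience lmm's comment refers to.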