
I am new to Apache Spark and I am trying to build a simple Scala Spark application, but I am running into a problem: the build fails with the error "object apache is not a member of package org".

My application code is:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val logFile = "YOUR_SPARK_HOME/README.md"
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()  // load the file as an RDD with at least 2 partitions and cache it
    val numAs = logData.filter(line => line.contains("a")).count()  // lines containing "a"
    val numBs = logData.filter(line => line.contains("b")).count()  // lines containing "b"
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    sc.stop()
  }
}
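
Once the package builds, I was planning to run it with spark-submit along these lines (the jar path is my assumption, based on sbt's default artifact naming for the project name and version in my build.sbt below):

YOUR_SPARK_HOME/bin/spark-submit --class "SimpleApp" --master local target/scala-2.11/simple-project_2.11-1.0.jar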

My build.sbt file is:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"
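
As far as I understand, the %% operator tells sbt to append the Scala binary version to the artifact name, so the line above should be equivalent to this explicit form:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"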

I ran sbt package from the command line, and all the required jars were downloaded.
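
For reference, the error shows up at compile time, roughly like this (the source file name is my assumption, and the exact sbt output may differ by version):

$ sbt package
[error] .../SimpleApp.scala:1: object apache is not a member of package org
[error] import org.apache.spark.SparkContext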

Thanks for the help.

  • Possible duplicate of [this](http://stackoverflow.com/questions/33945368/object-spark-is-not-a-member-of-package-org) and [this](http://stackoverflow.com/questions/28269836/scalac-compile-yields-object-apache-is-not-a-member-of-package-org). Have you tried the steps described in those links? Maybe they could be helpful for you. – katronai Jan 12 '17 at 12:18
  • Possible duplicate of [scalac compile yields "object apache is not a member of package org"](http://stackoverflow.com/questions/28269836/scalac-compile-yields-object-apache-is-not-a-member-of-package-org) – eliasah Jan 13 '17 at 15:32

0 Answers