
Hi, when I run this command:

>sbt

I see this result:

beyhan@beyhan:~/sparksample$ sbt
Starting sbt: invoke with -help for other options
[info] Set current project to Spark Sample (in build file:/home/beyhan/sparksample/)

Then I run:

>compile

and I get this error:

[error] {file:/home/beyhan/sparksample/}default-f390c8/*:update: sbt.ResolveException: unresolved dependency: org.apache.hadoop#hadoop-yarn-common;1.0.4: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-yarn-client;1.0.4: not found
[error] unresolved dependency: org.apache.hadoop#hadoop-yarn-api;1.0.4: not found
[error] download failed: org.eclipse.jetty.orbit#javax.transaction;1.1.1.v201105210645!javax.transaction.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.servlet;3.0.0.v201112011016!javax.servlet.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.mail.glassfish;1.4.1.v201005082020!javax.mail.glassfish.orbit
[error] download failed: org.eclipse.jetty.orbit#javax.activation;1.1.0.v201105071233!javax.activation.orbit
[error] Total time: 14 s, completed Oct 16, 2015 3:58:48 PM

My sparksample directory contains:

beyhan@beyhan:~/sparksample$ ll
total 20
drwxrwxr-x  4 beyhan beyhan 4096 Eki 16 16:02 ./
drwxr-xr-x 57 beyhan beyhan 4096 Eki 16 15:27 ../
drwxrwxr-x  2 beyhan beyhan 4096 Eki 16 16:02 project/
-rw-rw-r--  1 beyhan beyhan  142 Eki 15 18:57 simple.sbt
drwxrwxr-x  3 beyhan beyhan 4096 Eki 15 11:14 src/

The src directory contains:

src>main>scala>SimpleCode.scala

And my simple.sbt file looks like this:

name := "Spark Sample"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"

What should I do? I think this error is about YARN, because I don't have Hadoop YARN installed. Thanks.

Beyhan Gul
  • Your .sbt file should be called build.sbt. I think you are missing a dependency. Try to add libraryDependencies += "org.apache.hadoop" %% "hadoop-yarn-common" % "1.0.4" to this file. – flowit Oct 16 '15 at 13:14
  • why 1.0.4 ? and i think my sbt file is correct because i can connect and also i don't have hadoop-yarn – Beyhan Gul Oct 16 '15 at 13:20
  • Because that's the version from the error message. Obviously, you need Hadoop YARN. But this is what sbt resolves for you; you just have to list it as a dependency. Btw.: sbt has a default mode that works even when no build file is present. Just rename it; it's called build.sbt by convention. – flowit Oct 16 '15 at 13:35
  • Duplicate of: https://stackoverflow.com/questions/33143665/spark-sbt-compile-error-librarydependencies – tuxdna Oct 16 '15 at 19:20
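Regarding the `%%` suggestion in the comments above: in sbt, `%%` simply appends the Scala binary version from `scalaVersion` to the artifact name, so the two forms below request the same artifact when `scalaVersion` is 2.11.x (a build.sbt fragment, not a fix by itself):

```
// build.sbt: equivalent ways to declare the Spark dependency for Scala 2.11
libraryDependencies += "org.apache.spark" %  "spark-core_2.11" % "1.2.0"
libraryDependencies += "org.apache.spark" %% "spark-core"      % "1.2.0"
```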

1 Answer


This dependency on org.apache.hadoop#hadoop-yarn-client;1.0.4 doesn't seem to come from your build.sbt. Perhaps something is wrong with the cached files in ~/.ivy2 or ~/.m2, or some project/*.sbt file is introducing additional dependencies.
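One way to rule out a stale cache is to move the suspect Ivy entries aside so sbt is forced to re-resolve them. The paths below are the default cache locations; the quarantine directory name is just an illustration:

```shell
# See whether a stale hadoop-yarn resolution is cached (default Ivy location):
ls "$HOME/.ivy2/cache/org.apache.hadoop" 2>/dev/null || echo "nothing cached for org.apache.hadoop"

# Move the suspect entries aside (safer than deleting) so sbt re-resolves them:
mkdir -p "$HOME/ivy2-quarantine"
mv "$HOME/.ivy2/cache/org.apache.hadoop" "$HOME/ivy2-quarantine/" 2>/dev/null || true
```

If the next `sbt compile` then resolves cleanly, the cache was the culprit; you can delete the quarantine directory afterwards.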

It works fine for me though:

build.sbt

$ cat build.sbt 
name := "Spark Sample"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

All dependencies resolved:

$ sbt compile
Getting org.scala-sbt sbt 0.13.9 ...
:: retrieving :: org.scala-sbt#boot-app
    confs: [default]
    52 artifacts copied, 0 already retrieved (17785kB/791ms)
Getting Scala 2.10.5 (for sbt)...
:: retrieving :: org.scala-sbt#boot-scala
    confs: [default]
    5 artifacts copied, 0 already retrieved (24493kB/306ms)
[info] Set current project to Spark Sample (in build file:/home/tuxdna/tmp/p/)
[info] Updating {file:/home/tuxdna/tmp/p/}p...
[info] Resolving jline#jline;2.12.1 ...
[info] downloading https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11/1.2.0/spark-core_2.11-1.2.0.jar ...
[info] [SUCCESSFUL ] org.apache.spark#spark-core_2.11;1.2.0!spark-core_2.11.jar (31007ms)
[info] downloading https://repo1.maven.org/maven2/org/apache/spark/spark-network-common_2.11/1.2.0/spark-network-common_2.11-1.2.0.jar ...
[info] [SUCCESSFUL ] org.apache.spark#spark-network-common_2.11;1.2.0!spark-network-common_2.11.jar (1873ms)
[info] downloading https://repo1.maven.org/maven2/org/apache/spark/spark-network-shuffle_2.11/1.2.0/spark-network-shuffle_2.11-1.2.0.jar ...
[info] [SUCCESSFUL ] org.apache.spark#spark-network-shuffle_2.11;1.2.0!spark-network-shuffle_2.11.jar (2122ms)
[info] Done updating.
[success] Total time: 61 s, completed 17 Oct, 2015 12:48:49 AM

Note my installed Scala and SBT versions:

$ sbt sbt-version
[info] Set current project to Spark Sample (in build file:/home/tuxdna/tmp/p/)
[info] 0.13.9
$ scala -version
Scala code runner version 2.11.2 -- Copyright 2002-2013, LAMP/EPFL

Could you try these steps as a separate user (or maybe on a separate machine)?
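As a lighter alternative to a separate user account, you can point sbt at a throwaway Ivy home for a single run via the standard `sbt.ivy.home` JVM property (the `/tmp/fresh-ivy` path is just an example):

```shell
# Resolve into a fresh Ivy home; if compilation succeeds here,
# the problem is in the old ~/.ivy2 cache, not in build.sbt.
mkdir -p /tmp/fresh-ivy
command -v sbt >/dev/null && sbt -Dsbt.ivy.home=/tmp/fresh-ivy compile || echo "sbt not installed"
```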

tuxdna