
I got this error when importing the Scalding sbt project reference in my build.sbt (ref: How to declare dependency on Scalding in sbt project?). Kindly help me out.

lazy val scaldingCore = ProjectRef(uri("https://github.com/twitter/scalding.git"), "scalding-core")
lazy val myProject = (project in file(".")).dependsOn(scaldingCore)

Error: Error while importing SBT project:
...
[warn] ==== public: tried
[warn]   https://repo1.maven.org/maven2/com/twitter/scalding-core_2.10/0.16.0-SNAPSHOT/scalding-core_2.10-0.16.0-SNAPSHOT.pom
[info] Resolving org.scala-lang#scala-compiler;2.10.4 ...
[info] Resolving org.scala-lang#scala-reflect;2.10.4 ...
[info] Resolving org.scala-lang#jline;2.10.4 ...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::          UNRESOLVED DEPENDENCIES         ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: com.twitter#scalding-core_2.10;0.16.0-SNAPSHOT: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn]   com.twitter:scalding-core_2.10:0.16.0-SNAPSHOT
[warn]     +- myproject:myproject_2.10:0.1-SNAPSHOT
[trace] Stack trace suppressed: run 'last myProject/*:update' for the full output.
[trace] Stack trace suppressed: run 'last myProject/*:ssExtractDependencies' for the full output.
[error] (myProject/*:update) sbt.ResolveException: unresolved dependency: com.twitter#scalding-core_2.10;0.16.0-SNAPSHOT: not found
[error] (myProject/*:ssExtractDependencies) sbt.ResolveException: unresolved dependency: com.twitter#scalding-core_2.10;0.16.0-SNAPSHOT: not found

Taiwotman

1 Answer


Scalding publishes jars on Sonatype Maven Central, so you really shouldn't need to mess with the Git ProjectRef. You just need to get your sbt resolvers correct so that it can find the jars. Start with this in your build.sbt:

resolvers ++= Seq(
  Resolver.sonatypeRepo("releases"),
  "Concurrent Maven Repo" at "http://conjars.org/repo"
)

Cascading publishes to the Conjars repository and not Central, so you most likely need that additional resolver as shown.

Try those first. If you're still getting unresolved errors, you may need to add other repos that Scalding uses, depending on which artifacts you depend on. You probably do not need the entire scalding uber-artifact; you can likely trim down to scalding-core, scalding-commons, scalding-repl, and maybe others, depending on the needs of your project.

So to be clear, instead of that ProjectRef and dependsOn scaldingCore, add the above resolvers and something like this:

libraryDependencies ++= {
  val scaldingVersion = "0.16.0-RC6"

  Seq(
    "com.twitter" %% "scalding-core"    % scaldingVersion
  , "com.twitter" %% "scalding-commons" % scaldingVersion
  , "com.twitter" %% "scalding-repl"    % scaldingVersion
  )
}

And so on.
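
Put together, a minimal build.sbt sketch might look like the following. The project name, scalaVersion, and Scalding version here are assumptions; adjust them for your own project:

name := "myproject"

scalaVersion := "2.10.6"

resolvers ++= Seq(
  Resolver.sonatypeRepo("releases"),
  "Concurrent Maven Repo" at "http://conjars.org/repo"
)

libraryDependencies ++= {
  // Scalding version is an assumption; pick the release you actually need
  val scaldingVersion = "0.16.0-RC6"
  Seq(
    "com.twitter" %% "scalding-core"    % scaldingVersion,
    "com.twitter" %% "scalding-commons" % scaldingVersion,
    "com.twitter" %% "scalding-repl"    % scaldingVersion
  )
}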

ches
  • Error:scalac: uncaught exception during compilation: scala.reflect.internal.Types$TypeError Error:scalac: Error: bad symbolic reference. A signature in SchemedSource.class refers to term mapred in package org.apache.hadoop which is not available. It may be completely missing from the current classpath, or the version on the classpath might be incompatible with the version used when compiling SchemedSource.class. … but then, I got the above while testing with the code that follows: – Taiwotman Apr 19 '16 at 02:30
  • import com.twitter.scalding._ import com.twitter.scalding.ReplImplicits._ import com.twitter.scalding.ReplImplicitContext._ import cascading.pipe.Pipe class product { val data = Tsv("/resources/products.tsv") data.read } – Taiwotman Apr 19 '16 at 02:31
  • Ah, so `scalding-core` [has a dependency on hadoop-client](https://github.com/twitter/scalding/blob/509ec04e6e878f22794891dbb79005174fb40ff3/build.sbt#L324) with "provided" scope, which means it is needed at compile time but is expected to be present in the runtime environment (i.e. when running in Hadoop jobs). You probably need to add a similar dependency declaration in your own project, e.g. `"org.apache.hadoop" % "hadoop-client" % "2.5.0" % "provided"` (see the sketch after these comments). – ches Apr 19 '16 at 17:47
  • Gah, they really need to update the Scalding docs for this. This should be a very common workflow and nearly everything on the web and their wiki seems to be outdated and incomplete… You're going to want to build a "fat jar" for deploying your project—[this](https://github.com/Cascading/scalding-tutorial) is a good start but it's also outdated. – ches Apr 19 '16 at 18:01
  • This is a relatively current example project using SBT, you might be able to adapt from its build configuration: https://github.com/deanwampler/scalding-workshop – ches Apr 19 '16 at 18:10
  • I couldn't agree more that the sbt setup needs an update, given the multi-dependency conflicts. I have posted an answer that follows; maybe someone can see whether the conflict can be resolved. It is actually taken from the Scalding GitHub, and it is giving me some dependency errors. – Taiwotman Apr 20 '16 at 01:15
  • Let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/109608/discussion-between-taiwo-o-adetiloye-and-ches). – Taiwotman Apr 20 '16 at 01:20
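
Following up on the hadoop-client comment above, here is a sketch of how that "provided" dependency could sit alongside the Scalding artifacts. The Hadoop version is an assumption; match it to whatever your cluster runs:

libraryDependencies ++= {
  val scaldingVersion = "0.16.0-RC6"
  Seq(
    "com.twitter"       %% "scalding-core" % scaldingVersion,
    // Hadoop version is an assumption; use the version deployed on your cluster.
    // "provided" keeps it on the compile classpath but out of the assembled jar.
    "org.apache.hadoop" %  "hadoop-client" % "2.5.0" % "provided"
  )
}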