I am trying to add a library dependency through an sbt plugin. The dependency should be added to each sub-project according to its binary Scala version, so I iterate over the sub-projects:

  private def inject(): State => State = { state =>
    val extracted: Extracted = Project.extract(state)

    val enrichedLibDepSettings = extracted.structure.allProjectRefs map { projRef =>

      val projectScalaVersion = (scalaBinaryVersion in projRef)

      libraryDependencies in projRef += 
        compilerPluginOrg % (compilerPluginArtifact + "_" + projectScalaVersion.value) % compilerPluginVersion % "provided"
    }

    val newState = extracted.append(enrichedLibDepSettings, state)

    val updateAfterLibAppend = extracted.structure.allProjectRefs map { projRef =>
      println("running update: " + EvaluateTask(extracted.structure, update, newState, projRef))
    }

    state
  }

However, this is not working: the printed output shows no trace of a library dependency being appended through `libraryDependencies in projRef +=`, nor is any error issued, so subsequent steps fail over the missing dependency. What might be wrong with this technique?

You might ask why this is needed in the first place: why add a library dependency through an sbt plugin like that?

Although sbt provides `addCompilerPlugin`, it cannot be used for compiler plugins that take arguments: as far as experimentation shows, scalac only accepts compiler plugin arguments when `-Xplugin` is given the path to the plugin jar. Hence the compiler plugin has to be injected via `-Xplugin` after it has been resolved as a library dependency (its file path can then be extracted by inspecting the result of `update`). That is why a library dependency needs to be added from an sbt plugin. It also needs to be done per sub-project, because a multi-project build may house sub-projects of varying Scala versions, and each one must be injected with a binary compatible compiler plugin.
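For concreteness, here is a minimal sketch (sbt 0.13 syntax) of the intended end result, assuming the `compilerPluginOrg`, `compilerPluginArtifact` and `compilerPluginVersion` values referenced above, and with made-up plugin arguments: once the plugin jar has been resolved as a library dependency, its path is looked up in the update report and handed to scalac via `-Xplugin`.

  // Sketch only: locate the resolved compiler plugin jar in the update report
  // and pass it to scalac via -Xplugin, together with hypothetical plugin arguments.
  scalacOptions ++= {
    val report = update.value
    val pluginJars = report.matching(
      moduleFilter(
        organization = compilerPluginOrg,
        name = compilerPluginArtifact + "_" + scalaBinaryVersion.value))
    pluginJars.headOption.toSeq flatMap { jar =>
      Seq("-Xplugin:" + jar.getAbsolutePath, "-P:somePlugin:someArg=value") // plugin args are made up
    }
  }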

By the way, and this might illuminate something I am in the dark over: when the library dependency is added in a `projectSettings` override for the root project, as below, the dependency does seem to resolve. But that is useless here, since it applies the same binary version to all sub-projects, which defeats the task at hand (some sub-projects will naturally crash over binary incompatibility). I also think it would override the root's settings, whereas the goal is to append a setting, not to override existing ones.

object Plugin extends AutoPlugin {
  override lazy val projectSettings = Seq(
    ...
  )
}

A pair of clues:

  1. Appending `scalacOptions` per sub-project, using the same technique, simply works (see the sketch after this list).

  2. Applying `+=` to `libraryDependencies` as above does not even affect the output of `inspect libraryDependencies`, unlike when the same idiom is used inside an `override lazy val projectSettings` block of an AutoPlugin.
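For reference, a minimal sketch of that working `scalacOptions` variant, using the same state-transforming structure as the code above (the concrete option and the returning of the appended state are assumptions of the sketch):

  private def injectScalacOptions(): State => State = { state =>
    val extracted: Extracted = Project.extract(state)

    // append a scalac option to every sub-project, scoped per project ref
    val settings = extracted.structure.allProjectRefs map { projRef =>
      scalacOptions in projRef += "-feature" // placeholder option
    }

    // return the state with the appended settings
    extracted.append(settings, state)
  }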

1 Answer

I think you might be confused about what `projectSettings` is. If you extend `AutoPlugin`, you can define the default settings that are applied (on top of the defaults) to each project; see https://github.com/sbt/sbt/blob/v0.13.9/main/src/main/scala/sbt/Plugins.scala#L81

This means you could simply add your artefact here using the typical Setting / Task notation, e.g.

def projectSettings = Seq(
  libraryDependencies += {
    val bin = scalaBinaryVersion.value
    ...
  }
)

Note that this is `+=`, not `:=`. Does that help?
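Filled in with the value names from the question, such a block might look roughly like this (a sketch only, resolving the artefact against each project's own binary Scala version):

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    libraryDependencies += {
      // each project sees its own scalaBinaryVersion here
      val bin = scalaBinaryVersion.value
      compilerPluginOrg % (compilerPluginArtifact + "_" + bin) % compilerPluginVersion % "provided"
    }
  )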

– fommil
  • Indeed I had wrongly assumed what `projectSettings` does, and pointing this out helps. Unfortunately, I still would like each sub-project to get its own appended settings, so that a build with sub-projects of varying Scala versions gets the corresponding binary version of the plugin _per sub-project_. To that end, I have observed that something like `libraryDependencies += compilerPluginOrg % (compilerPluginArtifact + "_" + scalaBinaryVersion.value) % compilerPluginVersion` plugs the same version into all projects, counter to that goal. Hence the more involved scheme. – matanster Dec 14 '15 at 04:18
  • Then that means your `scalaBinaryVersion` is being set after the plugin loads. Try putting your config into a val that is listed after your custom settings. I frankly don't know how a mixed-version Scala project would ever work. The standard approach is to cross-build, either using sbt's over-engineered solution or via sbt-extras' provision of `++SCALA_VERSION`. – fommil Dec 14 '15 at 20:14
  • It's not really only about cross-building scenarios per se: for example, looking at parboiled2's build, the Scala version is specified as 2.11 in all sub-projects, whereas the overall project has no version specified (hence 2.10). Other than excusing such a build as pathological, which I probably would not, I believe this requires bringing in the right version of the compiler plugin per sub-project. – matanster Dec 15 '15 at 11:31
  • I might not be following in what sense something like `scalaBinaryVersion` would be set after an sbt plugin loads vs. before; enumerating all possible meanings, I have probably misinterpreted altogether. – matanster Dec 15 '15 at 11:34