
I am trying to build a single Spark Scala job. As far as I know, this should be done using the sbt assembly command in the Spark directory; however, that builds all the jobs that exist there. Is there a way to build a single specific file at a time?

1 Answer

You'll need to split your project into multiple SBT subprojects (it's possible that you already do that) and then use an SBT multi-project build.
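For example, a layout along these lines (directory names are purely illustrative) puts each Spark job in its own subproject:

my-spark-jobs/
  build.sbt                    -- defines the root project and the subprojects
  common/src/main/scala/       -- shared code
  client/src/main/scala/       -- one Spark job
  server/src/main/scala/       -- another Spark job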

If you do so, you can navigate to any subproject and build it separately. Note that you can only build a separate JAR per subproject, not from the aggregating root project. If you want to build non-interactively, you can pass the same commands to SBT that you would type in the SBT shell:

sbt "project client" "package"

Replace the package command with whichever packaging command you use, such as assembly or one-jar.
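For instance, assuming the client subproject has the sbt-assembly plugin enabled, a fat JAR for just that subproject could be built either in batch mode or from the SBT shell:

sbt "project client" "assembly"

> project client
> assembly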

For an example project, check this. The project structure is as follows (SBT shell):

root > projects
[info]     client
[info]     common
[info]   * root
[info]     server

Projects are defined like this:

lazy val root = Project(id = "root",
                        base = file("."))
                        .aggregate(common, client, server)

lazy val client = Project(id = "client", base = file("client"))
                         .settings(clientSettings: _*)
                         .dependsOn(common)

lazy val server = Project(id = "server", base = file("server"))
                         .settings(serverSettings: _*)
                         .dependsOn(common)

lazy val common = Project(id = "common", base = file("common"))
                         .settings(commonSettings: _*)

As you can see, there is a root project that aggregates the three other projects. To build client or server, common has to be built first (dependsOn takes care of this). However, client can be built separately from server, and vice versa. To build everything, just build root.
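On newer versions of sbt, the same structure is usually written with the project macro; a rough equivalent sketch, where the *Settings values stand in for whatever settings each subproject needs:

lazy val common = (project in file("common"))
  .settings(commonSettings: _*)

lazy val client = (project in file("client"))
  .settings(clientSettings: _*)
  .dependsOn(common)

lazy val server = (project in file("server"))
  .settings(serverSettings: _*)
  .dependsOn(common)

lazy val root = (project in file("."))
  .aggregate(common, client, server)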
