
I'm having trouble installing Apache Spark on Ubuntu 13.04. I'm using spark-0.8.1-incubating, and both ./sbt/sbt update and ./sbt/sbt compile work fine. However, when I run ./sbt/sbt assembly, I get the following error:

[info] Set current project to default-289e76 (in build file:/node-insights/server/lib/spark-0.8.1-incubating/sbt/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error]         ^

I googled for stuff related to this but couldn't find anything useful. Any guidance would be much appreciated.

deepblue

2 Answers


The "Set current project to default-289e76" message suggests that sbt was run from outside the Spark source directory:

$  /tmp  ./spark-0.8.1-incubating/sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Set current project to default-d0f036 (in build file:/private/tmp/)
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Not a valid configuration: assembly
[error] Not a valid key: assembly
[error] assembly
[error]         ^

Running ./sbt/sbt assembly works fine from the spark-0.8.1-incubating directory (note the log output showing that the current project was set correctly):

$  spark-0.8.1-incubating  sbt/sbt assembly
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins/project
[info] Loading global plugins from /Users/joshrosen/.dotfiles/.sbt/plugins
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project/project
[info] Loading project definition from /private/tmp/spark-0.8.1-incubating/project
[info] Set current project to root (in build file:/private/tmp/spark-0.8.1-incubating/)
...
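
In your case, the build path in the log (file:/node-insights/server/lib/spark-0.8.1-incubating/sbt/) suggests that sbt was launched from inside the sbt/ subdirectory itself. A minimal sketch of the fix, assuming that layout:

cd /node-insights/server/lib/spark-0.8.1-incubating   # the Spark source root, taken from the log above
./sbt/sbt assembly                                    # run the bundled launcher from the root
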
Josh Rosen

You typed "abt" twice, but shouldn't that be "sbt"? Apache Spark has its own copy of sbt, so make sure you're running Spark's version to pick up the "assembly" plugin among other customizations.

To run Spark's own copy of sbt, go to the Spark directory and run ./sbt/sbt assembly.
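
For example (the directory name is an assumption about where the archive was unpacked):

cd spark-0.8.1-incubating   # the extracted Spark source root
./sbt/sbt assembly          # Spark's bundled sbt knows the assembly task; a system-wide sbt may not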

swartzrock
  • My bad on the misspelling :) I typed it correctly on the console, though. But yes, I'm using the Spark version of sbt; the "./sbt/sbt assembly" command is the one that's failing. – deepblue Jan 20 '14 at 23:54
  • Ah, I see above that you were running from inside the sbt directory. It seems impossible that "./sbt/sbt assembly" could even have started the sbt shell from there. – swartzrock Jan 21 '14 at 00:28
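
To see why (paths taken from the log in the question; the behavior sketch assumes the 0.8.1 layout): inside the sbt/ subdirectory there is no nested sbt/sbt script, so ./sbt/sbt cannot resolve from there, while running the launcher directly starts sbt with an auto-generated default project:

cd /node-insights/server/lib/spark-0.8.1-incubating/sbt
./sbt assembly    # sbt starts, but with a default project in sbt/, hence "Not a valid command: assembly"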