3

I'm an Upstart newbie (and a Spark newbie, for that matter).

I've been able to start a spark standalone server using:

./spark-1.5.2-bin-hadoop2.4/sbin/start-master.sh 

I want this to start automatically every time the computer is turned on, so I looked up Upstart and wrote this simple conf file:

 description "start a spark master with Upstart"
 author "Ezer"
 exec bash -c '/spark-1.5.2-bin-hadoop2.4/sbin/start-master start'

It does not work, and I get the feeling I'm missing something basic. Any help will be appreciated.

Ezer K

1 Answer

1

How about

export SPARK_HOME={YOUR_ACTUAL_SPARK_HOME_PATH}
exec bash $SPARK_HOME/sbin/start-all.sh

in your upstart conf file? However, note that the script spawns processes, so you cannot actually manage the service with Upstart.
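For reference, a fuller Upstart conf might look like the sketch below. The file path, runlevels, and install location are assumptions for illustration, not from the question; adjust them to your system. It uses `start-master.sh` (as in the question) rather than `start-all.sh`:

```
# /etc/init/spark-master.conf  -- hypothetical location
description "Spark standalone master"

# start at normal multi-user runlevels, stop on shutdown/reboot
start on runlevel [2345]
stop on runlevel [016]

script
    # assumed install path -- replace with your actual Spark home
    export SPARK_HOME=/opt/spark-1.5.2-bin-hadoop2.4
    exec $SPARK_HOME/sbin/start-master.sh
end script
```

The caveat above still applies: `start-master.sh` daemonizes the master via `spark-daemon.sh` and then exits, so Upstart tracks the wrong (short-lived) process. This conf will launch Spark at boot, but `stop`/`status` via Upstart won't manage the actual master process.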

shuaiyuancn