
I am running spark-jobserver 0.5.3 from Ooyala. I followed the official documentation, and it works fine when started from sbt with the reStart command. However:

  1. I can't make it work using the server_start.sh script.

  2. I am unable to run it on a standalone cluster. It currently runs on the default local[*] master, and there is no clear documentation on how to run the job server on a standalone cluster.

Any solution, or a link to a blog post or proper documentation, would be appreciated.

Thanks in advance.

  • Is this still a problem for you? I have worked on a chef cookbook that should set it up with upstart scripts. – meson10 Jan 21 '16 at 02:36

1 Answer


Documentation for the main spark-jobserver project is here: github.com/spark-jobserver

  • Copy config/local.sh.template to <environment>.sh and edit as
    appropriate. NOTE: be sure to set SPARK_VERSION if you need to
    compile against a different version, e.g. 1.4.1 for job server 0.5.2.
  • Copy config/shiro.ini.template to shiro.ini and edit as appropriate. NOTE: only required when authentication = on.
  • Copy config/local.conf.template to <environment>.conf and edit as appropriate.
  • Run bin/server_deploy.sh <environment> -- this packages the job server along with the config files and pushes it to the remotes you have configured in <environment>.sh.
  • On the remote server, start it in the deployed directory with server_start.sh and stop it with server_stop.sh
  • The server_start.sh script uses spark-submit under the hood and may be passed any of the standard extra arguments from spark-submit.
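Taken together, the steps above might look like the following. This is a sketch, not from the project docs: the environment name `myenv`, the host `deploy-host`, and the standalone master URL are placeholders you would replace with your own values.

```shell
# On your build machine, inside the spark-jobserver checkout:
cp config/local.sh.template config/myenv.sh      # deploy settings (SPARK_VERSION, target hosts, ...)
cp config/local.conf.template config/myenv.conf  # runtime settings (master URL, context defaults, ...)

# Edit config/myenv.conf so the job server targets your standalone
# cluster instead of the default local[*] master, e.g.:
#   spark.master = "spark://your-master-host:7077"

# Package the job server plus config files and push them to the
# remotes configured in config/myenv.sh:
bin/server_deploy.sh myenv

# Then, on the remote host, from the deployed directory:
ssh deploy-host
./server_start.sh   # launch the job server
./server_stop.sh    # shut it down
```

The key point for the standalone-cluster part of the question is the `spark.master` setting in the `.conf` file; with it set to a `spark://` URL, the job server submits contexts to that cluster rather than running locally.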

NOTE: by default the assembly jar from job-server-extras, which includes support for SQLContext and HiveContext, is used. If you face issues with all the extra dependencies, consider modifying the install scripts to invoke sbt job-server/assembly instead, which doesn't include the extra dependencies.
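A sketch of that alternative build, together with passing extra spark-submit arguments through server_start.sh as described above (memory sizes and option values are illustrative, not recommendations):

```shell
# Build the lighter assembly without the job-server-extras
# dependencies (run from the spark-jobserver checkout):
sbt job-server/assembly

# server_start.sh wraps spark-submit, so standard spark-submit
# options can be appended when starting the deployed server:
./server_start.sh --driver-memory 2g --conf spark.executor.memory=4g
```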

Gillespie
  • Thanks @Gillespie. I moved local.conf and local.sh into bin/ because the script looks for every file in bin/. It was passing a jar from bin/ to spark-submit in the server_start.sh script, so I edited the script to use the jar from job-server/ instead. When I ran server_start.sh, it then complained about InstrumentedActor, which lives in another project, so I edited the script further and passed the akka-app jar via spark-submit's --jars option. Now server_start.sh throws java.lang.NoClassDefFoundError: spray/routing/HttpService. Please help me understand what I am missing. – Himanshu Mehra Sep 25 '15 at 12:18