
Is there a specific Spark Job Server version that matches Spark 1.6.0? According to the version information at https://github.com/spark-jobserver/spark-jobserver, SJS appears to be available only for Spark 1.6.1, not 1.6.0. Our Cloudera-hosted Spark is running 1.6.0.

I deployed SJS with the Spark home configured to 1.6.1. When I submit jobs, job IDs are generated, but I can't see the job results. Any inputs?

Shakeel

1 Answer


No, there is no SJS version tied to Spark 1.6.0. But it should be easy for you to compile against 1.6.0. Maybe you could modify this https://github.com/spark-jobserver/spark-jobserver/blob/master/project/Versions.scala#L10 and try.
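To illustrate, the one-line change being suggested might look roughly like the sketch below. This is only a sketch: the real Versions.scala defines many more dependency versions, and the exact variable name and layout are assumptions here, not the actual file contents. The idea is simply to pin the Spark dependency to 1.6.0 and then rebuild with server_package.sh or server_deploy.sh.

    // project/Versions.scala -- illustrative sketch, not the actual file contents
    object Versions {
      // Pin the Spark dependency to the version running on the cluster (1.6.0)
      // instead of the 1.6.1 default referenced at the linked line.
      lazy val spark = "1.6.0"
    }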

noorul
  • Thanks Noorul. The SJS bin folder has server_deploy.sh and server_package.sh. If I'm compiling on the machine where Spark is hosted (not planning to compile on Server A and then ssh to Server B), is it OK to use server_package.sh? It appears that this script (server_package.sh) also compiles SJS like server_deploy.sh does, except for transferring the files to a remote server. – Shakeel Jul 11 '16 at 12:58
  • I think it should be fine to run server_deploy.sh from the same machine as the deploy server. – noorul Jul 11 '16 at 17:01