
I want to set the master, Spark deploy-mode, driver-class-path, and driver-java-options for a Spark job triggered through Apache Livy, without having to restart the Livy server whenever these settings change. How can I do this, given that Livy offers no direct options for it?

1 Answer


Livy doesn't accept master and deploy-mode as parameters in the REST call. These values are taken from livy.conf, which would look something like this:

livy.spark.master = yarn
livy.spark.deploy-mode = cluster

The above configuration sets the master to yarn and the deploy mode to cluster. `spark.driver.extraClassPath` and `spark.driver.extraJavaOptions`, however, can be set per job through Livy using the `conf` param of the REST request.

An example:

```json
"conf": { "spark.driver.extraClassPath": "<YOUR_EXTRA_CLASSPATH>",
          "spark.driver.extraJavaOptions": "-Dlog4j.configuration=/app/log4j.properties" }
```
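For reference, a full batch submission to Livy's `/batches` endpoint carrying that `conf` block could be sketched as below. The jar path and class name are placeholders, not from the question; the payload shape follows Livy's batch REST API.

```python
import json

# Sketch of a Livy /batches request body. "file" and "className" are
# hypothetical placeholders for your application jar and main class.
payload = {
    "file": "/path/to/your-app.jar",
    "className": "com.example.YourApp",
    "conf": {
        # Driver-side settings passed per job, without touching livy.conf:
        "spark.driver.extraClassPath": "/opt/extra-libs/*",
        "spark.driver.extraJavaOptions": "-Dlog4j.configuration=/app/log4j.properties",
    },
}

body = json.dumps(payload)
print(body)
# POST this JSON to http://<livy-host>:8998/batches with
# Content-Type: application/json (e.g. via curl or requests).
```

Note that the master and deploy mode are still not part of this payload; they remain governed by livy.conf on the server.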
Sivaprasanna Sethuraman
  • According to the documentation on this [link](https://spark.apache.org/docs/latest/configuration.html#runtime-environment) , if you set extraClassPath and extraJavaOptions, the JVM has already started and it doesn't make a difference anymore. – Sarthak Singhal May 09 '18 at 10:31
  • ^ I believe they have mentioned it to be for `client` mode – Sivaprasanna Sethuraman May 10 '18 at 04:48
  • 1
    Yes. I want to instantiate the Spark job in client mode through Livy. Need a workaround to set all the mentioned properties for the job. – Sarthak Singhal May 10 '18 at 10:07