import com.typesafe.config.{Config, ConfigFactory}
import org.apache.spark.{SparkConf, SparkContext}
import spark.jobserver.NamedObjects
import spark.jobserver.api.JobEnvironment

def main(args: Array[String]): Unit = {
  val conf = new SparkConf().setMaster("local[4]").setAppName("LongPiJob")
  val sc = new SparkContext(conf)
  // Stub environment so the job can be launched outside the job server
  val env = new JobEnvironment {
    def jobId: String = "abcdef"
    //scalastyle:off
    def namedObjects: NamedObjects = ???
    def contextConfig: Config = ConfigFactory.empty
  }
  val results = runJob(sc, env, 5)
  println("Result is " + results)
}

I took this code from the LongPiJob example in the spark-jobserver GitHub repo, relating to the new API. I don't understand what new JobEnvironment is, or any of the variables inside it. My IDE complains about these default settings.

https://github.com/spark-jobserver/spark-jobserver/blob/spark-2.0-preview/job-server-tests/src/main/scala/spark/jobserver/LongPiJob.scala

ozzieisaacs

1 Answer


JobEnvironment carries runtime information about the job, such as jobId, contextConfig, and namedObjects.

This makes it easy to access that information from within runJob.
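
For illustration, here is a minimal sketch of a job written against the new spark.jobserver.api that reads those fields inside runJob. It is not the code from the linked example; the object name EnvAwareJob, the "duration" config key, and the choice of Int/String for JobData/JobOutput are assumptions:

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import org.scalactic._
import spark.jobserver.api.{JobEnvironment, SparkJob, ValidationProblem}

import scala.util.Try

object EnvAwareJob extends SparkJob {
  type JobData = Int       // input passed to runJob after validate succeeds
  type JobOutput = String  // value returned to the job server

  def runJob(sc: SparkContext, runtime: JobEnvironment, data: JobData): JobOutput = {
    // The job server supplies the JobEnvironment; the job only reads it.
    val id   = runtime.jobId          // id the server assigned to this job run
    val conf = runtime.contextConfig  // configuration of the hosting context
    s"Job $id ran for $data seconds (context config has ${conf.entrySet.size} entries)"
  }

  def validate(sc: SparkContext, runtime: JobEnvironment,
               config: Config): JobData Or Every[ValidationProblem] = {
    // Fall back to a default duration when the key is absent (assumed key name).
    Good(Try(config.getInt("duration")).getOrElse(5))
  }
}

When the job is submitted through the job server, the server constructs the real JobEnvironment (with the actual jobId, contextConfig, and namedObjects) and passes it to runJob. The anonymous new JobEnvironment { ... } block in your main method only stubs those values so the job can be run standalone.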

noorul