I am running a Spark job implemented in Java using spark-submit. I would like to pass parameters to this job - e.g. a time-start and a time-end parameter to parametrize the Spark application.
What I tried was using the --conf key=value option of the spark-submit script, but when I try to read the parameter in my Spark job with sparkContext.getConf().get("key") I get an exception:

Exception in thread "main" java.util.NoSuchElementException: key

Furthermore, when I use sparkContext.getConf().toDebugString() I don't see my value in the output.
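For reference, this is a minimal sketch of what I am doing; the JAR name, main class, and the key are placeholders, not my actual code:

```java
// Submitted roughly like this (paths and names are placeholders):
//   spark-submit --conf key=value --class com.example.MyJob my-job.jar

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class MyJob {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("MyJob");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // This is where the NoSuchElementException is thrown -
        // the key passed via --conf does not show up in the SparkConf:
        String value = sc.getConf().get("key");

        sc.close();
    }
}
```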
Further notice: since I want to submit my Spark job via the Spark REST service, I cannot use an OS environment variable or the like.
Is there any way to achieve this?