I want to run the Apache Spark history server in a Docker image. To achieve this I had to change spark-defaults.conf and add this line:
spark.history.fs.logDirectory /path/to/remote/logs
And then run start-history-server.sh
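For context, the relevant part of my image currently looks roughly like this (a Dockerfile-style sketch; the base image, tag, and paths are just placeholders for my actual layout):

# hypothetical base image; adjust to the actual Spark image/layout in use
FROM apache/spark:3.5.1
# bake the static log directory into the config
COPY spark-defaults.conf /opt/spark/conf/spark-defaults.conf
# keep the history server in the foreground so the container doesn't exit
ENV SPARK_NO_DAEMONIZE=true
CMD ["/opt/spark/sbin/start-history-server.sh"]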
This works fine when I set the value statically. However, I want the value to come from an environment variable that will be set on the Docker container at run time, so I want something like this:
spark.history.fs.logDirectory ${env.path_to_logs}
However, this doesn't work, since spark-defaults.conf doesn't expand environment variables. Is there a solution for this, or maybe a parameter I can pass when running start-history-server.sh?
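To make it concrete, this is roughly how I'd like to start the container (the image name and the variable name are just placeholders):

docker run -e path_to_logs=/path/to/remote/logs my-spark-history-image

and have the history server pick up path_to_logs, whether through spark-defaults.conf, an argument to start-history-server.sh, or some other environment variable that the startup scripts understand.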