I have a Spark cluster running in a Docker container (using an image I built myself). It's all working fine.
I now want to use Apache Livy, and according to the documentation I need to put a couple of environment variables in place: https://livy.incubator.apache.org/get-started/
export SPARK_HOME=/usr/lib/spark
export HADOOP_CONF_DIR=/etc/hadoop/conf
My question: since Spark is running in Docker rather than as a local installation, what options do I have for referencing those two directories in the exports?
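For illustration, one option I can imagine is installing Livy inside the same image as Spark and baking the variables in at build time. This is only a sketch; the base image name is hypothetical, and the paths are the ones from the Livy docs, which may not match where my image actually puts Spark and the Hadoop config:

```dockerfile
# Hypothetical: extend my existing Spark image (name assumed)
FROM my-spark-image:latest

# Set the variables Livy expects, using in-container paths (assumed)
ENV SPARK_HOME=/usr/lib/spark
ENV HADOOP_CONF_DIR=/etc/hadoop/conf
```

Alternatively the same variables could presumably be passed at run time with `docker run -e SPARK_HOME=/usr/lib/spark -e HADOOP_CONF_DIR=/etc/hadoop/conf`, but I'm not sure which approach is considered best practice.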
This is a problem I run into often, so any guidance on best practices would really help.
Thanks.