I'm running the Spark Jobserver Docker image:
docker run -d -p 8090:8090 --name sjs --net=<network_name> -e SPARK_MASTER=spark://<host_name>:7077 velvia/spark-jobserver:0.6.2.mesos-0.28.1.spark-1.6.1
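As far as I can tell, the environment variable does reach the container. This is the quick check I can run (a sketch; the availability of `env` inside the container and the grep pattern are my assumptions, not something from the image docs):

# Confirm the env var reached the container
docker exec sjs env | grep SPARK_MASTER

# Look for the effective master in the server's startup logs
docker logs sjs 2>&1 | grep -i "master"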
The server comes up and appears to be working, and I can submit a job:
curl -d 'input.config = ""' '<spark-job-server-ip>:8090/jobs?appName=my_test_app&classPath=test.testapp&sync=true
However, according to the logs, the job is executed on local Spark rather than on the cluster.
What additional configuration is required for the job to be executed on the cluster instead of on the local machine?
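For reference, the direction I've been considering is overriding the server's HOCON config with my own file that pins spark.master, roughly like the sketch below. The in-container target path /app/docker.conf and the exact config keys are assumptions on my part, based on the Jobserver config templates, not something I've confirmed for this image:

# Hypothetical override file; keys follow the Jobserver config templates
cat > my-sjs.conf <<'EOF'
spark {
  master = "spark://<host_name>:7077"
}
EOF

# Run the image with the custom config mounted over the default one
# (the /app/docker.conf target path is an assumption)
docker run -d -p 8090:8090 --name sjs --net=<network_name> \
  -v "$(pwd)/my-sjs.conf":/app/docker.conf \
  velvia/spark-jobserver:0.6.2.mesos-0.28.1.spark-1.6.1

Is this the right approach, or is there a setting I'm missing elsewhere?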