
I'm running the Spark Job Server Docker image:

docker run -d -p 8090:8090 --name sjs --net=<network_name> -e SPARK_MASTER=spark://<host_name>:7077 velvia/spark-jobserver:0.6.2.mesos-0.28.1.spark-1.6.1
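
Before looking at job submission, it can help to confirm which master URL the job server actually picked up. A minimal check (the container name sjs comes from the run command above; the exact log wording depends on the image, so the grep only filters the output):

# check the environment the container was started with
docker exec sjs env | grep SPARK_MASTER

# check which master the job server logged when it created its Spark context
docker logs sjs 2>&1 | grep -i master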

It appears to be working, and I'm submitting a job with:

curl -d 'input.config = ""' '<spark-job-server-ip>:8090/jobs?appName=my_test_app&classPath=test.testapp&sync=true'
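
For reference, the usual spark-jobserver flow around this call looks roughly like the sketch below; the jar file name, context name, and resource values are placeholders, not taken from my setup:

# upload the application jar under the app name used in the job request
curl --data-binary @my-test-app.jar '<spark-job-server-ip>:8090/jars/my_test_app'

# create a named context with explicit resources
curl -d '' '<spark-job-server-ip>:8090/contexts/test-context?num-cpu-cores=2&memory-per-node=512m'

# submit the job against that context instead of a temporary one
curl -d 'input.config = ""' '<spark-job-server-ip>:8090/jobs?appName=my_test_app&classPath=test.testapp&context=test-context&sync=true'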

However, according to the logs, the job is executed on the local Spark instance and not on the cluster.

What other configuration is required for the job to be executed on the cluster rather than on the local machine?

Boris
  • Are you sure that the Docker container can reach the Spark master machine over the network? This could be a network issue. – noorul Jul 25 '16 at 04:20
  • No, there is no networking issue. The master and workers are getting notifications, but the actual job is being executed on the SJS Spark instance. After investigating, we found that the distributed sections of the job do get executed on the workers; however, the main section is executed on the machine it was submitted from. – Boris Jul 28 '16 at 14:41
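
A rough way to confirm where the driver actually ends up is to ask the standalone master and the container logs directly; the web UI port 8080 and the /json endpoint below assume a default Spark standalone master:

# list the applications (and their resource usage) the standalone master knows about
curl -s 'http://<host_name>:8080/json/'

# filter the job server logs for driver/executor activity
docker logs sjs 2>&1 | grep -iE 'driver|executor'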
