
I would like to know: is it true that on Mesos we can have only one executor per node?

Context: I am running a spark-submit (Spark 2.0.1) job on a cluster of 5 nodes (workers), each with 80 CPUs and 512 GB of memory, in coarse-grained mode.

The official Spark documentation, Running Spark on Mesos, says in the Mesos Run Modes section that in coarse-grained mode (the default) I can set two parameters, spark.executor.memory and spark.executor.cores, and that spark.cores.max / spark.executor.cores gives the number of executors.
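
As an illustration of that formula only (the numbers and the Mesos master URL below are hypothetical, not taken from the question), setting spark.cores.max to 100 and spark.executor.cores to 10 should, per the documentation, give 100 / 10 = 10 executors:

    // Sketch only: the total-core and per-executor values are made up to keep
    // the arithmetic obvious (100 / 10 = 10 executors).
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("mesos://zk://master.mesos:2181/mesos") // hypothetical Mesos master URL
      .appName("executor-count-sketch")
      .config("spark.cores.max", "100")       // total cores for the whole application
      .config("spark.executor.cores", "10")   // cores per executor
      .config("spark.executor.memory", "32g") // memory per executor
      .getOrCreate()

With 5 workers of 80 CPUs each, those 10 executors could only exist if more than one executor fits on a node, which is exactly what the question is about.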

Question: Is this correct or not?

I have been playing with the spark-submit setup for a week now, and the maximum number of executors I was able to get on my cluster is 5 (1 for the driver and 4 for actual work). This is based on the Executors tab in the Spark UI.

I have seen this StackOverflow question: Understanding resource allocation for spark jobs on mesos, where it says:

In coarse-grained mode, Spark launch only one executor per host

In Mastering Apache Spark, the Schedulers in Mesos section says:

In coarse-grained mode, there is a single Spark executor per Mesos executor with many Spark tasks.

I don't understand what this means. Is there always only one Mesos executor per node, which implies only one Spark executor per node?

If all of this is not true and I can have more executors:

Question: Is there some Mesos setting that limits the number of executors?

astro_asz

1 Answer


It is not true (anymore). SPARK-5095, Support launching multiple mesos executors in coarse grained mesos mode, was resolved in Spark 2.0, and according to the merged PR:

This PR implements two high-level features. These two features are co-dependent, so they're implemented both here:

  • Mesos support for spark.executor.cores
  • Multiple executors per slave
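
If you want to verify this on a running application, here is a small sketch (assuming a SparkSession named spark and using SparkContext's status tracker) that lists the registered executors grouped by host; more than one entry for the same host means multiple executors per node. Note that the driver also appears in this list:

    // Group the registered executors by host; several executors reported for
    // the same host means multiple executors per Mesos agent are in use.
    val executors = spark.sparkContext.statusTracker.getExecutorInfos
    executors
      .groupBy(_.host())
      .foreach { case (host, infos) =>
        println(s"$host -> ${infos.length} executor(s)")
      }
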
  • Could you please clarify in your answer? It is not clear from the link what this setting actually does. Will spark.executor.cores allow launching multiple Mesos executors per slave, each with one JVM, multiple Spark executors, and the specified number of cores? Or one Mesos executor per slave with multiple JVMs, each with multiple Spark executors, each with the specified number of cores? Something else? Does the Spark UI Executors tab show Mesos executors or Spark executors? Does spark.executor.cores refer to a Spark executor or a Mesos executor? – astro_asz Feb 01 '18 at 08:22
  • OK, I confirm that it works, but ONLY when I put the option spark.executor.cores into the SparkSession; it does NOT work when I put it in the spark-submit option --executor-cores. Why? – astro_asz Feb 02 '18 at 08:29
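
Regarding that last comment: one way to see which value the application actually picked up (a debugging sketch, assuming an existing SparkSession named spark; it does not explain why --executor-cores behaved differently) is to read the setting back from the driver's SparkConf:

    // Print the effective spark.executor.cores, however it was supplied
    // (--executor-cores, --conf, or SparkSession.builder().config(...)).
    println(spark.sparkContext.getConf
      .getOption("spark.executor.cores")
      .getOrElse("spark.executor.cores is not set"))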