Does Spark standalone mode mean that the executors and master run on a single machine? If so, how can it achieve parallelism? Is the value passed to SparkConf's setMaster set to "local" in standalone mode to indicate that the Spark application is running on a single machine?
1 Answer
"Does spark standalone mode means that the executors and master are run on a single machine?" => No.
"standalone" in "spark standalone mode" does not mean a single machine. In Spark Standalone mode, typically there are multiple slave nodes. Spark Standalone mode is another cluster mode like Mesos and Yarn. Please refer to https://spark.apache.org/docs/latest/spark-standalone.html in detail.
I guess you are asking about "local mode". Spark runs in local mode if you execute something like `spark-shell --master=local[4]`. In this case, the Spark driver and executors run in a single JVM with multiple threads.
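To make the distinction concrete, here is a sketch of the `--master` URLs for the two modes (the host name is a placeholder; 7077 is the standalone master's default port):

```shell
# Local mode: driver and executors share one JVM on this machine;
# [4] means 4 worker threads, which is where the parallelism comes from
spark-shell --master local[4]

# Standalone cluster mode: connect to a standalone master process;
# executors are launched on the cluster's slave (worker) nodes
spark-shell --master spark://master-host:7077
```

So "standalone" refers to Spark's own built-in cluster manager (as opposed to Mesos or YARN), not to a single machine.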
You can find many answers about "Spark Local Mode" if you google it.

Jason Heo
- So does local[4] mean one master and three slaves, i.e. one driver and three executors? And so parallelism is attained since these threads are executed in parallel? – Anu Aug 17 '19 at 06:20
- @Anu *"so parallelism is attained since these threads are executed in parallel"* => Yes. *"local[4] means one master and three slaves"* => No, just 1 driver with 4 cores on a single JVM. After executing `$ spark-shell --master=local[4]`, visit http://localhost:4040/ and click the "Executors" menu to see executor information. If you run a Spark job, you can see how many cores are working via "Active tasks" in the Spark Web UI. – Jason Heo Aug 18 '19 at 02:45