While looking into spark-core, I found an undocumented config, spark.executor.allowSparkContext, available since 3.0.1. I wasn't able to find any details in the official Spark documentation. In the code there is only a short description for this config:
If set to true, SparkContext can be created in executors.
But I wonder: how can a SparkContext be created in executors? As far as I know, a SparkContext is created on the driver, and executors are assigned by the resource manager, so a SparkContext always exists before the executors do.
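To make the question concrete, here is a minimal pure-Python sketch of the kind of guard this config appears to toggle. The names (TaskContext, assert_on_driver) are illustrative stand-ins, not Spark's actual internals: the idea is that code running inside a task (i.e., on an executor) has an active task context, and constructing a SparkContext there is rejected unless the config allows it.

```python
class TaskContext:
    """Stand-in for Spark's TaskContext: non-None only inside a running task."""
    _current = None

    @classmethod
    def get(cls):
        return cls._current


def assert_on_driver(allow_spark_context: bool) -> None:
    """Illustrative guard: reject SparkContext creation inside a task
    unless the (hypothetical) allow flag is set."""
    if TaskContext.get() is not None and not allow_spark_context:
        raise RuntimeError(
            "SparkContext should only be created and accessed on the driver.")


# On the driver: no active task context, so creation passes the check.
assert_on_driver(allow_spark_context=False)

# Inside a task on an executor: an active task context exists,
# so creation is rejected by default.
TaskContext._current = object()
try:
    assert_on_driver(allow_spark_context=False)
except RuntimeError as e:
    print("rejected:", e)

# With the config set to true, the check is skipped and creation proceeds.
assert_on_driver(allow_spark_context=True)
```

Under this reading, spark.executor.allowSparkContext=true would simply disable that safety check, but that still leaves open why anyone would want to create a SparkContext inside a task.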
What is the use case of this config?