I have read some literature about Spark task scheduling, and found that some papers mention that an Executor is monopolized by only one application at any given moment. So I am wondering whether the task slots in one executor can be shared by different Spark applications at the same time?
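As I understand it, the task slots come from `spark.executor.cores`, which each application sets when it creates its own `SparkSession`/`SparkContext`. Here is a minimal sketch of what I mean (the app name and config values are just examples, not from any particular paper):

```scala
import org.apache.spark.sql.SparkSession

// Each SparkSession/SparkContext corresponds to one application.
// The executors it requests are launched for that application;
// their task slots (spark.executor.cores) run that application's tasks.
val spark = SparkSession.builder()
  .appName("app-A")                        // example application name
  .config("spark.executor.instances", "2") // executors requested for app-A
  .config("spark.executor.cores", "4")     // 4 task slots per executor
  .getOrCreate()
```

My question is whether another application (say "app-B") submitted to the same cluster could ever have its tasks scheduled onto the slots of app-A's executors while app-A is still running.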