How would you determine a safe maximum value for the max-jobs-per-context setting, which controls the number of Spark jobs that can run concurrently on a single context? The default is 8 (see link below), and I'd like to raise it, but I'm not sure what happens if the value is set too high.
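For reference, the setting lives under the spark.jobserver section of the server's HOCON configuration. A minimal sketch, assuming a stock config layout (the file name and surrounding keys may differ between SJS versions):

```hocon
# Sketch of the relevant Spark Jobserver config section (e.g. local.conf).
spark {
  jobserver {
    # Maximum number of jobs that may run concurrently on one context.
    max-jobs-per-context = 8
  }
}
```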
An approach we use in production is to put a queue in front of Spark Jobserver and control job submission ourselves, since SJS has no built-in queuing mechanism (see the sketch below).
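A minimal sketch of such a front queue in Scala. Everything here is illustrative rather than the answerer's actual code: the SJS URL, the JobSpec fields, and the MaxInFlight value are assumptions, and while POST /jobs with appName/classPath/context query parameters follows the SJS REST API, you should verify the details against your version.

```scala
// Sketch of an external submission queue in front of Spark Jobserver.
// Assumptions: SJS listens on http://localhost:8090, and MaxInFlight is a
// value we choose to stay below max-jobs-per-context.

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import java.util.concurrent.{Executors, LinkedBlockingQueue, Semaphore}

object JobGate {
  // Hypothetical job description; field names are illustrative.
  final case class JobSpec(appName: String, classPath: String, context: String)

  private val MaxInFlight = 6 // keep below max-jobs-per-context
  private val inFlight    = new Semaphore(MaxInFlight)
  private val pending     = new LinkedBlockingQueue[JobSpec]()
  private val http        = HttpClient.newHttpClient()
  private val pool        = Executors.newFixedThreadPool(MaxInFlight)

  def enqueue(job: JobSpec): Unit = pending.put(job)

  // Drain loop: blocks until a permit is free, so SJS never sees more than
  // MaxInFlight concurrent submissions from this gate.
  def run(): Unit =
    while (true) {
      val job = pending.take()
      inFlight.acquire()
      pool.submit(new Runnable {
        def run(): Unit =
          try {
            val uri = URI.create(
              s"http://localhost:8090/jobs?appName=${job.appName}" +
              s"&classPath=${job.classPath}&context=${job.context}")
            val req = HttpRequest.newBuilder(uri)
              .POST(HttpRequest.BodyPublishers.noBody())
              .build()
            // Fire the submission; a production gate would poll the job
            // status endpoint and release the permit only when the job
            // actually finishes, not when this HTTP call returns.
            http.send(req, HttpResponse.BodyHandlers.ofString())
          } finally inFlight.release()
      })
    }
}
```

The point of the external queue is that MaxInFlight stays safely below max-jobs-per-context, so SJS itself never has to reject a submission for exceeding the limit; as the comment notes, a real gate would track job completion before releasing permits.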

noorul