I'm wondering how to go about submitting Spark "jobs" to a single long-running application, so that modules can share RDDs (and the work that built them) while still keeping their code and execution independent; a rough sketch of what I mean follows below. I've seen spark-jobserver, formerly at Ooyala, but I noticed it doesn't yet support Python. Is this the common route for this use case in Scala/Java, or am I going down the wrong path here?
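This is roughly the pattern I'm after (the HDFS path, endpoint names, and the choice of Flask are just placeholders to illustrate the idea, not a proposed design): one long-lived SparkContext whose cached RDDs are reused by independently written "jobs", similar to what spark-jobserver offers for Scala/Java.

    from flask import Flask, jsonify
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("shared-context-app")
    sc = SparkContext(conf=conf)

    # Expensive RDD built once and reused by every "job" submitted to this application.
    shared_rdd = sc.textFile("hdfs:///data/events").cache()  # path is hypothetical

    app = Flask(__name__)

    @app.route("/jobs/count")
    def count_job():
        # "Job" 1: developed independently, but reuses the cached RDD.
        return jsonify(count=shared_rdd.count())

    @app.route("/jobs/errors")
    def error_job():
        # "Job" 2: also independent code, same shared data.
        n = shared_rdd.filter(lambda line: "ERROR" in line).count()
        return jsonify(errors=n)

    if __name__ == "__main__":
        app.run(port=8090)

Obviously I could hand-roll something like this, but I'd rather not reinvent spark-jobserver if there's an established route.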
I've also seen a less popular pyspark-jobserver, as well as open issues on the main spark-jobserver about adding Python and R support.
To better understand where spark-jobserver fits, I'm also wondering why this functionality isn't supported directly by Spark, given its detailed job-scheduling framework. As far as I can tell, that framework covers scheduling concurrent jobs within one driver program (see the sketch below), but not submitting new jobs into an already running application from outside.
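For reference, this is what I understand Spark's within-application scheduling to cover: multiple threads in one driver sharing a SparkContext and a cached RDD, with jobs placed in FAIR scheduler pools (pool and module names here are just examples I made up):

    import threading
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("fair-scheduling-sketch") \
                      .set("spark.scheduler.mode", "FAIR")
    sc = SparkContext(conf=conf)

    # Shared, cached RDD that both "modules" reuse.
    shared_rdd = sc.parallelize(range(1000000)).cache()

    def run_module(pool_name, job):
        # Each thread submits its jobs into its own scheduler pool, so the
        # modules run concurrently but share the one SparkContext and cache.
        sc.setLocalProperty("spark.scheduler.pool", pool_name)
        print(pool_name, job(shared_rdd))

    t1 = threading.Thread(target=run_module, args=("module_a", lambda rdd: rdd.sum()))
    t2 = threading.Thread(target=run_module, args=("module_b", lambda rdd: rdd.count()))
    t1.start(); t2.start()
    t1.join(); t2.join()

That handles concurrency inside a single program I write up front, but it doesn't seem to address accepting new, independently developed jobs into that running application, which is what spark-jobserver appears to add.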