I would like to create a cached context with Apache Livy, so that submitting Spark jobs does not require creating a new context every time.
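For context, the only context reuse I am aware of in Livy goes through an interactive session: one long-lived session holds the SparkContext, and work is sent to it as statements over the REST API. A minimal sketch of that approach (the host/port http://localhost:8998 and the `spark` session kind are just placeholders for my setup):

```python
import json
import time

import requests

# Placeholder Livy endpoint; adjust host/port for your cluster.
LIVY_URL = "http://localhost:8998"
HEADERS = {"Content-Type": "application/json"}

# 1. Create one interactive session; Livy starts a SparkContext for it
#    and keeps it alive until the session is deleted or times out.
resp = requests.post(
    f"{LIVY_URL}/sessions",
    data=json.dumps({"kind": "spark"}),
    headers=HEADERS,
)
session_url = f"{LIVY_URL}/sessions/{resp.json()['id']}"

# Wait until the session (and its SparkContext) is ready.
while requests.get(session_url, headers=HEADERS).json()["state"] != "idle":
    time.sleep(2)

# 2. Submit multiple pieces of work as statements; each one reuses the
#    same SparkContext instead of starting a new one.
for code in ["sc.parallelize(1 to 100).count()",
             "sc.parallelize(1 to 1000).sum()"]:
    stmt = requests.post(
        f"{session_url}/statements",
        data=json.dumps({"code": code}),
        headers=HEADERS,
    ).json()
    stmt_url = f"{session_url}/statements/{stmt['id']}"
    while True:
        result = requests.get(stmt_url, headers=HEADERS).json()
        if result["state"] == "available":
            print(result["output"])
            break
        time.sleep(1)
```

My jobs, however, are packaged as jars and submitted as batch jobs (POST /batches), and as far as I can tell each batch submission starts its own new context.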
- Have you seen https://gethue.com/how-to-use-the-livy-spark-rest-job-server-api-for-sharing-spark-rdds-and-contexts/? – Sai Sep 17 '21 at 23:47
- Yes. But there they mention shared contexts when using interactive sessions, not when submitting jars. – Abhisek Ray Sep 20 '21 at 03:33