I have a common dataset which needs to be used in multiple programs/processes. How can I create one Spark RDD and share that same RDD across multiple Scala programs?
2 Answers
Have a look at IgniteRDD, which lets you share the state of an RDD across multiple Spark jobs and applications by backing it with an Apache Ignite cache: https://ignite.apache.org/features/igniterdd.html
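A minimal sketch of the idea, assuming the `ignite-spark` module is on the classpath and an Ignite configuration file (the path below is hypothetical) is available. One application writes pairs into a shared Ignite cache; a separate application can open the same cache and read them back:

```scala
import org.apache.ignite.spark.IgniteContext
import org.apache.spark.{SparkConf, SparkContext}

object SharedRddWriter {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("shared-rdd-writer"))

    // IgniteContext wraps the SparkContext and connects to the Ignite cluster
    // described by the given Spring XML config (path is an assumption here).
    val ic = new IgniteContext(sc, "config/shared-rdd.xml")

    // fromCache returns an IgniteRDD backed by the named Ignite cache;
    // data written here outlives this Spark application.
    val sharedRdd = ic.fromCache[Int, Int]("sharedRDD")
    sharedRdd.savePairs(sc.parallelize(1 to 100).map(i => (i, i * i)))

    ic.close(false) // keep the Ignite cluster (and the cached data) running
  }
}
```

A second program can then call `ic.fromCache[Int, Int]("sharedRDD")` with the same cache name and operate on the already-populated data, because the state lives in Ignite rather than in either Spark application.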

Kartik Ramalingam
Consider Spark-Jobserver. Jobserver's main role is to provide a REST interface for submitting Spark jobs, but a useful side effect is that jobs running in the same long-lived context can keep RDDs alive and share them by name.
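A sketch of how that sharing looks, assuming spark-jobserver's `NamedRddSupport` mixin (the classic API; newer jobserver versions generalize this to `NamedObjectSupport`). One job publishes an RDD under a name, and a later job submitted to the same context retrieves it:

```scala
import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{NamedRddSupport, SparkJob, SparkJobValid, SparkJobValidation}

// Job 1: build an RDD and register it under a name in the shared context.
object PublishJob extends SparkJob with NamedRddSupport {
  override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid

  override def runJob(sc: SparkContext, config: Config): Any = {
    val rdd = sc.parallelize(1 to 100)
    // The named RDD stays cached in the context after this job finishes.
    this.namedRdds.update("shared-rdd", rdd)
    "published"
  }
}

// Job 2: submitted later to the SAME context, it looks the RDD up by name.
object ConsumeJob extends SparkJob with NamedRddSupport {
  override def validate(sc: SparkContext, config: Config): SparkJobValidation = SparkJobValid

  override def runJob(sc: SparkContext, config: Config): Any = {
    val rdd = this.namedRdds.get[Int]("shared-rdd")
      .getOrElse(sys.error("shared-rdd not found; run PublishJob first"))
    rdd.sum()
  }
}
```

The key constraint is that both jobs must be submitted to the same persistent context (created via jobserver's REST API), since named RDDs live inside a single SparkContext rather than being shared across processes.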

Graham S