I need to join two RDDs that come from two different Elasticsearch clusters, but as far as I can tell I can only create one SparkConf and one SparkContext, configured for a single ES cluster. For example:
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._

val sparkConf = new SparkConf().setAppName("esJoin")
sparkConf.set("es.nodes", "192.168.0.22:9200")
val sc = new SparkContext(sparkConf)
val rdd1 = sc.esRDD("userIndex1/type1")
So how can I create two RDDs, each reading from a different ES cluster, within the same application?
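For reference, here is a sketch of what I would expect to work, assuming the connector's `esRDD(resource, cfg)` overload lets a per-read settings map override the `es.nodes` value on the SparkConf (the second cluster's address and index names are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._

val sc = new SparkContext(new SparkConf().setAppName("twoEsClusters"))

// Pass a per-read config map instead of setting es.nodes globally;
// each esRDD call would then target its own cluster.
val rdd1 = sc.esRDD("userIndex1/type1", Map("es.nodes" -> "192.168.0.22:9200"))
val rdd2 = sc.esRDD("userIndex2/type2", Map("es.nodes" -> "192.168.0.23:9200")) // placeholder address

// esRDD yields (documentId, fieldsMap) pairs, so this joins by document id.
val joined = rdd1.join(rdd2)
```

I have not confirmed whether the per-call map fully overrides the global setting, so corrections are welcome.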