There are methods to save the data of an org.apache.spark.sql.DataFrame to the file system or to Hive. But how can I save the data of a DataFrame that was created from MongoDB data back to MongoDB?
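For reference, given any DataFrame df, writing to the file system or to a Hive table looks roughly like this (Spark 1.4+ DataFrameWriter API; the output path and table name are placeholders):

df.write().parquet("hdfs:///some/output/path"); // save to the file system as Parquet
df.write().saveAsTable("person_table");         // save as a Hive table

I am looking for the equivalent for a MongoDB collection.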
Edit: I created the DataFrame as follows:
SparkContext sc = new SparkContext();
Configuration config = new Configuration();
config.set("mongo.input.uri", "mongodb://localhost:27017/testDB.testCollection");

// Read the collection as (id, BSONObject) pairs via the mongo-hadoop connector
JavaRDD<Tuple2<Object, BSONObject>> mongoJavaRDD =
        sc.newAPIHadoopRDD(config, MongoInputFormat.class, Object.class, BSONObject.class)
          .toJavaRDD();

// Convert each BSONObject into a Java bean
JavaRDD<Object> mongoRDD = mongoJavaRDD.flatMap(new FlatMapFunction<Tuple2<Object, BSONObject>, Object>()
{
    @Override
    public Iterable<Object> call(Tuple2<Object, BSONObject> arg)
    {
        BSONObject obj = arg._2();
        Object javaObject = generateJavaObjectFromBSON(obj, clazz);
        return Arrays.asList(javaObject);
    }
});

SQLContext sqlContext = new SQLContext(sc);
DataFrame df = sqlContext.createDataFrame(mongoRDD, Person.class);
df.registerTempTable("Person");
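The reverse direction is what I cannot figure out. The closest I can think of (untested, just a sketch) is to map the DataFrame's rows back to BSONObject pairs and write them with MongoOutputFormat from the same mongo-hadoop connector; rowToBson below is a hypothetical helper that would copy the Row fields into a BasicBSONObject, and the output collection name is a placeholder:

Configuration outputConfig = new Configuration();
outputConfig.set("mongo.output.uri", "mongodb://localhost:27017/testDB.outputCollection");

// Turn each Row back into a (key, BSONObject) pair; rowToBson is a hypothetical helper
JavaPairRDD<Object, BSONObject> writableRDD = df.javaRDD().mapToPair(
        new PairFunction<Row, Object, BSONObject>()
        {
            @Override
            public Tuple2<Object, BSONObject> call(Row row)
            {
                // null key lets MongoDB assign the _id
                return new Tuple2<Object, BSONObject>(null, rowToBson(row));
            }
        });

// The path argument is not used by MongoOutputFormat; output goes to mongo.output.uri
writableRDD.saveAsNewAPIHadoopFile("file:///not-used", Object.class, BSONObject.class,
        MongoOutputFormat.class, outputConfig);

Is this the intended way, or is there a more direct way to save a DataFrame to MongoDB?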