With Spark write mode "append" and the MongoDB connector option replaceDocument set to "false", we can add new fields to an existing MongoDB document:
import com.mongodb.spark.config.WriteConfig

val wconfig = WriteConfig(Map("uri" -> uri, "replaceDocument" -> "false"))
dataframe.write.format("mongo").options(wconfig.asOptions).mode("append").save()
However, I am not able to find an option to push elements onto an existing array field. Does the Spark MongoDB connector support this?
Consider the scenario:
MongoDB document: {_id: 123, field1: ["a", "b"], field2: "value"}
Spark DataFrame row: {_id: 123, field1: ["c", "d"]}
Current output: {_id: 123, field1: ["c", "d"], field2: "value"}
Expected output: {_id: 123, field1: ["a", "b", "c", "d"], field2: "value"}
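For reference, this is the per-document update I would otherwise issue by hand with the MongoDB Java driver, using $push with $each (the database and collection names below are placeholders):

import com.mongodb.client.MongoClients
import com.mongodb.client.model.{Filters, Updates}
import java.util.Arrays

// Connect with the same URI used for the Spark write.
val client = MongoClients.create(uri)
val coll = client.getDatabase("mydb").getCollection("mycoll") // placeholder names

// $push with $each appends both elements to the existing array.
coll.updateOne(
  Filters.eq("_id", 123),
  Updates.pushEach("field1", Arrays.asList("c", "d"))
)
client.close()

I would like the connector to do the equivalent as part of dataframe.write, rather than iterating over the rows myself.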
Is the expected output possible with the Spark MongoDB connector?
Thank you.