I have been looking around for the right way to insert a large number of documents into MongoDB using Mongoose.
My current solution looks like this:
MongoClient.saveData = function (RecordModel, data, priority, SCHID, callback) {
    var dataParsed = parseDataToFitSchema(data, priority, SCHID);
    console.log("Model created. Inserting in batches.");

    RecordModel.insertMany(dataParsed)
        .then(function (mongooseDocuments) {
            console.log("Insertion was successful.");
            callback(null);
        })
        .catch(function (err) {
            // report the error to the caller; nothing else runs afterwards
            callback("Error while inserting data to DB: " + err);
        });
};
But it appears there are other solutions out there, like this one: http://www.unknownerror.org/opensource/Automattic/mongoose/q/stackoverflow/16726330/mongoose-mongodb-batch-insert, which uses collection.insert. How is that different from Model.insertMany?
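If I understand that answer correctly, it drops down to the native driver collection instead of going through the model. A rough sketch of what I mean, reusing the dataParsed array and callback from my code above (and the older driver's insert method, which newer drivers replace with insertMany):

// Sketch only: same dataParsed/callback as in saveData above.
// Model.collection exposes the underlying native driver collection,
// so this skips Mongoose validation, defaults and middleware.
RecordModel.collection.insert(dataParsed, function (err, result) {
    if (err) {
        return callback("Error while inserting data to DB: " + err);
    }
    callback(null);
});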
The same goes for updates. My previous question, What is the right approach to update many records in MongoDB using Mongoose, asks how to update a large chunk of documents, identified by _id, with Mongoose. The answer suggests using collection.bulkWrite, while I am under the impression that Model.insertMany can do that too.
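For reference, this is roughly what I understand the bulkWrite suggestion to look like (a sketch only; the priority and SCHID fields are just examples of what my parsed documents contain, and each document is matched on its _id):

// Sketch only: build one updateOne operation per parsed document,
// matching on _id and setting the example fields priority and SCHID.
var ops = dataParsed.map(function (doc) {
    return {
        updateOne: {
            filter: { _id: doc._id },
            update: { $set: { priority: doc.priority, SCHID: doc.SCHID } }
        }
    };
});

RecordModel.collection.bulkWrite(ops, { ordered: false }, function (err, result) {
    if (err) {
        return callback("Error while updating data in DB: " + err);
    }
    callback(null);
});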