I'm creating a tool for our data that reads rows from our SQL Server database with Entity Framework and inserts them, row by row, into our MongoDB database.
In my code, I start in this way:
_client = new MongoClient();
_database = _client.GetDatabase("isovtest");
Then I loop over my SQL Server table:
foreach (var erog in db.transactions)
and create a BsonDocument for each record:
document = new BsonDocument{....}
and finally I insert it:
_database.GetCollection<BsonDocument>("transactions").InsertOne(document);
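Putting the pieces together, my loop looks roughly like this (the context class and the document fields are simplified placeholders, not my real schema):

```csharp
using MongoDB.Bson;
using MongoDB.Driver;

// Connects to a local MongoDB instance (default localhost:27017).
var client = new MongoClient();
var database = client.GetDatabase("isovtest");
var collection = database.GetCollection<BsonDocument>("transactions");

using (var db = new MyDbContext()) // placeholder name for my EF context
{
    foreach (var erog in db.transactions)
    {
        // Map each SQL Server row to a BSON document
        // ("Id" and "Amount" are placeholder field names).
        var document = new BsonDocument
        {
            { "Id", erog.Id },
            { "Amount", erog.Amount }
        };

        collection.InsertOne(document);
    }
}
```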
My problem is that the source table is big, with several million records. After about 400K rows, the program crashes every time with an out-of-memory error (it crashes at about 1.6 GB of RAM usage, even though the machine still has free memory).
I think the problem is the instruction
_database.GetCollection<BsonDocument>("transactions").InsertOne(document);
since, with GetCollection, it retrieves all the data, so the collection grows larger on every iteration.
I tried to add, at the end of my loop:
document = null;
GC.Collect();
but nothing changed.
So, can you tell me whether there is another way to insert a document into my MongoDB collection without using the GetCollection method? If not, how can I avoid the memory problem?
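For example, I was wondering whether something along these lines would be the right direction: calling GetCollection only once outside the loop, reading with AsNoTracking so Entity Framework does not keep every materialized row in its change tracker, and inserting in batches with InsertMany instead of one document at a time. This is only a sketch using the same placeholder names as above:

```csharp
using System.Collections.Generic;
using System.Data.Entity; // for AsNoTracking (EF6)
using MongoDB.Bson;
using MongoDB.Driver;

var collection = new MongoClient()
    .GetDatabase("isovtest")
    .GetCollection<BsonDocument>("transactions");

const int batchSize = 1000;
var batch = new List<BsonDocument>(batchSize);

using (var db = new MyDbContext()) // placeholder EF context name
{
    // AsNoTracking stops EF from caching every entity it reads,
    // so memory stays flat while streaming the table.
    foreach (var erog in db.transactions.AsNoTracking())
    {
        batch.Add(new BsonDocument
        {
            { "Id", erog.Id },       // placeholder field names
            { "Amount", erog.Amount }
        });

        if (batch.Count == batchSize)
        {
            collection.InsertMany(batch); // one round trip per batch
            batch.Clear();
        }
    }

    if (batch.Count > 0)
        collection.InsertMany(batch); // flush the final partial batch
}
```

Would batching like this (or something similar) solve the memory growth, or is the problem elsewhere?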