
I am trying to read data from a remote MongoDB instance from a C# console application, but I keep getting an OutOfMemoryException. The collection I am trying to read from has about 500,000 records. Does anyone see any issue with the code below?

using MongoDB.Bson;
using MongoDB.Driver;

var mongoCred = MongoCredential.CreateMongoCRCredential("xdb", "x", "x");
var mongoClientSettings = new MongoClientSettings
{
    Credentials = new[] { mongoCred },
    Server = new MongoServerAddress("x-x.mongolab.com", 12345),
};

var mongoClient = new MongoClient(mongoClientSettings);
var mongoDb = mongoClient.GetDatabase("xdb");
var mongoCol = mongoDb.GetCollection<BsonDocument>("Persons");
var list = await mongoCol.Find(new BsonDocument()).ToListAsync();
  • Are you really trying to fetch all 500k documents at once? What is the use case? – Joachim Isaksson Apr 04 '16 at 19:00
  • You're trying to fit your entire Persons collection into memory, at once. You don't have enough memory. You either need a more constrained result set, or you need to process your results in a streaming manner, e.g. with `ForEachAsync` rather than `ToListAsync` (see the sketch after these comments) – Preston Guillot Apr 04 '16 at 19:00
  • @JoachimIsaksson I don't need all 500k at once. The goal is to get all the records to a CSV file. I plan on massaging some of the fields slightly. – degmo Apr 04 '16 at 19:34
  • Are you compiling with x86 or x64 architecture? – profesor79 Apr 05 '16 at 10:23
  • so @PrestonGuillot's advice is the one to go for, or read in chunks – profesor79 Apr 05 '16 at 17:06
  • @degmo, under the covers it's creating a larger and larger list, but you shouldn't get an out-of-memory exception for this; you should simply see increased paging if you don't have enough RAM to hold the entire list in memory. While I don't see anything immediately wrong with the sample code above, can you give me more information about the Person object? I'm wondering if there's a serialization error based on the data in your Mongo documents. – dacke.geo Oct 24 '17 at 00:37
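
A minimal sketch of the streaming approach suggested above, assuming the same mongoCol handle from the question: ForEachAsync processes each document as the cursor is iterated, so only the current document is held in memory.

// Stream the collection instead of materializing it with ToListAsync.
await mongoCol.Find(new BsonDocument())
    .ForEachAsync(doc =>
    {
        // Handle a single BsonDocument here, e.g. append one CSV line.
    });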

1 Answer


This is a simple workaround: you can page your results using .Limit(int?) and .Skip(int?). First, store the number of documents in your collection in totNum:

var totNum = coll.Count(new BsonDocument()); // use the same filter you will apply in the Find() below

and then

for (int i = 0; i < totNum / 1000 + 1; i++)
{
    // Fetch one page of at most 1000 documents.
    var result = coll.Find(new BsonDocument()).Skip(i * 1000).Limit(1000).ToList();
    foreach (var item in result)
    {
        /* Write your document to the CSV file */
    }
}

I hope this can help... P.S. I used 1000 as the page size in .Skip() and .Limit(), but obviously you can use whatever value you want :-)
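
To make the CSV-writing step concrete, here is a sketch of the same paging loop with the write filled in; Name, Age, and the persons.csv path are hypothetical placeholders, not details taken from the question:

using (var writer = new System.IO.StreamWriter("persons.csv"))
{
    for (int i = 0; i < totNum / 1000 + 1; i++)
    {
        var page = coll.Find(new BsonDocument()).Skip(i * 1000).Limit(1000).ToList();
        foreach (var doc in page)
        {
            // "Name" and "Age" are hypothetical field names; substitute the
            // fields your Persons documents actually contain. Note this does
            // not escape commas or quotes inside values.
            writer.WriteLine($"{doc.GetValue("Name", "")},{doc.GetValue("Age", "")}");
        }
    }
}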

Lanny