In my tests I am reading a text file line by line and inserting an entity along with other related entities. The problem is that when too many are inserted I receive an OutOfMemoryException.
In an attempt to prevent this I create a new DbContext for every 50 rows and dispose of the old one. It was my understanding that this would free up the memory from the earlier entity operations, but the memory continues to climb, and if the file is big enough an OutOfMemoryException occurs. This is related to the entity code: if I remove the lines that add the entity, memory usage stays consistent.
Below is a simplified version of my code.
public class TestClass
{
    public void ImportData(byte[] fileBytes)
    {
        using (Stream stream = new MemoryStream(fileBytes))
        using (TextFieldParser parser = new TextFieldParser(stream))
        {
            parser.TextFieldType = FieldType.Delimited;
            parser.SetDelimiters(",");

            while (!parser.EndOfData)
            {
                // Processes 50 lines; creates a new DbContext each time it's called
                ImportBatch(parser);
            }
        }
    }

    public void ImportBatch(TextFieldParser parser)
    {
        using (MyDbContext context = new MyDbContext())
        {
            context.Configuration.AutoDetectChangesEnabled = false;
            int batchCount = 0;

            while (!parser.EndOfData && batchCount < 50)
            {
                string[] fields = parser.ReadFields();
                // Here I call some code that adds an entity and adds related entities
                // in its navigation properties
                MyService.AddMyEntity(fields, context);
                batchCount++;
            }

            context.ChangeTracker.DetectChanges();
            context.SaveChanges();
        }
    }
}
As I am disposing of the context and creating a new one every 50 inserts, I would expect memory usage to stay constant. It does stay roughly constant for the first 2,000 rows, but after that it climbs steadily until an OutOfMemoryException is hit.
Is there a reason why disposing of a DbContext in this fashion would not result in the memory being released?
EDIT - added a simplified version of my add entity method
public void AddMyEntity(string[] fields, MyDbContext myDbContext)
{
    MyEntity myEntity = new MyEntity();
    myEntity.InsertDate = DateTime.UtcNow;
    myEntity.AmendDate = DateTime.UtcNow;

    // If I remove this line the memory does not consistently climb
    myDbContext.MyEntities.Add(myEntity);

    foreach (string item in fields)
    {
        RelatedEntity relatedEntity = new RelatedEntity();
        relatedEntity.Value = item;
        myEntity.RelatedEntities.Add(relatedEntity);
    }
}
Another Edit
After more testing, it turns out this has something to do with the Glimpse profiler. I have included Glimpse in my project, and the web.config has a section similar to the one below.
<glimpse defaultRuntimePolicy="On" endpointBaseUri="~/Glimpse.axd">
  <tabs>
    <ignoredTypes>
      <add type="Glimpse.Mvc.Tab.ModelBinding, Glimpse.Mvc5"/>
      <add type="Glimpse.Mvc.Tab.Metadata, Glimpse.Mvc5"/>
    </ignoredTypes>
  </tabs>
  <inspectors>
    <ignoredTypes>
      <add type="Glimpse.Mvc.Inspector.ModelBinderInspector, Glimpse.Mvc5"/>
    </ignoredTypes>
  </inspectors>
</glimpse>
Turning defaultRuntimePolicy to Off fixed the memory leak. Still not sure why, though.
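For reference, the only change I made was to the defaultRuntimePolicy attribute on the glimpse element in web.config; the rest of the section stayed the same:

```xml
<!-- Only this attribute changed, from "On" to "Off" -->
<glimpse defaultRuntimePolicy="Off" endpointBaseUri="~/Glimpse.axd">
  <!-- tabs and inspectors sections unchanged from above -->
</glimpse>
```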