I have this design problem. I have a bulk update/insert feature built into my EF generic repository.
Two things can happen within that feature: either I just add the entity to the context and increment a counter, or I actually commit the changes to the database, i.e. call SaveChanges(), once the counter reaches the commit threshold.
When I call SaveChanges(), I also Dispose of the context and recreate it as a way to free up resources.
The thing is that I lose the ability to update entities after disposing. Here's my Update method:
public void Update(T entity, int batchSize)
{
    try
    {
        System.Data.EntityState entityState = (System.Data.EntityState)BLHelper.GetValue(entity, "EntityState");
        if (entityState == System.Data.EntityState.Detached)
            this.Attach(entity);

        // The next line works only as long as QueueContextChanges() did not
        // save changes and then dispose of the context...
        // After that, I get this exception:
        // 'ObjectStateManager does not contain an ObjectStateEntry with a reference to an object...'
        _context.ObjectStateManager.ChangeObjectState(entity, System.Data.EntityState.Modified);

        QueueContextChanges(batchSize); // Just increments a counter or calls SaveChanges()
    }
    catch
    {
        throw;
    }
}
public void Attach(T entity)
{
    _objectSet.Attach(entity);
}
public void QueueContextChanges(int batchSize)
{
    if (_commitCount == _commitThreshold || _counter == batchSize || batchSize == 1)
    {
        try
        {
            SaveChanges();
        }
        catch
        {
            //throw;
        }
        _commitCount = 0;
    }
    else
        _commitCount++;

    _counter++;
}
public void SaveChanges()
{
    try
    {
        _context.SaveChanges();
        _context.Dispose();
        _context = null;
        _context = SelectContext<T>(); // This method knows which context to return depending on the type of T...
    }
    catch
    {
        //TODO
    }
}
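
For reference, the import itself is basically a loop that calls Update() once per row, roughly like this (simplified sketch; GenericRepository's constructor, MyEntity, and sourceRows are just placeholder names, not my real code):

// Illustrative only: roughly how the repository gets hit during a bulk import.
var repository = new GenericRepository<MyEntity>();
const int batchSize = 1000;

foreach (MyEntity row in sourceRows) // ~500,000 rows
    repository.Update(row, batchSize);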
Can you think of a better (and working) design? Keeping memory usage under control while adding/updating 500,000 rows is quite important at the moment.
Can detaching entities after SaveChanges() help save resources, so that I won't have to dispose of the context?
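In case it helps to see what I mean, this is the kind of detach-based cleanup I'm imagining instead of disposing (untested sketch, and I'm guessing at the right ObjectStateManager calls; it would also need a using System.Linq; directive):

public void SaveChanges()
{
    _context.SaveChanges();

    // Instead of disposing the context, detach everything it is still tracking.
    // Right after SaveChanges() the tracked entities are in the Unchanged state,
    // so detaching them should let the GC reclaim them.
    var entries = _context.ObjectStateManager
        .GetObjectStateEntries(System.Data.EntityState.Unchanged)
        .Where(e => !e.IsRelationship && e.Entity != null)
        .ToList();

    foreach (var entry in entries)
        _context.Detach(entry.Entity);
}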
Thanks.