I often see this kind of code given as an example of aggregating an enumerable of objects in C#:
IEnumerable<MyCoolObject> myObjects = new List<MyCoolObject>
{
    new MyCoolObject { Value1 = 1, Value2 = 10 },
    new MyCoolObject { Value1 = 2, Value2 = 20 }
};

MyCoolObject aggregatedObject = myObjects.Aggregate(new MyCoolObject(), (accumulator, next) => new MyCoolObject
{
    Value1 = accumulator.Value1 + next.Value1,
    Value2 = accumulator.Value2 + next.Value2
});
My problem with this approach is that it creates a new MyCoolObject on every iteration, which seems like a huge waste.
The other common example is this:
MyCoolObject aggregatedObject = new MyCoolObject
{
    Value1 = myObjects.Sum(x => x.Value1),
    Value2 = myObjects.Sum(x => x.Value2)
};
This one iterates my collection twice, which is also a big waste, especially if there are more fields to aggregate on my objects.
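As an aside, one way I know of to keep a single pass without allocating a new object per iteration is to accumulate into a ValueTuple (a struct, so no per-iteration heap allocation) and materialize the object once at the end. A sketch, assuming MyCoolObject is a simple class with settable int properties (its real definition isn't shown above):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical shape of MyCoolObject, assumed for this sketch.
public class MyCoolObject
{
    public int Value1 { get; set; }
    public int Value2 { get; set; }
}

public static class TupleAggregation
{
    public static MyCoolObject SumWithTuple(IEnumerable<MyCoolObject> myObjects)
    {
        // The accumulator is a value tuple, copied by value each step;
        // no intermediate MyCoolObject instances are created.
        var totals = myObjects.Aggregate(
            (Value1: 0, Value2: 0),
            (acc, next) => (acc.Value1 + next.Value1, acc.Value2 + next.Value2));

        // Build the result object exactly once, at the end.
        return new MyCoolObject { Value1 = totals.Value1, Value2 = totals.Value2 };
    }
}
```

This stays side-effect-free (no mutation of the seed), at the cost of naming the fields twice.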
What I figured is, I could do this:
MyCoolObject aggregatedObject = myObjects.Aggregate(new MyCoolObject(), (accumulator, next) =>
{
    accumulator.Value1 += next.Value1;
    accumulator.Value2 += next.Value2;
    return accumulator;
});
This one creates a single accumulator object, mutates it, and returns it when finished. To me this looks on par with a manual foreach loop performance-wise. I'm surprised that I don't see this solution more often. Are there any problems this solution can introduce that could explain this?
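For comparison, the manual foreach loop I have in mind would be something like the sketch below (again assuming a simple MyCoolObject with settable int properties, since the real type isn't shown):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical shape of MyCoolObject, assumed for this sketch.
public class MyCoolObject
{
    public int Value1 { get; set; }
    public int Value2 { get; set; }
}

public static class ForeachAggregation
{
    public static MyCoolObject SumWithForeach(IEnumerable<MyCoolObject> myObjects)
    {
        // Single pass, single accumulator object -- the baseline the
        // mutating Aggregate version is being compared against.
        var result = new MyCoolObject();
        foreach (var item in myObjects)
        {
            result.Value1 += item.Value1;
            result.Value2 += item.Value2;
        }
        return result;
    }
}
```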