This is a follow-on question to this one: XmlSerializer extraTypes memory leak
I have used the technique of creating a single static XmlSerializer instance with the extraTypes overload, like this:
static readonly XmlSerializer xmlSerializer = new XmlSerializer(typeof(MyDeviceType[]), MyDeviceTypes);
where MyDeviceTypes looks like this:
static readonly Type[] MyDeviceTypes = { typeof(DeviceType1), typeof(DeviceType2) };
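Pulled together, the caching pattern looks roughly like this (the containing class and the Save method are illustrative; only the two static fields come from my actual code):

```csharp
using System;
using System.Xml;
using System.Xml.Serialization;

public static class DeviceSerialization
{
    // Both fields are initialized exactly once, when the class is first
    // used, so the serializer (and the assembly it generates) should be
    // created a single time for the lifetime of the process.
    static readonly Type[] MyDeviceTypes = { typeof(DeviceType1), typeof(DeviceType2) };

    static readonly XmlSerializer xmlSerializer =
        new XmlSerializer(typeof(MyDeviceType[]), MyDeviceTypes);

    public static void Save(XmlWriter xwriter, MyDeviceType[] devices)
    {
        xmlSerializer.Serialize(xwriter, devices);
    }
}
```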
The problem I am seeing now is that I am getting a memory jump every time I call Serialize:
Logger.LogError("SaveDevices: Before Serialize call - " + GetMemoryUsage());
xmlSerializer.Serialize(xwriter, devices);
Logger.LogError("SaveDevices: After Serialize call - " + GetMemoryUsage());
So my log looks like this:
9/28/2016 5:14:32 PM SaveDevices: Before Serialize call - 344,182,784
9/28/2016 5:14:36 PM SaveDevices: After Serialize call - 359,600,128
and I see the same increase every time Serialize is called, with the memory never being released, eventually causing an OutOfMemoryException.
I tried setting the XmlSerialization.Compilation switch in app.config. I don't get any source code to peek at, but I do see Microsoft.GeneratedCode.dll and Microsoft.GeneratedCode.pdb files. Watching my application run, it appears that these files are regenerated every time Serialize is called.
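One way I can think of to confirm the regeneration from code (a diagnostic sketch I added, not part of my original logging) is to count the assemblies loaded into the AppDomain around each Serialize call; each regenerated Microsoft.GeneratedCode assembly should show up as a growing count:

```csharp
// Count loaded assemblies before and after the call; a leak of one
// generated assembly per Serialize would show up as after > before.
int before = AppDomain.CurrentDomain.GetAssemblies().Length;
xmlSerializer.Serialize(xwriter, devices);
int after = AppDomain.CurrentDomain.GetAssemblies().Length;
if (after > before)
    Logger.LogError("SaveDevices: Serialize loaded " + (after - before) + " new assembly(ies)");
```

Since dynamically generated serialization assemblies cannot be unloaded from an AppDomain, a count that grows on every call would explain the steady memory climb.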
How can I modify this behavior so that each Serialize call doesn't consume more memory and regenerate the code?
Update: Replacing my complex data types with simple test data types stops the strange behavior (i.e. no memory leak and no regeneration of the Microsoft.GeneratedCode assembly). What in my complex data types could cause Serialize to decide it needs to regenerate the code each time it is called?
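To narrow down which of the complex types is responsible, a bisection sketch I have been considering (the helper below is hypothetical, not from my code) serializes each suspect type on its own and checks whether repeated Serialize calls keep loading assemblies:

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

static class RegenerationCheck
{
    // Serialize a sample instance twice with one serializer; if the
    // assembly count still grows on the second call, this type (or
    // something it contains) is triggering per-call code generation.
    public static void Check(Type t, object sample)
    {
        var serializer = new XmlSerializer(t);
        serializer.Serialize(new StringWriter(), sample);   // warm-up call

        int before = AppDomain.CurrentDomain.GetAssemblies().Length;
        serializer.Serialize(new StringWriter(), sample);   // repeat call
        int after = AppDomain.CurrentDomain.GetAssemblies().Length;

        Console.WriteLine(t.Name + ": " + (after - before) +
                          " new assembly(ies) on repeat Serialize");
    }
}
```

Running this once per device type should separate the well-behaved types from whichever one causes the regeneration.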