I've written a service that runs a separate thread which reads roughly 400 records from a database and serializes them into XML files. It runs fine, there are no errors, and it reports that all files have been exported correctly, yet only a handful of XML files appear afterwards, and it's always a different number each time. I've checked whether a particular record is causing problems, but they all read out fine and seem to write fine, but don't...
After playing around and putting in a delay of 250 ms between each write, they are all exported properly, so I assume it must have something to do with writing so many files in such quick succession. I have no idea why, though; I would have thought it would report some kind of error if they didn't write properly, yet there's nothing.
Here is the code for anyone who wants to try it:
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Xml.Serialization;

static void Main(string[] args)
{
    ExportTestData();
}

public static void ExportTestData()
{
    List<TestObject> testObjs = GetData();
    foreach (TestObject obj in testObjs)
    {
        ExportObj(obj);
        //Thread.Sleep(10);
    }
}

public static List<TestObject> GetData()
{
    List<TestObject> result = new List<TestObject>();
    for (int i = 0; i < 500; i++)
    {
        result.Add(new TestObject()
        {
            Date = DateTime.Now.AddDays(-1),
            AnotherDate = DateTime.Now.AddDays(-2),
            AnotherAnotherDate = DateTime.Now,
            DoubleOne = 1.0,
            DoubleTwo = 2.0,
            DoubleThree = 3.0,
            Number = 345,
            SomeCode = "blah",
            SomeId = "wobble wobble"
        });
    }
    return result;
}

public static void ExportObj(TestObject obj)
{
    try
    {
        // Filename is built from the current time, down to the millisecond
        string path = Path.Combine(@"C:\temp\exports", String.Format("{0}-{1}{2}", DateTime.Now.ToString("yyyyMMdd"), String.Format("{0:HHmmssfff}", DateTime.Now), ".xml"));
        SerializeTo(obj, path);
    }
    catch (Exception ex)
    {
        // Any exception is silently swallowed here
    }
}

public static bool SerializeTo<T>(T obj, string path)
{
    XmlSerializer xs = new XmlSerializer(obj.GetType());
    using (TextWriter writer = new StreamWriter(path, false))
    {
        xs.Serialize(writer, obj);
    }
    return true;
}
Try commenting/uncommenting the Thread.Sleep(10) to see the problem.
Does anybody have any idea why it does this, and can anyone suggest how I can avoid this problem?
Thanks
EDIT: Solved. The time-based filename wasn't unique enough and was overwriting previously written files. Should've spotted it earlier. Thanks for your help.
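For anyone who hits the same thing, here's a rough sketch of the kind of change that avoids the collision. It isn't necessarily exactly what I ended up with: the counter-based suffix (and the exportCounter field) is just one way to make the name unique; putting a Guid in the filename works just as well.

private static int exportCounter = 0;

public static void ExportObj(TestObject obj)
{
    // Keep the timestamp for readability, but append a per-run counter so two
    // objects exported within the same millisecond no longer share a filename.
    string fileName = String.Format("{0:yyyyMMdd-HHmmssfff}-{1}.xml",
        DateTime.Now,
        Interlocked.Increment(ref exportCounter)); // or Guid.NewGuid()
    string path = Path.Combine(@"C:\temp\exports", fileName);
    SerializeTo(obj, path);
}

(Interlocked lives in System.Threading.) Also worth noting: opening the file with FileMode.CreateNew instead of the default overwriting StreamWriter would have turned the silent overwrite into an IOException, although the empty catch in ExportObj would still have swallowed it, which is why nothing was ever reported.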