I have some code that uses deferred execution and lazy loading:
public static IEnumerable<XElement> GetStreamElementP(string fileId, ListProgressEventHandler progressHandler, int total)
{
    var filePath = Utility.GetEContentFilePath(fileId);
    using (var reader = XmlReader.Create(filePath, new XmlReaderSettings { IgnoreWhitespace = true }))
    {
        var cnt = 0;
        reader.MoveToContent();
        // Parse the file and yield each of the p nodes.
        reader.Read();
        while (reader.NodeType == XmlNodeType.Element && reader.Name == "p")
        {
            cnt++;
            var returnedValue = XElement.ReadFrom(reader) as XElement;
            if (progressHandler != null && cnt % _streamElementCallBackSize == 0)
            {
                progressHandler(null, new ListProgressEventArgs { ItemsProcessed = cnt, TotalItemsToProcess = total });
            }
            yield return returnedValue;
        }
        // No explicit reader.Close() needed: the using block disposes the reader.
    }
}
I'm looking to get a simple count of the number of elements. The code we currently use is:
public static int FileElementsCount(string fileId)
{
    var cnt = 0;
    // No progress reporting is needed here, so pass null/0 for the extra parameters.
    foreach (XElement e in GetStreamElementP(fileId, null, 0))
    {
        cnt++;
    }
    return cnt;
}
Can I improve this to the following?
public static int FileElementsCount(string fileId)
{
    return GetStreamElementP(fileId, null, 0).Count();
}
Or will this cause more memory to be used when getting the count? We are dealing with very large files in some cases and are trying to keep memory usage to a minimum wherever possible.
I have tried to find a concrete example that explains how memory is used in each case, without success.
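For what it's worth, here is a minimal sketch of how I understand the deferred execution to work. `Enumerable.Count()` first checks whether the source implements `ICollection<T>` (in which case it reads the `Count` property directly); an iterator block like `GetStreamElementP` does not, so `Count()` falls back to walking the enumerator one element at a time, exactly like the `foreach` loop. The names below (`StreamItems`, `Demo`) are hypothetical stand-ins, not from our real code:

using System;
using System.Collections.Generic;
using System.Linq;

class Demo
{
    // Hypothetical stand-in for GetStreamElementP: an iterator block that
    // produces items one at a time instead of building a collection.
    static IEnumerable<int> StreamItems(int total)
    {
        for (int i = 0; i < total; i++)
        {
            Console.WriteLine($"yielding {i}"); // shows items are produced on demand
            yield return i;
        }
    }

    static void Main()
    {
        // Count() pulls items one by one via MoveNext(); nothing is buffered,
        // so each yielded item becomes collectible as soon as the next is pulled.
        int count = StreamItems(3).Count();
        Console.WriteLine($"count = {count}");
    }
}

If this sketch is right, both versions of FileElementsCount should hold at most one element in memory at a time, and the difference is purely stylistic.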
Thanks in advance for any help.