I am currently experiencing a serious problem. I am scanning files in a directory, processing each one (reading its contents with File.ReadAllText(fi.FullName)), and then deleting it. The thread sleeps for 200 ms and starts again (scan, process, delete). The issue is that occasionally a file that has already been deleted shows up again in the next scan; this does not happen every time, only now and then.
List<FileInfo> files = GetFiles();
if (files != null)
{
    foreach (FileInfo fi in files)
    {
        if (ProcessFile(fi))
        {
            fi.Delete();
            log.Info("Report file: " + fi.FullName + " has been deleted");
        }
    }
}
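One workaround I have been considering (just a sketch, not necessarily the right fix) is to rename the file out of the scanned directory before deleting it, so that even if the underlying delete is still pending, the next scan of scanDir can no longer see the entry. Here deleteDir is a hypothetical sibling directory that is assumed to already exist:

```csharp
using System.IO;

static class PendingDelete
{
    // Sketch: move the file out of the scanned directory first, then delete it.
    // A rename on the same volume takes effect immediately, so a subsequent
    // scan of the source directory cannot return this file again.
    internal static void MoveThenDelete(FileInfo fi, string deleteDir)
    {
        string originalPath = fi.FullName;            // MoveTo updates fi.FullName
        string target = Path.Combine(deleteDir, fi.Name);
        fi.MoveTo(target);                            // remove entry from the scanned directory
        File.Delete(target);                          // actual delete happens outside scanDir
        // log.Info("Report file: " + originalPath + " has been deleted");
    }
}
```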
And here is the GetFiles method:
internal List<FileInfo> GetFiles()
{
    try
    {
        DirectoryInfo info = new DirectoryInfo(scanDir);
        List<FileInfo> files = info.GetFiles().OrderBy(p => p.CreationTime).Take(10).ToList(); //oldest file first
        return files;
    }
    catch (Exception ex)
    {
        log.Error("Getting files from directory: " + scanDir + ". Error: " + ex.ToString());
        return null;
    }
}
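Another variant I have been toying with (again only a sketch, under the assumption that the stale entry is a delete that has not completed yet) is to remember which paths were already deleted and filter them out of the next scan. The recentlyDeleted set is a hypothetical structure the caller would maintain, adding a path right after Delete() and clearing it once a scan no longer returns it:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

static class Scanner
{
    // Sketch: same scan as GetFiles(), but skip any path we have already
    // deleted, so a directory entry that lingers while its delete is still
    // pending is not processed a second time.
    internal static List<FileInfo> GetFiles(string scanDir, HashSet<string> recentlyDeleted)
    {
        DirectoryInfo info = new DirectoryInfo(scanDir);
        return info.GetFiles()
                   .Where(f => !recentlyDeleted.Contains(f.FullName))
                   .OrderBy(f => f.CreationTime) // oldest file first
                   .Take(10)
                   .ToList();
    }
}
```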
I have read in other posts that FileInfo.Delete() takes some time, but the Microsoft documentation does not say anything about this, so I am not sure what is happening. Can anybody spot anything wrong with the code? Is there any official documentation on whether FileInfo.Delete() is a blocking call, or does it simply mark the file for deletion?
EDIT: Here is the only reference to the FileInfo in ProcessFile:
string message = File.ReadAllText(fi.FullName);
I believe that File.ReadAllText closes the file, and that no handles should be left around; please correct me if I am wrong! Also, this only happens occasionally, and not to all files (I am processing 10 files, and it happens to just one).
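For what it's worth, my understanding (please confirm) is that File.ReadAllText opens the file, reads it, and closes it before returning, roughly equivalent to this, with the handle released when the reader is disposed:

```csharp
using System.IO;

// Roughly what File.ReadAllText(path) does internally: the using block
// guarantees the StreamReader (and its underlying file handle) is closed
// even if ReadToEnd() throws.
static string ReadAllTextEquivalent(string path)
{
    using (var reader = new StreamReader(path))
    {
        return reader.ReadToEnd();
    } // reader disposed here; no handle left open
}
```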