The answer I posted last night didn't pass regression testing. I got mixed results with FileSystemWatcher's various filter settings, and it occasionally missed files, which is unacceptable. There are lots of articles on problems with network shares, but I take those to mean the watcher is watching a network share mapped from a different computer, not that the directory being watched is itself a network share on the same machine where the service is running. It is possible latency is a factor in some of my misfires, but even locally, the component does not seem to reliably recognize multi-dot file names.
Since this is a service, we already had a method to detect any files present when the service started. This method worked and had no reliance on the component. So the most elegant solution was to simply put that code on a timer. Below are the relevant parts of the class (i.e. this snippet isn't designed to be copy/paste ready, only to show how I solved the problem). Don't let the FTP moniker throw you off course - it is really just watching a shared folder which might or might not be mapped to an FTP server.
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.ServiceProcess;
using System.Timers;
using Timer = System.Timers.Timer;

public partial class VsiFtpManager : ServiceBase
{
    private Timer _searchTimer;
    private readonly Queue<string> _filesToProcess = new Queue<string>();
    private string _ftpRoot; //this is set elsewhere from the registry

    protected override void OnStart(string[] args)
    {
        //create the timer first so LoadExistingFtpFiles can safely Stop()/Start() it
        _searchTimer = new Timer(10000);
        _searchTimer.Elapsed += LoadExistingFtpFiles;

        //process any files that are already there when the service starts
        LoadExistingFtpFiles();

        //handle new files by polling every 10 seconds
        _searchTimer.Start();
    }

    //convenience overload to allow this to handle timer events
    private void LoadExistingFtpFiles(object source, ElapsedEventArgs evtArgs)
    {
        LoadExistingFtpFiles();
    }

    private void LoadExistingFtpFiles()
    {
        //pause the timer so a slow scan can't trigger a re-entrant scan
        _searchTimer.Stop();

        var di = new DirectoryInfo(_ftpRoot);
        FileInfo[] fileInfos = di.GetFiles("*.*", SearchOption.AllDirectories);
        foreach (FileInfo fi in fileInfos.Where(fi => fi != null))
        {
            //FileInfo.Extension includes the leading dot, so compare against ".processed"
            if (!fi.Extension.Equals(".processed", StringComparison.OrdinalIgnoreCase)
                && !_filesToProcess.Contains(fi.FullName))
            {
                _filesToProcess.Enqueue(fi.FullName);
                LogHelper.BroadcastLogMessage("File received: " + fi.Name, EventLogEntryType.Information);
            }
        }

        _searchTimer.Start();
    }
}
The part you don't see, which is beyond the scope of my question, is essentially a co-routine running against the queue _filesToProcess, which processes each file and then renames it to have an extension of .processed.
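For anyone curious what that consumer side could look like, here is a minimal sketch. It is not my actual processing code: the FtpFileConsumer class, the ProcessFile helper, and the use of ConcurrentQueue are all illustrative assumptions. (If the queue is drained on a different thread than the timer callback that fills it, ConcurrentQueue&lt;string&gt; is a safer choice than a plain Queue&lt;string&gt;.)

```csharp
using System.Collections.Concurrent;
using System.IO;
using System.Threading;

public class FtpFileConsumer
{
    private readonly ConcurrentQueue<string> _filesToProcess;
    private volatile bool _running = true;

    public FtpFileConsumer(ConcurrentQueue<string> filesToProcess)
    {
        _filesToProcess = filesToProcess;
    }

    //drain the queue on a background thread
    public void Run()
    {
        while (_running)
        {
            if (_filesToProcess.TryDequeue(out string path))
            {
                ProcessFile(path);                    //hypothetical application-specific work
                File.Move(path, path + ".processed"); //mark the file so the poller skips it
            }
            else
            {
                Thread.Sleep(500); //nothing queued; back off briefly
            }
        }
    }

    public void Stop()
    {
        _running = false;
    }

    private void ProcessFile(string path)
    {
        //placeholder for the real processing logic
    }
}
```

Renaming to `path + ".processed"` keeps this consistent with the poller above, since FileInfo.Extension on "file.txt.processed" returns ".processed".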
So my final answer: my research, borne out by automated regression testing, showed FileSystemWatcher to be unreliable for my use case, which requires me to process files copied into a folder. Some of these files will come from Unix systems and so may have non-Windows file names. The FileSystemWatcher component shipped with .NET cannot reliably detect Unix-style file names bearing multiple dots within the name.
I replaced the FileSystemWatcher with a simple polling mechanism. Execution is significantly slower but reliable, which is my main objective. The overall solution reduced my line count, albeit insignificantly, and removed my only component from the design surface of the service, both of which I consider bonuses owing to my own possibly peculiar preferences.