I am creating a program that shows some graphical content while recording the viewer's face with a webcam via DirectShow. It is very important that I know the time difference between what is on the screen and when the webcam captures a frame.
I don't care at all about reducing the latency or anything like that; it can be whatever it is going to be. I just need to know the capture latency as accurately as possible.
When frames come in, I can get each frame's stream time, but those times are all relative to some particular stream start time. How can I access the stream start time for a capture device? That value is obviously somewhere in the bowels of DirectShow, because the filter graph uses it to compute the stream time of every frame, but how can I get at it? I've searched through the docs but haven't found its secret yet.
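For context, this is roughly how I read those per-frame stream times (a trimmed sketch from my debug filter; the function name and the logging are just placeholders):

```cpp
#include <dshow.h>
#include <cstdio>

// Called from my input pin's Receive() with each incoming sample.
HRESULT LogSampleTimes(IMediaSample *pSample)
{
    REFERENCE_TIME rtStart = 0, rtStop = 0;   // 100-nanosecond units
    HRESULT hr = pSample->GetTime(&rtStart, &rtStop);
    if (SUCCEEDED(hr))
    {
        // These values are relative to the stream start time,
        // which is exactly the value I cannot find a way to query.
        printf("frame stream time: %.4f - %.4f s\n",
               rtStart / 1e7, rtStop / 1e7);
    }
    return hr;
}
```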
I've created my own classes implementing IBaseFilter and IReferenceClock, which do little more than report lots of debugging info. They seem to be doing what they need to do, but they don't provide enough information.
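In case it matters, my debugging clock is essentially a wrapper along these lines (a rough sketch with my own names; IUnknown plumbing kept minimal and error handling trimmed):

```cpp
#include <dshow.h>
#include <cstdio>

class LoggingClock : public IReferenceClock
{
    IReferenceClock *m_inner;   // the clock the graph would otherwise use
    LONG m_ref = 1;
public:
    explicit LoggingClock(IReferenceClock *inner) : m_inner(inner) { m_inner->AddRef(); }

    // --- IReferenceClock: delegate everything, log as we go ---
    STDMETHODIMP GetTime(REFERENCE_TIME *pTime) override
    {
        HRESULT hr = m_inner->GetTime(pTime);
        printf("GetTime -> %.4f s\n", *pTime / 1e7);   // 100 ns units
        return hr;
    }
    STDMETHODIMP AdviseTime(REFERENCE_TIME base, REFERENCE_TIME offset,
                            HEVENT hEvent, DWORD_PTR *pCookie) override
    {
        printf("AdviseTime base=%lld offset=%lld\n",
               (long long)base, (long long)offset);
        return m_inner->AdviseTime(base, offset, hEvent, pCookie);
    }
    STDMETHODIMP AdvisePeriodic(REFERENCE_TIME start, REFERENCE_TIME period,
                                HSEMAPHORE hSem, DWORD_PTR *pCookie) override
    {
        return m_inner->AdvisePeriodic(start, period, hSem, pCookie);
    }
    STDMETHODIMP Unadvise(DWORD_PTR cookie) override { return m_inner->Unadvise(cookie); }

    // --- IUnknown (minimal) ---
    STDMETHODIMP QueryInterface(REFIID riid, void **ppv) override
    {
        if (riid == IID_IUnknown || riid == IID_IReferenceClock)
        {
            *ppv = this; AddRef(); return S_OK;
        }
        *ppv = nullptr; return E_NOINTERFACE;
    }
    STDMETHODIMP_(ULONG) AddRef() override { return InterlockedIncrement(&m_ref); }
    STDMETHODIMP_(ULONG) Release() override
    {
        ULONG r = InterlockedDecrement(&m_ref);
        if (r == 0) { m_inner->Release(); delete this; }
        return r;
    }
};
```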
For what it is worth, I have also tried to investigate this by inspecting the DirectShow event queue, but no events related to the starting of the filter graph seem to be raised, even when I start the graph.
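This is roughly how I drain the event queue (a sketch; "pGraph" is assumed to be my already-built graph, and I call this around the time I run it):

```cpp
#include <dshow.h>
#include <cstdio>

void DrainEvents(IGraphBuilder *pGraph)
{
    IMediaEvent *pEvent = nullptr;
    if (FAILED(pGraph->QueryInterface(IID_IMediaEvent, (void**)&pEvent)))
        return;

    long code;
    LONG_PTR p1, p2;
    // Zero timeout: just drain whatever is queued right now.
    while (pEvent->GetEvent(&code, &p1, &p2, 0) == S_OK)
    {
        printf("DirectShow event 0x%lx\n", code);
        pEvent->FreeEventParams(code, p1, p2);
    }
    pEvent->Release();
}
```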
The following image, recorded with my test app, might help illustrate what I'm doing. The graphical content right now is just a timer counting seconds, and the webcam is recording the screen. At the moment that frame was captured, the system time was about 1.35 seconds; the sample time recorded by DirectShow was 1.1862 seconds (ignore the caption in the picture). How can I account for the difference of .1637 seconds in this example? The stream start time is the key to deriving that value.
The system clock and the reference clock both use QueryPerformanceCounter(), so I would not expect the discrepancy to be timer wonkiness.
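For completeness, this is roughly how I compare the two clocks (a sketch; "pClock" is assumed to be the graph clock I obtain via IMediaFilter::GetSyncSource):

```cpp
#include <windows.h>
#include <dshow.h>
#include <cstdio>

void CompareClocks(IReferenceClock *pClock)
{
    LARGE_INTEGER freq, now;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&now);
    double qpcSeconds = (double)now.QuadPart / (double)freq.QuadPart;

    REFERENCE_TIME rt = 0;   // 100-nanosecond units
    if (SUCCEEDED(pClock->GetTime(&rt)))
        printf("QPC: %.4f s, reference clock: %.4f s\n", qpcSeconds, rt / 1e7);
}
```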
Thank you.