I need help with the following issue.
I am seeing latency (possibly context-switch time) when a function exported by a Windows DLL is called from a Windows EXE.
Here is how I concluded this: most of the time the DLL's exported function completes in under 1 millisecond. But occasionally, timestamping from the point the DLL function is called to the point it returns shows as much as 600 ms. This is causing a buffer overflow and loss of data on the slave side. Specifically, I am using a USB-to-SPI converter: the DLL takes in the USB feed and emits SPI data at the other end. So if this function takes up to 600 ms to return, I lose data on the SPI slave.
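For reference, here is a minimal sketch of how I am timing the call. The export name `UsbToSpiTransfer` is only a placeholder for the converter DLL's actual function:

```cpp
#include <windows.h>
#include <stdio.h>

// Placeholder for the actual function exported by the converter DLL.
extern "C" __declspec(dllimport) int UsbToSpiTransfer(unsigned char* buf, int len);

int main()
{
    LARGE_INTEGER freq, t0, t1;
    QueryPerformanceFrequency(&freq);

    unsigned char buf[4] = {0};          // we read 4 bytes over SPI

    QueryPerformanceCounter(&t0);
    UsbToSpiTransfer(buf, sizeof(buf));  // the call that occasionally stalls
    QueryPerformanceCounter(&t1);

    double ms = 1000.0 * (t1.QuadPart - t0.QuadPart) / freq.QuadPart;
    printf("call took %.3f ms\n", ms);   // usually < 1 ms, sometimes ~600 ms
    return 0;
}
```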
Profiling the DLL's own functions shows they take no more than 15 ms internally (though even that is a lot for an SPI read/write at a 15 MHz SPI clock, given that we read only 4 bytes).
So is this context-switch time? Would incorporating the DLL's code into my EXE itself help? The only delay I see is across this DLL's function call. Is there any way I can prevent preemption, so that my application gets more CPU time on a Windows 7 machine? I am using Visual Studio.
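For example, would raising the scheduling priority along these lines be the right direction? This is just a sketch of what I have in mind; I have not confirmed it helps:

```cpp
#include <windows.h>

// Sketch: raise process and thread priority so the thread calling the
// DLL is less likely to be preempted. REALTIME_PRIORITY_CLASS can
// starve the rest of the system, so HIGH_PRIORITY_CLASS may be safer.
void RaisePriority()
{
    SetPriorityClass(GetCurrentProcess(), HIGH_PRIORITY_CLASS);
    SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_TIME_CRITICAL);
}
```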
Please suggest; I appreciate your help on this.
Thanks, Sakul