I have implemented a microscope that acquires images of different slices (z-direction) within a specimen. To improve the acquisition speed, the change of the focal plane and the image acquisition run (more or less) independently: the focal plane, defined by the position of the illumination light and the detection focus, is driven by an NI-DAQ card, while the camera is triggered by software. The system acquires continuously, which means:
- Position 1 --> Image 1
- Position 2 --> Image 2
- ...
- Position 20 --> Image 20
- Position 1 --> Image 21 etc.
So, ideally, an image sequence for a single position is obtained by taking every 20th image of the complete sequence. The total sequence consists of up to 15000 images, covering 20 different slices/positions.
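For reference, the naive de-interleaving I currently do looks roughly like this (a minimal MATLAB sketch; `stack` and the other variable names are just placeholders, assuming the frames are already loaded as an H-by-W-by-N array):

```
% Naive de-interleaving: assumes camera and z-position stay perfectly
% in sync over the whole acquisition (which is exactly what fails here).
nSlices  = 20;                                 % number of z positions per cycle
sliceIdx = 5;                                  % which slice/position to extract
subStack = stack(:, :, sliceIdx:nSlices:end);  % every 20th frame
```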
Unfortunately, the system is not synchronised accurately: with this simple approach, the positions drift slightly over experiments lasting a few minutes.
I'm working in parallel on a better synchronisation, but as I already have important experiments recorded, it would be great to find a reconstruction method that reduces this "drifting" effect. I have already tried using the driving signals of the light and the focal plane, the time stamps of the acquisitions, and, to a limited extent, focus measures.
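Just to illustrate what I mean by a focus measure: something as simple as the per-frame gradient energy (sketch below, not necessarily the best choice) gives a rough indication of which frames belong to similar focal positions:

```
% Simple per-frame focus measure (gradient energy); frames with similar
% values should correspond to similar focal positions.
nFrames  = size(stack, 3);
focusVal = zeros(nFrames, 1);
for k = 1:nFrames
    [gx, gy]    = gradient(double(stack(:, :, k)));
    focusVal(k) = mean(gx(:).^2 + gy(:).^2);
end
```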
Something I am thinking about is pattern matching to find the best-matching frame for each plane. Ideally, the algorithm should be robust and fast. A rough sketch of what I have in mind follows.
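This is only a hypothetical sketch, assuming reference images `refStack` for the 20 positions are available (e.g. from the first, still well-synchronised cycles); `corr2` requires the Image Processing Toolbox:

```
% Assign each frame to the best-matching reference slice via 2-D correlation.
% refStack: H-by-W-by-20 reference images,
% stack:    H-by-W-by-N full acquisition (same image size).
nSlices   = size(refStack, 3);
nFrames   = size(stack, 3);
bestSlice = zeros(nFrames, 1);
for k = 1:nFrames
    frame  = double(stack(:, :, k));
    scores = zeros(nSlices, 1);
    for s = 1:nSlices
        scores(s) = corr2(frame, double(refStack(:, :, s)));
    end
    [~, bestSlice(k)] = max(scores);   % index of the best-matching slice
end
```

To keep it fast, the search could probably be restricted to the slice expected from the nominal timing plus/minus one or two neighbours, and the images could be downsampled before correlating.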
Does anybody have a suggestion or other experiences to share? By the way, I'm working with Matlab, but this is more of a general problem.