I am working on an art project where we automatically take pictures of a petri dish illuminated by a small LCD screen. Oftentimes there appear to be "glitches" in the image (see image with glitch); it always looks the same: a short, narrow, dark line segment. I believe it is due to the scanline of the background screen. I have tried different methods for smoothing it out, such as taking several images (say, 10 images, one every second) and applying an average or a median to generate a filtered image, but that does not seem to work. I also have some previous sets of images which I would like to fix. Is there some kind of automated way to do this? (ideally in Java or Python)
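For reference, the temporal-median attempt described above would look roughly like this in Python with NumPy/OpenCV (a minimal sketch; the frame file names are made up):

```python
# Minimal sketch of a temporal median over a short burst of frames,
# assuming 10 captures of the same (static) dish saved as
# frame_00.png ... frame_09.png (hypothetical file names).
# Requires: pip install opencv-python numpy
import cv2
import numpy as np

frames = [cv2.imread(f"frame_{i:02d}.png") for i in range(10)]
stack = np.stack(frames, axis=0)          # shape: (10, H, W, 3)

# Per-pixel median across time: a transient artifact is rejected as long
# as it does not appear at the same position in most of the frames.
median_img = np.median(stack, axis=0).astype(np.uint8)
cv2.imwrite("median.png", median_img)
```

Note that a per-pixel median only removes the glitch if it lands on different rows in most of the frames; if the scanline artifact sits in the same place in every capture, both the median and the average will preserve it, which may explain why this approach did not help.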
- Did you try a spatial median filter (i.e., the pixel value gets replaced by the median of some surround, e.g., a 3x3 neighborhood)? – Mark Lavin Feb 11 '22 at 23:57
- Temporal median "does not seem to work"? Show that. It should have. – Christoph Rackwitz Feb 12 '22 at 11:42
- Such "glitches" are usually an acquisition (or decompression?) artifact, caused by the signal being temporarily lost and replaced by garbage or a previous image line. The most challenging part is to detect them, for instance by testing for sudden changes on a single line. Then you can reconstruct the signal locally, for instance by averaging between the neighboring lines. – Feb 14 '22 at 20:05
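Following the last comment's detect-then-reconstruct suggestion, here is a rough Python/OpenCV sketch for fixing already-captured images: it flags pixels that are markedly darker than both the row above and the row below, and, where a row contains enough such pixels, replaces them with the average of the neighboring rows. The darkness threshold and the minimum number of flagged pixels per row are assumptions that would need tuning on real images.

```python
# Hedged sketch of the detect-then-reconstruct idea from the last comment.
# Threshold values are guesses and will need tuning.
import cv2
import numpy as np

def remove_scanline_glitch(img_bgr, dark_thresh=30, min_flagged=10):
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY).astype(np.int16)
    fixed = img_bgr.copy()
    h, w = gray.shape
    for y in range(1, h - 1):
        above, row, below = gray[y - 1], gray[y], gray[y + 1]
        # A pixel is suspicious if it is clearly darker than both the row
        # above and the row below (a sudden change on a single line).
        suspect = (above - row > dark_thresh) & (below - row > dark_thresh)
        xs = np.flatnonzero(suspect)
        # Only act when the row contains enough suspicious pixels: the glitch
        # is a short segment, whereas sensor noise hits isolated pixels.
        if xs.size >= min_flagged:
            # Reconstruct the flagged pixels by averaging the rows above and below.
            fixed[y, xs] = ((img_bgr[y - 1, xs].astype(np.uint16) +
                             img_bgr[y + 1, xs].astype(np.uint16)) // 2).astype(np.uint8)
    return fixed

img = cv2.imread("glitched.png")   # hypothetical file name
cv2.imwrite("repaired.png", remove_scanline_glitch(img))
```

The 3x3 spatial median filter from the first comment can be tried with a one-liner, `cv2.medianBlur(img, 3)`, but it smooths the whole image, whereas the targeted repair above only touches the flagged pixels.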