
I am working with frames from a video of a single object (a black-and-white chessboard) in which a blur moves to different spots on the object as time goes on.

What I need is to determine when the blur disappears.

My idea was to import the frames from the video, subtract each of them from a reference picture of the clean object taken under stable conditions (where I know no blur is present), and look for the frame that minimizes this difference.

What do you reckon could be a way to implement this in Python? Given that there is other environmental noise besides the known blur, should I manually inspect the difference images, or is there a known function that accounts for noise when analysing image differences for such a specific change?
