
I want to simulate realistic motion blur. I do not want the blur to affect the whole image, only the moving objects. I know that I can use filters, but then the blur would be spread over the whole image. I thought of using optical flow, but I am not sure it will work because the result depends a lot on the extracted features.

My main idea is to combine successive frames in order to generate motion blur.

Thanks

  • How would you define "moving"? If a car stays in the center of the frame as the background whooshes by (because the camera is going at the same speed as the car), what should be blurred and what should be sharp? – Sneftel Sep 13 '18 at 14:42
  • For example, a car is driving around and I have a camera positioned so that I can see its movement. If the camera has a high frame rate I should clearly see the successive positions of the car, but if it has a low frame rate the movement of the car will be blurred. My goal is to create this motion blur of the car artificially, using multiple consecutive frames. –  Sep 13 '18 at 14:53
  • To be more precise, the camera is supposed to be fixed. –  Sep 13 '18 at 14:54
  • So, given a region of an image and a direction, can you apply motion blur on that region in that direction? Given a sequence of frames, can you use optical flow to generate regions of the image and its movement? What happens if you compose those two subtasks? Or, in short, what have you tried, and what went wrong? – Yakk - Adam Nevraumont Sep 13 '18 at 15:08
  • Low framerate doesn't cause motion blur. Long exposure times do. – Dan Mašek Sep 13 '18 at 15:11
  • @Yakk-AdamNevraumont I am currently exploring this solution, in particular the dense optical flow. I'll give you feedback. –  Sep 13 '18 at 16:43
  • @DanMašek you're right. But my problem remains the same –  Sep 13 '18 at 16:45

2 Answers

2

Not so easy.

You can indeed try with optical flow. Estimate the flows between every pair of frames. Blur the frames in the direction of motion (for instance anisotropic Gaussian), with a filter extent equivalent to the displacement. Finally, blend the blurred images and the background by forming a weighted average where every frame gets more weight where it moves more.
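Here is a minimal sketch of that idea in Python/OpenCV. It is simplified: instead of a true per-pixel anisotropic Gaussian it uses one dominant blur direction and length taken from the median flow of the moving pixels, and the function name, Farneback parameters and weighting scheme are assumptions of this sketch, not part of the answer above:

    import cv2
    import numpy as np

    def blur_moving_regions(frame1, frame2, mag_thresh=1.0):
        # Dense optical flow between two consecutive frames (Farneback)
        g1 = cv2.cvtColor(frame1, cv2.COLOR_BGR2GRAY)
        g2 = cv2.cvtColor(frame2, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(g1, g2, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])

        moving = mag > mag_thresh
        if not moving.any():
            return frame2  # nothing moves, nothing to blur

        # One dominant blur direction/length for the moving pixels
        # (a simplification of the per-pixel anisotropic Gaussian)
        length = max(int(np.median(mag[moving])) | 1, 3)   # odd kernel size
        angle = np.degrees(np.median(ang[moving]))

        # Line-shaped kernel along the motion direction
        kernel = np.zeros((length, length), np.float32)
        kernel[length // 2, :] = 1.0
        rot = cv2.getRotationMatrix2D(((length - 1) / 2, (length - 1) / 2), angle, 1.0)
        kernel = cv2.warpAffine(kernel, rot, (length, length))
        kernel /= kernel.sum() + 1e-6

        blurred = cv2.filter2D(frame2, -1, kernel)

        # Weight map: more blur where there is more motion, smoothed to avoid hard seams
        weight = np.clip(mag / (np.percentile(mag[moving], 95) + 1e-6), 0.0, 1.0)
        weight = cv2.GaussianBlur(weight, (21, 21), 0)[..., None]

        # Blend: sharp where static, blurred where moving
        out = (1.0 - weight) * frame2 + weight * blurred
        return out.astype(np.uint8)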

0

You need to have a [0,1] alpha mask for the object. Then you can use a directional filter to blur the object and its mask, for example as done here: https://www.packtpub.com/mapt/book/application_development/9781785283932/2/ch02lvl1sec21/motion-blur
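As a rough sketch of such a directional filter (a simple horizontal line kernel, similar in spirit to the linked article; fg and alpha_mask are assumed to be the object crop and its mask, and the blur length is an arbitrary choice):

    import cv2
    import numpy as np

    ksize = 15                                    # blur length in pixels (assumed)
    kernel = np.zeros((ksize, ksize), np.float32)
    kernel[ksize // 2, :] = 1.0 / ksize           # horizontal motion-blur kernel

    # Blur the object and its mask with the same kernel so they stay aligned
    fg = cv2.filter2D(fg, -1, kernel)
    alpha_mask = cv2.filter2D(alpha_mask, -1, kernel)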

Then use the blurred mask to alpha blend the blurred object back into the original unblurred scene (or another scene):

    # Blend the alpha-masked region of the blurred foreground over the background.
    # fg: blurred foreground, alpha_mask: blurred [0,255] mask with the same
    # number of channels as the images, image: background scene.
    import cv2
    import numpy as np

    foreground = fg.astype(float)
    background = image.astype(float)
    # Normalize alpha_mask to [0, 1]
    alpha = alpha_mask.astype(float) / 255
    # Multiply the foreground with the alpha mask
    foreground = cv2.multiply(alpha, foreground)
    # Multiply the background with (1 - alpha)
    background = cv2.multiply(1.0 - alpha, background)
    # Add the masked foreground and background, convert back to an 8-bit image
    composite_image = cv2.add(foreground, background).astype(np.uint8)
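Note that cv2.multiply expects alpha and the images to have the same shape, so a single-channel mask has to be expanded to three channels first (for example with cv2.merge([alpha_mask] * 3)) before blending.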