
I am working on drone stabilization close to walls using a camera. For this to work I need to extract the motion the camera makes relative to the wall. So far I have used an expanded OpenCV example that calls goodFeaturesToTrack to find feature points in every frame. These feature points are then tracked into the next frame with calcOpticalFlowPyrLK, which implements the pyramidal Lucas-Kanade method. I then subtract the point locations to calculate the per-frame displacement, and sum all displacements to get the total displacement from the first frame (with some averaging and filtering in between).

The results I get do not look like the motion of the camera at all; the accumulated displacement drifts in seemingly random directions. Does anybody have an idea what's going wrong? Am I using the wrong algorithm for a problem like this?

General Grievance
  • Your camera could change the distance from the wall. How do you calculate that displacement? – jasal May 26 '14 at 12:19
  • You need to find the external camera parameters (calibration) of your camera. Using optical flow you track the features in 2D image space. If you can track the same features over multiple images, you can use that information to compute the calibration. I've no deep experience in that topic, so I can't help further. – Micka May 26 '14 at 12:44
  • It's almost impossible to tell what you are doing wrong without some sample information. If I had to guess, I'd say it would have something to do with the combination of rotation & translation. – Nallath May 26 '14 at 15:23
  • I think the standard example from opencv samples/cpp/phase_corr.cpp should help. – Andrey Smorodov May 27 '14 at 06:06

0 Answers