
I'm making a camera calibration toolkit. I'm using the cv::calibrateCamera(...) function from opencv2/calib3d.hpp, and it works fine for images taken with a fish-eye camera; at least the undistorted image looks correct. However, when I try it with images from my phone camera, it produces the strange results you can see below.

(result image)

Isn't that function suitable for calibrating this kind of camera? Which function should I use to calibrate it correctly?

rawrex
newww

1 Answer


Your pattern needs to be flat. I see that the sheet of paper curls. That's only a minor problem here though.

You must make sure to hold the camera still. Rolling shutter and motion together cause distortion that can't be calibrated away.

For good calibration you need to move the camera/pattern so the pattern reaches into the corners of the view, where lens distortion is most severe. You need to cover the whole view. Not in a single picture, but in all pictures taken together.

If you don't cover the corners, the numerical optimization will have no data there and can introduce distortion of its own, as you can see.

Those are "just" practical issues. You also need a theoretical foundation. Plugging APIs together doesn't give understanding. The standard book on the matter is

Multiple View Geometry in Computer Vision by Hartley & Zisserman

Christoph Rackwitz