
I am experimenting with undistorting images. I have the image below, and the result from the undistort function. The fisheye module doesn't work on it. Is this because my image isn't from a fisheye lens but from a wide-angle one instead? And in either case, how do I reduce the perspective distortion?

FYI I have lost the specs on this lens but got its intrinsics from a calibration routine.

[input image: distorted] [output image: undistorted] [better IR illumination]

user121903
  • that lens looks severely out of focus, and underexposed. I doubt that anyone could tell what can be seen in the pictures. – Christoph Rackwitz Oct 02 '21 at 19:07
  • these are bandpass filtered at ~835nm + and are optimized for IR tracking. The edges I am interested in appear pretty clearly. I can see the top one with curved edges and the bottom with straight. The keypoint isn't shown. Please let me know if you need something else to answer this question. – user121903 Oct 02 '21 at 19:29
  • I'll also post a new image with better illumination. – user121903 Oct 02 '21 at 19:36
  • Better illumination uploaded below # 3. Thx – user121903 Oct 02 '21 at 19:55
  • ah, infrared. okay. so the latest picture, undistorted, looks fairly straight to me, without checking with a ruler or anything. OpenCV's "fisheye" is also just <180 degrees (>=180 is "super-fisheye", not in opencv yet), but with more suitable equations for those lenses, compared to "regular" lenses for FoV in the range of 60-90 degrees – Christoph Rackwitz Oct 03 '21 at 00:03
  • So I can continue to use the undistort function instead of the fisheye::undistort function? – user121903 Oct 03 '21 at 00:16
  • from just looking, I can't judge that finally. the undistorted pictures you show here *look decent*. whether they are, you'll have to test/measure and judge for yourself against whatever you plan to do with it all. intrinsic calibration is often very sensitive to small things you do right/wrong when waving the calibration patterns around. – Christoph Rackwitz Oct 03 '21 at 02:08

1 Answer


The image seems just wide-angle, not fisheye. Images from a fisheye camera usually have black circular borders and look as if seen through a round hole. See picture c) below (from the OpenCV documentation):

The normal method of distinguishing wide-angle from fisheye is to check the FOV angle.

Given the camera intrinsic parameters (the cameraMatrix and distCoeffs from your calibration routine), you can compute a new camera intrinsic matrix with the largest FOV and no distortion by calling getOptimalNewCameraMatrix(). The FOV angle in the x-direction (usually larger than in the y-direction) is then `arctan(cx/fx) + arctan((width - cx)/fx)`, where fx is the focal length in the x-direction, cx is the x-coordinate of the principal point, and width is the image width.
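
For reference, here is a minimal sketch of that calculation in Python. The intrinsics below are made-up placeholders; substitute the cameraMatrix, distCoeffs, and image size from your own calibration:

```python
import numpy as np
import cv2

# Placeholder intrinsics; replace with the values from your calibration routine.
cameraMatrix = np.array([[420.0,   0.0, 640.0],
                         [  0.0, 420.0, 360.0],
                         [  0.0,   0.0,   1.0]])
distCoeffs = np.array([-0.30, 0.10, 0.0, 0.0, -0.02])
width, height = 1280, 720

# New camera matrix with the largest undistorted FOV (alpha = 1 keeps all pixels).
newCameraMatrix, _ = cv2.getOptimalNewCameraMatrix(
    cameraMatrix, distCoeffs, (width, height), 1)

fx = newCameraMatrix[0, 0]
cx = newCameraMatrix[0, 2]

# FOV in the x-direction: arctan(cx/fx) + arctan((width - cx)/fx)
fov_x = np.degrees(np.arctan(cx / fx) + np.arctan((width - cx) / fx))
print("horizontal FOV: %.1f deg" % fov_x)
```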

In my experience, when FOV < 80°, the Brown distortion model (k1, k2, k3, p1, p2) should be used. When 80° < FOV < 140°, the Rational model (k1–k6, p1, p2) should be used. And when 140° < FOV < 170°, the Fisheye model (k1, k2, k3, k4) should be used. A more complex model (with more parameters) has better fitting ability, but it also makes the calibration harder.
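
As a rough sketch, here is how the three models map onto OpenCV calls. The coefficient values are arbitrary placeholders; only the array layouts and the calibration flags matter:

```python
import numpy as np
import cv2

K = np.array([[420.0,   0.0, 640.0],
              [  0.0, 420.0, 360.0],
              [  0.0,   0.0,   1.0]])
img = np.zeros((720, 1280), np.uint8)  # stand-in for a real frame

# Brown model: (k1, k2, p1, p2, k3) -- the default output of cv2.calibrateCamera.
d_brown = np.array([-0.30, 0.10, 0.001, 0.001, -0.02])
und_brown = cv2.undistort(img, K, d_brown)

# Rational model: (k1, k2, p1, p2, k3, k4, k5, k6) -- obtained by calibrating
# with flags=cv2.CALIB_RATIONAL_MODEL; cv2.undistort accepts the 8-element vector.
d_rational = np.array([-0.30, 0.10, 0.001, 0.001, -0.02, 0.05, 0.01, -0.005])
und_rational = cv2.undistort(img, K, d_rational)

# Fisheye (equidistant) model: (k1, k2, k3, k4) -- calibrated with
# cv2.fisheye.calibrate and undistorted with the fisheye module.
d_fisheye = np.array([0.05, -0.01, 0.002, -0.0005])
und_fisheye = cv2.fisheye.undistortImage(img, K, d_fisheye, Knew=K)
```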

The Fisheye model performs better than the Rational model in very large FOV cases because it has higher-order radial parameters. But when the FOV angle is not very large, the Rational model is a better choice because it has tangential parameters.

The undistorted picture you provided looks fairly good. But if you care about precision, you should check the reprojection error, the epipolar error (for multi-camera setups), and even the collinearity error of the key points (checkerboard corners, circle centroids, or whatever features your calibration pattern uses) during the calibration procedure.
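
If it helps, here is a small sketch of the per-view RMS reprojection-error check, assuming objpoints/imgpoints are the 3D board points and detected 2D corners you fed to cv2.calibrateCamera, and rvecs/tvecs are the extrinsics it returned:

```python
import numpy as np
import cv2

def rms_reprojection_error(objpoints, imgpoints, rvecs, tvecs, K, dist):
    """RMS reprojection error in pixels over all calibration views."""
    total_sq_err, total_pts = 0.0, 0
    for obj, corners, rvec, tvec in zip(objpoints, imgpoints, rvecs, tvecs):
        # Project the board points into the image with the estimated pose and intrinsics.
        proj, _ = cv2.projectPoints(obj, rvec, tvec, K, dist)
        diff = corners.reshape(-1, 2) - proj.reshape(-1, 2)
        total_sq_err += float(np.sum(diff ** 2))
        total_pts += len(diff)
    return np.sqrt(total_sq_err / total_pts)
```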

VictorBian
  • I hope you don't mind me asking here, but your answer is very interesting and it might answer a problem that I also have. So if I wanted to find out the FOV from a camera matrix and distortion coefficients returned by OpenCV calibration, I need to use getOptimalNewCameraMatrix() before using the arctan formula? – Otter_warrior Nov 29 '21 at 17:31
  • Do you have a source explaining the `arctan(cx/fx)+arctan((width-cx)/fx)` formula? – Otter_warrior Nov 29 '21 at 17:39
  • @Otter_warrior The focal length from `getOptimalNewCameraMatrix()` is usually smaller than that in the original camera matrix (even when `alpha` is set to 0), so it will give you a larger FOV. I think the FOV calculated this way is closer to the real value of the lens. – VictorBian Nov 30 '21 at 09:16
  • @Otter_warrior Sorry, I don't have a source explaining this formula. The usual formula is `2*arctan(width*0.5/fx)`, but I think it's more accurate to take `cx` into consideration. – VictorBian Nov 30 '21 at 09:20
  • 1
    @Otter_warrior The FOV I described above is more close to the **physical** value of the lens. But in real application, it depends on **how you undistort your image**, i.e. the `newCameraMatrix` in `initUndistortRectifyMap()`. – VictorBian Nov 30 '21 at 09:29
  • thanks a lot for your help! – Otter_warrior Dec 01 '21 at 17:42
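
As a footnote to the last comments: the FOV of the undistorted output is determined by the newCameraMatrix passed when building the remap tables, not by the original camera matrix. A minimal sketch, again with placeholder intrinsics:

```python
import numpy as np
import cv2

K = np.array([[420.0,   0.0, 640.0],
              [  0.0, 420.0, 360.0],
              [  0.0,   0.0,   1.0]])
dist = np.array([-0.30, 0.10, 0.0, 0.0, -0.02])
w, h = 1280, 720
img = np.zeros((h, w), np.uint8)  # stand-in for a real frame

# alpha = 0 crops to valid pixels (smaller FOV); alpha = 1 keeps everything.
newK, _ = cv2.getOptimalNewCameraMatrix(K, dist, (w, h), 1)

# The FOV of the result is set by newK, the newCameraMatrix argument below.
map1, map2 = cv2.initUndistortRectifyMap(K, dist, None, newK, (w, h), cv2.CV_16SC2)
undistorted = cv2.remap(img, map1, map2, cv2.INTER_LINEAR)
```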