I'm currently trying to calibrate a stereo endoscope using the OpenCV framework and am having trouble interpreting the results I'm getting.
I collected a clean dataset of 70 chessboard target images (size: 7x6), distributed approximately uniformly over the visible working space of about 30-110 mm in front of the endoscope, with random rotations. Clean in this context means that the corners returned by OpenCV's findChessboardCorners function are as accurate as they can get, given the image quality.
When using the standard OpenCV stereoCalibrate() function without any modification to the default flags, I get reasonable results, with reprojection errors mostly < 1 px.
The only issue I have with the result is that some of the parameters are not what I expected. Most notably, I know for a fact that the two cameras have a toe-in angle (rotation about the y-axis) of about 3.9°, whereas the OpenCV calibration routine always returns angles < 1.0°.
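For what it's worth, this is roughly how I read the toe-in angle off the rotation matrix R returned by stereoCalibrate() (a minimal numpy sketch; the helper name is made up, and it assumes R is close to a pure y-axis rotation so the Z-Y-X Euler decomposition is well behaved):

```python
import numpy as np

def toe_in_deg(R):
    """y-axis angle of R in degrees, Z-Y-X Euler convention (R = Rz @ Ry @ Rx)."""
    # for R = Rz @ Ry @ Rx, the entry R[2, 0] equals -sin(beta),
    # where beta is the rotation about the y-axis
    return np.degrees(np.arcsin(-R[2, 0]))

# sanity check with a synthetic 3.9 degree rotation about y
th = np.radians(3.9)
Ry = np.array([[ np.cos(th), 0.0, np.sin(th)],
               [ 0.0,        1.0, 0.0       ],
               [-np.sin(th), 0.0, np.cos(th)]])
print(toe_in_deg(Ry))  # -> 3.9
```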
I experimented a bit and found that the only time the calibration returns the correct toe-in angle is when I completely disable radial distortion (passing the flags cv2.CALIB_FIX_K1 | cv2.CALIB_FIX_K2 | cv2.CALIB_FIX_K3, which keep k1 = k2 = k3 fixed at zero) and only allow tangential distortion (the p1 and p2 parameters). However, doing so raises the reprojection error to ~2 px. Every other distortion model I use (and I think I tried all possible permutations) results in the wrong angles, while still achieving subpixel reprojection errors.
Does anyone have a reasonable explanation for this? I'm quite confused by the fact that I get such low reprojection errors even though the stereo camera parameters are definitely wrong.