
I have problems calibrating two cameras: the first is RGB and the second is infrared. They have different resolutions (I resized and cropped the bigger image), focal lengths, etc.

Examples:

RGB 1920x1080

Infrared 512x424

How do I calibrate them to each other? What parameters should I use in stereoCalibrate? The default stereo_calib.cpp sample produces a very large error. Something like this: https://www.dropbox.com/s/x57rrzp1ejm3cac/%D0%A1%D0%BA%D1%80%D0%B8%D0%BD%D1%88%D0%BE%D1%82%202014-04-05%2012.54.17.png

done with RMS error=4.1026

average reprojection err = 10.2601

UPDATE

I generated calibration parameters for each camera independently using the calibration.cpp example. For the RGB camera I first resize and crop the image so that its resolution matches the IR camera (512x424), then calibrate. For the RGB camera I get camera.yml, and for the IR camera I get camera_ir.yml. Then I try to do the stereo calibration using a modified stereo_calib.cpp example. Before calling stereoCalibrate I read the camera_matrix and distortion_coefficients parameters for both cameras from these files and pass the matrices to stereoCalibrate.

// Load the intrinsics produced by the independent calibration of each camera.
FileStorage rgbCamSettings("camera.yml", CV_STORAGE_READ);
Mat rgbCameraMatrix;
Mat rgbDistCoeffs;
rgbCamSettings["camera_matrix"] >> rgbCameraMatrix;
rgbCamSettings["distortion_coefficients"] >> rgbDistCoeffs;

FileStorage irCamSettings("camera_ir.yml", CV_STORAGE_READ);
Mat irCameraMatrix;
Mat irDistCoeffs;
irCamSettings["camera_matrix"] >> irCameraMatrix;
irCamSettings["distortion_coefficients"] >> irDistCoeffs;

Mat cameraMatrix[2], distCoeffs[2];
cameraMatrix[0] = rgbCameraMatrix;
cameraMatrix[1] = irCameraMatrix;
distCoeffs[0] = rgbDistCoeffs;
distCoeffs[1] = irDistCoeffs;

Mat R, T, E, F;

// Keep the loaded intrinsics fixed and estimate only the rotation R and
// translation T (plus E and F) between the two cameras.
double rms = stereoCalibrate(objectPoints, imagePoints[0], imagePoints[1],
                cameraMatrix[0], distCoeffs[0],
                cameraMatrix[1], distCoeffs[1],
                imageSize, R, T, E, F,
                TermCriteria(CV_TERMCRIT_ITER+CV_TERMCRIT_EPS, 50, 1e-6),
                CV_CALIB_FIX_INTRINSIC +
                CV_CALIB_USE_INTRINSIC_GUESS
                );
DmT021
  • Two comments: 1. You don't need to resize and crop the RGB image to match the dimensions of the IR image, unless you want to use that specific size afterwards. 2. This is the code for the calibration part, which seems fine; can you also add the code where you compute the final image that you linked? – BConic Apr 07 '14 at 10:30
  • The code where I compute the final image is exactly the same as in this example: https://github.com/Itseez/opencv/blob/master/samples/cpp/stereo_calib.cpp – DmT021 Apr 07 '14 at 10:59
  • @aldurdisciple Hm, about (1): I tried it, and now the result is much better, but still not good. http://i.imgur.com/6iW6f5T.png "done with RMS error=2.55814 average reprojection err = 9.21851" – DmT021 Apr 07 '14 at 11:02
  • OK, how do you detect the 2D image points and how did you define the 3D object points? How many images did you use, just the pair above or more than one pair? – BConic Apr 07 '14 at 11:12
  • @AldurDisciple I used 16 pairs of images. I detect points in the images with findChessboardCorners using the CV_CALIB_CB_ADAPTIVE_THRESH and CV_CALIB_CB_NORMALIZE_IMAGE flags. The 3D points are defined like this: Point3f(j*squareSize, k*squareSize, 0), where squareSize is 22 mm (as on my printed chessboard) and j, k are the corner indexes (see the sketch after these comments). My full listing is here: http://pastebin.com/k9F9rXZV. My images are here: http://imgur.com/a/mx6JE – DmT021 Apr 07 '14 at 11:29
  • Does the initial estimation of the camera matrices and distortion coefficients, for both cameras, result in a low error? – BConic Apr 07 '14 at 11:46
  • @AldurDisciple For RGB: "RMS error reported by calibrateCamera: 0.339427 avg reprojection error = 0.34" For IR: "RMS error reported by calibrateCamera: 0.762042 Calibration succeeded. avg reprojection error = 0.76" – DmT021 Apr 07 '14 at 11:59
  • OK, assuming you made sure that both cameras were _absolutely_ still while you acquired the images, it seems to me that you're doing everything right. I've done this process several times for the calibration of a pair of RGB & IR cameras: this should work the same way as with two normal cameras. One last thing you could try is to generate new pairs of images and try again, as sometimes the calibration process fails due to images which seem perfectly normal. – BConic Apr 07 '14 at 12:01
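
For reference, here is a minimal sketch of the corner detection and object-point construction described in the comments above (the full listing linked there has the actual code). The OpenCV calls are the ones named in the comments; the function name collectBoardPoints and its exact signature are illustrative assumptions:

#include <opencv2/opencv.hpp>
#include <vector>
using namespace cv;

// Detect chessboard corners in one grayscale view and build the matching
// 3D object points in the board frame (Z = 0), one point per inner corner.
static bool collectBoardPoints(const Mat& gray, Size boardSize, float squareSize,
                               std::vector<Point2f>& corners,
                               std::vector<Point3f>& boardPoints)
{
    bool found = findChessboardCorners(gray, boardSize, corners,
            CV_CALIB_CB_ADAPTIVE_THRESH | CV_CALIB_CB_NORMALIZE_IMAGE);
    if (!found)
        return false;

    // Refine the detected corners to sub-pixel accuracy before calibration.
    cornerSubPix(gray, corners, Size(11, 11), Size(-1, -1),
            TermCriteria(CV_TERMCRIT_ITER + CV_TERMCRIT_EPS, 30, 0.01));

    // 3D points exactly as in the comment: Point3f(j*squareSize, k*squareSize, 0).
    boardPoints.clear();
    for (int k = 0; k < boardSize.height; k++)
        for (int j = 0; j < boardSize.width; j++)
            boardPoints.push_back(Point3f(j * squareSize, k * squareSize, 0));
    return true;
}

Each image pair for which this succeeds in both views contributes one entry to objectPoints and one entry to each of imagePoints[0] and imagePoints[1].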

1 Answer


Can't see your images on Dropbox (why not put them directly on Stack Exchange?), but it seems like the bundle adjustment does not converge. You should try the following:

  1. Calibrate each camera independently using cv::calibrateCamera (link) and get the camera matrix K and distortion coefficients D for each camera.

  2. Estimate the rotation R and the translation T between the two cameras using cv::stereoCalibrate (link) with the estimated K and D and with flags CV_CALIB_USE_INTRINSIC_GUESS and CV_CALIB_FIX_INTRINSIC enabled.

Doing so will decouple the estimation of both camera matrices and distortion coefficients from the estimation of the rotation and translation, which should improve the residual error a lot.
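
For illustration, a minimal sketch of those two steps using the same variable names as the question; objectPoints, imagePoints and imageSize are assumed to be already filled from the chessboard detections, and the rvecs/tvecs outputs are only there because calibrateCamera requires them:

// Step 1: calibrate each camera independently (only K and D matter here).
Mat cameraMatrix[2], distCoeffs[2];
std::vector<Mat> rvecs, tvecs;
calibrateCamera(objectPoints, imagePoints[0], imageSize,
                cameraMatrix[0], distCoeffs[0], rvecs, tvecs);
calibrateCamera(objectPoints, imagePoints[1], imageSize,
                cameraMatrix[1], distCoeffs[1], rvecs, tvecs);

// Step 2: with the intrinsics fixed, stereoCalibrate only has to estimate
// the rotation R and translation T between the two cameras.
Mat R, T, E, F;
double rms = stereoCalibrate(objectPoints, imagePoints[0], imagePoints[1],
                cameraMatrix[0], distCoeffs[0],
                cameraMatrix[1], distCoeffs[1],
                imageSize, R, T, E, F,
                TermCriteria(CV_TERMCRIT_ITER + CV_TERMCRIT_EPS, 100, 1e-6),
                CV_CALIB_USE_INTRINSIC_GUESS + CV_CALIB_FIX_INTRINSIC);

Note that passing a single imageSize assumes both sets of images have the same resolution, as in the question where the RGB frames were cropped and resized to the IR resolution; otherwise each calibrateCamera call should receive the size of its own images.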

BConic
  • I did what you suggested. First I calibrated the cameras independently using the calibration.cpp example. Then I used the camera matrices and distortion matrices in stereoCalibrate, with the flags you mentioned. But the result is still bad: http://i.imgur.com/OVoVaLd.png – DmT021 Apr 07 '14 at 09:33
  • 1
    @DmT021 Can you edit your question and add the code you used to generate this rectified image? – BConic Apr 07 '14 at 09:55
  • Hi @AldurDisciple, I am working on the calibration of a Kinect One and I have the same setup as DmT021: an RGB camera with resolution 1920x1080 and an IR/depth camera with resolution 512x424. What I want is to manually calibrate the two cameras so that I can map the pixel values of the RGB camera onto the depth values in order to create an RGB-D map/image. Noticing from this thread and others that you seem to have some experience with this, I followed your suggestions above. – ttsesm Dec 07 '15 at 09:50
  • While the individual calibration gives me quite good results, i.e. RGB -> ~0.3 RMS and IR -> ~0.1 RMS, when I pass the extracted intrinsics/distCoeffs to stereoCalibrate the RMS is always over 1; the best I managed was 1.2, with an epipolar error around 6. I am trying to figure out what I am doing wrong. Do you have any idea? Moreover, one of the parameters of stereoCalibrate() is the image size; which image size should I use, considering that my sensors have different resolutions? – ttsesm Dec 07 '15 at 09:51
  • Now I am using the smaller one, 512x424. Bear in mind that I am not resizing the 1920x1080 images to 512x424; should I do that? – ttsesm Dec 07 '15 at 09:51