
I'm working on an OpenCV project, and I'm on to calibration. I believe I've implemented the code correctly; however, I'm getting different values for the camera matrix between runs, sometimes wildly varying. After 6 repetitions, each showing the calibration pattern 10 times, I get (decimals truncated for clarity):

[573, 0,  386;
  0, 573, 312;
  0,  0,   1]

[642, 0,  404;
  0, 644, 288;
  0,  0,   1]

[664, 0,  395;
  0, 665, 272;
  0,  0,   1]

[629, 0,  403;
  0, 630, 288;
  0,  0,   1]

[484, 0,  377;
  0, 486, 307;
  0,  0,   1]

[644, 0,  393;
  0, 643, 289;
  0,  0,   1]

These values differ by unacceptable amounts. I need to know the parameters to a decent degree of accuracy. What typically causes such large inaccuracies, and how can I evaluate the correctness of a given matrix? The results seem to depend on the variety of distances and orientations I show the pattern from, but I can't make sense of the pattern.

Code:

#include <opencv2/opencv.hpp> // missing from the original listing

using namespace cv;
using namespace std;

int main(int, char**)
{
    VideoCapture cap(1);
    if(!cap.isOpened())
        return -1;

    cap.set(CV_CAP_PROP_FRAME_WIDTH,800);
    cap.set(CV_CAP_PROP_FRAME_HEIGHT,600);
    Mat edges;
    Size size(9,17);

    int counter = 10; // number of board views to collect before calibrating

    vector<Point2f> corners;
    bool found;

    vector<Point3f> chess = fr::ChessGen::getBoard(size,1,true); // project-local helper: ideal board coordinates

    vector<vector<Point3f> > objectPoints;
    vector<vector<Point2f> > imagePoints;

    Mat camera = Mat::eye(3,3,CV_64F);
    Mat distortion = Mat::zeros(8, 1, CV_64F);
    vector<Mat > rvecs;
    vector<Mat > tvecs;

    namedWindow("edges",1);
    for(;;)
    {
        Mat frame;
        cap >> frame;
        cvtColor(frame, edges, CV_BGR2GRAY);

        found = findCirclesGrid(edges,size,corners
                                ,CALIB_CB_ASYMMETRIC_GRID
                                );
        if(found) frame.convertTo(edges,-1,0.2);

        drawChessboardCorners(edges,size,corners,found);

        imshow("edges", edges);
        if(found){
            if(waitKey(200)>=0){ // key pressed: accept this view
                objectPoints.push_back(chess);
                imagePoints.push_back(corners);
                if(--counter<= 0)
                    break;
            }
        }
        else waitKey(30);
    }

    // note: the return value (RMS reprojection error) is discarded here
    calibrateCamera(objectPoints,imagePoints,Size(800,600),camera,distortion,rvecs,tvecs,0);

    if(found) imwrite("/home/ryan/snapshot.png",edges);

    cout << camera << endl;

    return 0;
}
Ryan Kennedy
  • What reprojection error do you get in each case? – Sassa Nov 30 '12 at 18:00
  • There is sample code for calibration, with documentation, at http://docs.opencv.org/doc/tutorials/calib3d/camera_calibration/camera_calibration.html; also take note of what Martin Beckett said. – chipmunk Dec 02 '12 at 18:49
  • Could you post some of the image sets you used? Just to make sure we have the right picture... – tjltjl Dec 06 '12 at 16:53
  • The images were taken from a live camera stream. In each frame, the program would search for the board. If it found it, the counter would decrement and the image would be added to the list of calibration images. If the counter reached zero, the calibration would run. So that is one drawback: if I moved the board slowly enough that it did not blur between positions, then all 10 positions could potentially be very close to one another. – Ryan Kennedy Dec 06 '12 at 21:59
  • What does this line do? vector chess = fr::ChessGen::getBoard(size,1,true); and what is the namespace fr? I can't find that anywhere. – Pedro Batista Jan 04 '17 at 16:47
  • It's a utility function that generates a vector of chessboard coordinates: `fr::ChessGen::getBoard(2, 1, true) => [(0,0) (0,1) (0,2) (1,0) (1,1) (1,2) (2,0) (2,1) (2,2)]`. My bad; I thought I'd entirely generalized this code before posting. – Ryan Kennedy Jan 04 '17 at 19:07

4 Answers


It depends on the camera/lens and the accuracy you require, but you probably need more than 10 positions, and you need to cover a wider range of view angles.

I'm assuming from the 800x600 that this is a webcam with a simple wide-angle lens and lots of distortion. I would say you need 6-8 positions/rotations of the target at each of 3-4 different angles to the camera. You also need to make sure that the target and camera are fixed and don't move during an exposure. Again, assuming the camera has simple autogain, you should ensure the target is very well lit so that it uses a fast shutter speed and low gain.

One issue with the technique used by OpenCV is that it needs to see all the corners/dots on the target for a frame to be identified and used in the solution, so it's quite hard to get points near the corners of the image. You should check how many images were actually used in the calibration; it may be that it's only finding all the points on a few of the 10 images and basing the solution on that subset.

Martin Beckett

It is also important not to show the pattern only perpendicular to the camera, but to rotate it. To improve the quality of the results you can also closely check the positions of the detected corners, remove the pictures where some corners were not detected correctly, and run the algorithm again.

I don't know which camera you're using, but with cameras that suffer from heavy distortion or insufficient sharpness, the corners can become hard to detect correctly. OpenCV calibration can also be performed with a circle pattern, which gives better results in this case.

Étienne
    You wrote: *"check closely the position of the detected corners, remove the pictures where some corners were not correctly detected and run the algorithm again"*. This is a **must**: OpenCV's algorithm minimizes the sum of squared reprojection errors and is not robust to outliers, so wrongly detected points in an image will produce errors in the result (especially if there are few images). – Milo Feb 08 '13 at 10:12

In my experience, you should calibrate using undistorted images, produced with the undistort() function provided by OpenCV.

That means you run the calibration twice: the first time to determine the lens coefficients, then undistorting each chessboard frame for the second run. The focal lengths fx and fy become more precise when estimated from undistorted calibration frames.

YuZ

According to Zhengyou Zhang's paper, the proper angle is below 45 degrees, and there must be more than 6 pictures; you'd better take about 20 of them. You also need to pay attention to the lighting balance and intensity.

michael_stackof