When showing the extrinsic parameters of calibration (the 3D model including the camera position and the position of the calibration checkerboards), the toolbox does not include units for the axes. It seemed logical to assume that they are in mm, but the z values displayed cannot possibly be correct if they are indeed in mm. I'm assuming that there is some transformation going on, perhaps having to do with optical coordinates and units, but I can't figure it out from the documentation. Has anyone solved this problem?
-
Could it be pixels? What did you find in [the documentation](http://www.vision.caltech.edu/bouguetj/calib_doc/htmls/parameters.html)? Why is this tagged **python**? – Schorsch Aug 02 '13 at 17:25
-
That was a mistake; I removed the tag. The documentation only talks of pixels, and my linear algebra is too weak to understand what's going on. I do think it could be pixels - pixel size is constant and determined by a camera's sensor, correct? Not sure yet how to relate this to distances though. – blaughli Aug 02 '13 at 17:28
-
I don't think that it's pixels. My z value is 990, and I've got 5.5 microns per pixel on the sensor. By my calculations, that converts to z ≈ 0.54 cm, which is far too close. The object was at around 50 cm from the sensor. If I had z = 54 cm, I'd be more hopeful - any ideas? – blaughli Aug 02 '13 at 19:32
2 Answers
If you specified the side length of your checkerboard squares in mm, then the z-distance shown will be in mm.
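This isn't the MATLAB toolbox itself, but the same convention holds in OpenCV-style calibration: the extrinsic translation comes out in whatever units you use to describe the board. A minimal numpy sketch (the function name and board dimensions are made up for illustration):

```python
import numpy as np

def checkerboard_object_points(rows, cols, square_size):
    """Planar checkerboard corner coordinates (z = 0).

    The units of `square_size` (mm, m, ...) are the units the
    calibration's extrinsic translation will be reported in.
    """
    grid = np.zeros((rows * cols, 3))
    grid[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * square_size
    return grid

# The same physical board described in mm and in metres: identical
# geometry, so every downstream quantity (including z) just rescales.
pts_mm = checkerboard_object_points(6, 9, 30.0)   # 30 mm squares
pts_m = checkerboard_object_points(6, 9, 0.030)   # same board in metres
```

Feeding `pts_mm` versus `pts_m` into a solver changes nothing but the unit attached to the recovered camera position.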

I know next to nothing about MATLAB's tracking utilities (not entirely true, but I avoid MATLAB wherever I can, which is almost always possible), but here's some general info.

Pixel dimension on the sensor has nothing to do with the size of a pixel on screen, or in model space. For all practical purposes a camera produces a picture that has no meaningful units, and a tracking process is unaware of the scale of the scene (the perspective projection takes care of that). You can re-insert a scale by taking two tracked points and measuring the distance between them; the solver-space distance is pretty much arbitrary. If you know the real distance between these points, you can get a conversion factor:

real distance / solver space distance

There's really no way of knowing this distance from the camera's settings, because the camera is unable to differentiate between different scales of scene: a perfect 1:100 replica is no different to the solver than the real deal. So you must always relate to something you can measure separately for each measuring session. The camera always produces something that's relative in nature.
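The conversion factor above can be sketched in a few lines; the point coordinates and the 50 cm reference measurement here are hypothetical numbers, not from the question:

```python
import numpy as np

def scale_factor(p_a, p_b, real_distance):
    """Conversion factor from solver-space units to real-world units,
    given two reconstructed points and their measured real separation."""
    solver_distance = np.linalg.norm(np.asarray(p_a) - np.asarray(p_b))
    return real_distance / solver_distance

# Two tracked points 2.5 solver units apart that were measured to be
# 50 cm apart in the real scene.
s = scale_factor([0.0, 0.0, 0.0], [1.5, 2.0, 0.0], 50.0)  # -> 20.0 cm per solver unit
```

Multiplying every solver-space coordinate by `s` puts the whole reconstruction in the units of your reference measurement.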

-
Hi, thanks for your thoughts. I came to the same conclusion, and I'll be taking some more images with an object of known dimensions. The strange thing is that the provided triangulation function accurately determines the distance between points in my images. Unfortunately, there seems to be a scale factor that's off (I think it's `focal length/distance`). Maybe this is only solvable with some real measurements. – blaughli Aug 05 '13 at 22:08