I want to calculate the depth error for my system using this formula -

[Image: the depth-error formula — presumably the standard stereo relation ΔZ = (Z² / (b·f)) · Δd, where Z is the depth, b the baseline, f the focal length in pixels, and Δd the disparity error in pixels.]

Here, I need to estimate the value of the disparity error before being able to calculate the actual depth error, and the disparity error depends on the stereo-matching algorithm. However, if I am using an active-light system (say, projecting a laser line onto the object of interest), will the disparity error be very small? It obviously cannot be zero, because that's not how things work.
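For reference, a minimal sketch of the error propagation, assuming the formula in question is the standard stereo relation ΔZ = (Z² / (b·f)) · Δd; the function name and the numbers below are illustrative, not from the question:

```python
# Sketch: depth error from disparity error, assuming the standard
# stereo relation  Z = b * f / d  and its error propagation
#   dZ = (Z**2 / (b * f)) * dd
# b = baseline (m), f = focal length (px), dd = disparity error (px).

def depth_error(z, baseline, focal_px, disparity_error_px):
    """Depth uncertainty (same units as z) at depth z."""
    return (z ** 2 / (baseline * focal_px)) * disparity_error_px

# Example: b = 0.1 m, f = 700 px, object at 2 m, 0.25 px disparity error.
err = depth_error(2.0, 0.1, 700.0, 0.25)
print(f"depth error at 2 m: {err * 1000:.1f} mm")  # -> 14.3 mm
```

Note that the depth error grows quadratically with depth, so even a sub-pixel disparity error becomes significant for distant objects.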

r4bb1t
  • Disparity depends on a lot of things, including the matching precision, the distance of the object to the camera, the camera intrinsic and distortion matrices, the size of the CCD, the size of each pixel on the CCD (mm), and the precision of the focal length. With a fixed camera, the best approach is to calibrate the stereo pair with many chessboard images, then use multiple laser points close to each other for matching the pair of images. – MeiH Jan 04 '20 at 14:31

1 Answer

Disparity error tends to change between scenes, irrespective of the ROI you are trying to narrow down. It is purely algorithm dependent, so an algorithm can be evaluated on datasets for which the ground-truth disparity is known; the disparity error can then be determined from those evaluations.

Such evaluations can be done with the KITTI and Middlebury datasets, which provide ground-truth disparity for left/right stereo pairs. http://vision.middlebury.edu/stereo/code/
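As a sketch of how such an evaluation typically works — the metric names, function, and array contents below are illustrative; real ground truth comes from the datasets above:

```python
import numpy as np

# Sketch: common disparity-error metrics given a predicted disparity
# map and a ground-truth map (both in pixels). Invalid ground-truth
# pixels are often marked 0 or inf and must be masked out.

def disparity_metrics(pred, gt, bad_thresh=1.0):
    valid = np.isfinite(gt) & (gt > 0)          # mask invalid ground truth
    abs_err = np.abs(pred[valid] - gt[valid])
    return {
        "mae": abs_err.mean(),                            # mean absolute error (px)
        "bad_pct": (abs_err > bad_thresh).mean() * 100,   # "bad pixel" percentage
    }

# Toy example with made-up values:
gt = np.full((4, 4), 10.0)
pred = gt + np.random.default_rng(0).normal(0, 0.5, gt.shape)
print(disparity_metrics(pred, gt))
```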

Madhu Soodhan
  • Most of the algorithms give the disparity in terms of 'pixels' so how can we translate this relative measure to an absolute measure? – r4bb1t Jan 02 '20 at 00:32
  • You can create your own image with just white and black pixels; no complex structures, a simple square will do. Consider it to be the left image. Manually shift it by some 'n' columns to form the right image. Now feed it to the algorithm; it should give the output disparity as 'n' throughout the image. You can consider any deviation from 'n' as disparity error. This is scene independent and purely algorithm dependent. E.g., if your manual shift is 10 pixels and the disparity from the algorithm is 10.5 pixels, error = 0.5 pixels. – Madhu Soodhan Jan 02 '20 at 07:19
  • No, I meant in terms of, say, the camera sensor. Because each sensor has a different size, each pixel on each camera sensor will be different. So will the algorithm's performance vary according to the size of the sensor array? This expression of disparity in pixel terms is still a relative measure, right? – r4bb1t Jan 02 '20 at 10:16
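The synthetic-shift test described in the comments can be sketched as follows. The block matcher here is a minimal SAD search written for illustration only, not a production algorithm; in practice you would plug in something like OpenCV's StereoBM/StereoSGBM in its place:

```python
import numpy as np

# Build a left image, shift it by a known number of columns to get the
# "right" image, run a toy block matcher, and measure the deviation of
# the recovered disparity from the known shift.

def toy_block_match(left, right, max_disp=16, block=5):
    """Minimal SAD block matcher: for each left pixel, find the integer
    disparity d minimizing sum(|left_patch - right_patch_shifted_by_d|)."""
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w))
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            costs = [np.abs(patch - right[y - half:y + half + 1,
                                          x - d - half:x - d + half + 1]).sum()
                     for d in range(max_disp)]
            disp[y, x] = int(np.argmin(costs))
    return disp

rng = np.random.default_rng(0)
left = rng.random((40, 60))                  # random texture, easy to match
true_shift = 7
right = np.roll(left, -true_shift, axis=1)   # right image: left shifted by 7 px

disp = toy_block_match(left, right)
region = disp[10:30, 30:50]                  # interior, away from borders
print("mean disparity:", region.mean())      # should be ~7
print("disparity error:", np.abs(region - true_shift).mean())
```

Since the right image is an exact shifted copy, the SAD cost at the true disparity is exactly zero and the recovered disparity equals the shift; with a real camera pair, noise and occlusions make the residual error non-zero.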