
I have a hybrid stereo setup. One camera is a 3D ToF camera, from which I can export a depth map or a greyscale point cloud; the other is a Raspberry Pi V2 RGB camera. I have done stereo calibration using the MATLAB Stereo Camera Calibrator.

  1. How can I estimate a disparity map between the two cameras using the depth map or the point cloud? Or, how can I properly align the two scenes so that the result is a point cloud with its corresponding colors?

Right now I have tried this code:

%% Depth map
z = double(depth(:,:,2))/255;                        % normalise depth channel to [0,1]
[x,y] = ndgrid(1:size(depth,1), 1:size(depth,2));    % pixel grid (fixed typo: was size(depth2,2))
scene_tran = imtranslate(scene, [-15, 40], 'FillValues', 255);  % rough manual shift of the RGB image
c = double(scene_tran)/255;                          % colours in [0,1] for scatter3
%c = flip(c,1);
%c = flip(c,2);
figure(1);
scatter3(flip(x(:)), y(:), z(:), 6, reshape(c,[],3), '.')
axis vis3d
cameratoolbar('Show')
cameratoolbar('SetMode','orbit')
xlabel('x'); ylabel('y'); zlabel('z');
view([0,0,-90])
camroll(90);
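Instead of plotting the raw pixel grid as x/y, the depth map can be back-projected into metric 3D points using the ToF camera's intrinsics. The sketch below is a minimal example; `fx`, `fy`, `cx`, `cy` (focal lengths and principal point of the ToF camera) and `depthM` (the depth map already scaled to metres) are assumed to come from your calibration and are not in the original code.

```matlab
% Back-project the ToF depth map to 3D points in the ToF camera frame.
% Assumes: fx, fy, cx, cy are the ToF intrinsics from calibration,
% and depthM is the depth map in metres (H-by-W).
[u, v] = meshgrid(1:size(depthM,2), 1:size(depthM,1));  % pixel coordinates
X = (u - cx) .* depthM ./ fx;   % standard pinhole back-projection
Y = (v - cy) .* depthM ./ fy;
Z = depthM;
pts = [X(:), Y(:), Z(:)];       % N-by-3 point list in the ToF frame
```

These metric points are what you would then transform into the RGB camera frame with the stereo extrinsics, rather than shifting images in pixel space.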


%% Point cloud
figure(2);
% Color must be uint8 (0-255) and have one row per point
cl1.Color = reshape(permute(scene, [2 1 3]), [], 3);
pcshow(pcdenoise(cl1))
view([0,0,-90])
xlabel('x'); ylabel('y'); zlabel('z');

cl1 is the point cloud. I tried to align the scenes by simply translating the RGB image, but I know that is not a precise way to do it. Any help will be appreciated.
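Since you already have a stereo calibration, the precise way to colour the cloud is to transform the ToF points into the RGB camera frame with the calibrated extrinsics and project them through the RGB intrinsics, sampling the colour at each projected pixel. The sketch below assumes `pts` is an N-by-3 list of ToF points in metres, `R`/`t` are the rotation and translation from the ToF to the RGB camera (from your MATLAB stereo calibration; depending on the toolbox convention you may need to transpose `R` or negate `t`), and `fxRGB`, `fyRGB`, `cxRGB`, `cyRGB` are the RGB intrinsics. None of these names are in the original code.

```matlab
% Transform ToF points into the RGB camera frame and project them.
ptsRGB = (R * pts')' + t;                       % N-by-3, RGB camera frame
uRGB = fxRGB * ptsRGB(:,1) ./ ptsRGB(:,3) + cxRGB;
vRGB = fyRGB * ptsRGB(:,2) ./ ptsRGB(:,3) + cyRGB;

% Keep only points that project inside the RGB image and lie in front of it
valid = ptsRGB(:,3) > 0 & ...
        uRGB >= 1 & uRGB <= size(scene,2) & ...
        vRGB >= 1 & vRGB <= size(scene,1);

% Sample the colour at each projected pixel (nearest neighbour)
idx = sub2ind([size(scene,1), size(scene,2)], ...
              round(vRGB(valid)), round(uRGB(valid)));
rgb = reshape(scene, [], 3);                    % H*W-by-3 colour list
pc = pointCloud(pts(valid,:), 'Color', rgb(idx,:));
pcshow(pc)
```

This replaces the manual `imtranslate` shift entirely: the extrinsics account for both the offset and the perspective difference between the two cameras, so the colours land on the correct 3D points regardless of depth.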

Regards Kuchx

  • The disparity map is basically a depth map. Disparity maps are computed in applications where you don't have an extrinsic calibration, so you cannot know distances; instead you measure how far apart similar pixels are from one camera to the other. If you do have stereo calibration, then a depth map is the ultimate disparity map. – Ander Biguri Apr 11 '18 at 10:06
