
I have a point cloud from a Leica range scanner and a set of images of the same indoor scene taken by a camera. Is there any way I can map the pixel information from the images onto the 3D scanned data?

Is it possible to perform the measurements in a specific way to make this feasible (e.g., measuring the relative angles and positions of the two devices at the time of scanning/capturing)?

Nima
  • If I understood correctly, you want to project colors from an image onto a point cloud of the same scene. It's fairly simple: you have a depth map from your scanner, where each pixel value represents the distance from the camera center, and you have an image where pixel values represent colors. All you have to do is display the depth map as a point cloud and set the color of each point to the color of the corresponding pixel in the RGB image. – MASTER OF CODE Feb 18 '22 at 08:04
  • Thanks for your response. I actually asked this for situations where we have the point cloud from one device and the RGB from another, but of the same indoor scene. For example, we have point cloud and RGB data from a laser 3D scanner, and we take HDR pictures of the same indoor scene from a slightly different angle, which include RGB data and radiance data. My purpose is to find a correspondence between the radiance values from the HDR image and the XYZ values from the point cloud data (see the projection sketch after these comments). – Nima Feb 18 '22 at 13:43
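
The cross-sensor case described in the comments comes down to projecting the scanner's XYZ points into the camera image and sampling the pixel (color or radiance) each point lands on. Below is a minimal sketch, assuming the camera intrinsic matrix `K` and the scanner-to-camera extrinsics `R`, `t` are already known; these placeholders would in practice come from camera calibration and from registering the camera pose against the scan (e.g., via manual point correspondences or OpenCV's `solvePnP`). The function name `colorize_points` is hypothetical, not from any library.

```python
import numpy as np

def colorize_points(points_xyz, image, K, R, t):
    """Assign each 3D point the pixel value it projects onto.

    points_xyz : (N, 3) array of scanner coordinates
    image      : (H, W, C) RGB or HDR radiance image
    K          : (3, 3) camera intrinsic matrix (assumed known/calibrated)
    R, t       : rotation (3, 3) and translation (3,) mapping scanner
                 coordinates into the camera frame (assumed known)
    """
    # Transform points from the scanner frame into the camera frame.
    cam = points_xyz @ R.T + t

    # Keep only points in front of the camera.
    in_front = cam[:, 2] > 0

    # Pinhole projection: divide by depth, then apply the intrinsics.
    normalized = cam[:, :2] / cam[:, 2:3]
    uv = normalized @ K[:2, :2].T + K[:2, 2]
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)

    # Discard points that project outside the image.
    h, w = image.shape[:2]
    valid = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)

    # Sample the image at the projected pixel locations.
    values = np.full((points_xyz.shape[0], image.shape[2]), np.nan)
    values[valid] = image[v[valid], u[valid]]
    return values, valid
```

For the simpler single-device case from the first comment, `R` is the identity and `t` is zero, since the depth map and the RGB image share the same camera frame. Note that this sketch ignores lens distortion and occlusion: points hidden behind surfaces still pick up a pixel value, so for a real pipeline a visibility check (e.g., a z-buffer over the projected points) would be needed.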

0 Answers