I have a RealSense camera and I'm trying to read a 16-bit depth image. The problem is that when I create an OpenCV Mat directly over the 16-bit values, I see only a black image. When I convert the 16-bit image to 8 bit with a 255/1000 scale factor, I get a normal-looking image, but I don't want to lose that precision.

// Acquire read access to the depth plane in its native 16-bit format
depthImage->AcquireAccess(PXCImage::ACCESS_READ, PXCImage::PIXEL_FORMAT_DEPTH, &depthImgData);

// Wrap the raw plane in a 16-bit OpenCV Mat (no copy)
pxcBYTE* cpixels = depthImgData.planes[0];
Mat r_depth(frameSize, CV_16UC1, cpixels);

Here's the code where I read the image and wrap it in a Mat.

Could you please tell me what the reason for this behaviour is, and how to get a usable 16-bit image?
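For what it's worth, a 16-bit depth image usually looks black simply because the values (millimetres, typically at most a few thousand) occupy only the bottom of the 0–65535 display range. One common approach is to keep the raw 16-bit buffer for processing and build a separately scaled 8-bit copy just for display. A minimal sketch of that idea without the RealSense SDK (the function name and the 7000 mm ceiling are my own assumptions, not part of the SDK):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Build an 8-bit copy of a 16-bit depth buffer for display only.
// max_mm is the depth mapped to full white -- pick it from your scene
// (e.g. ~7000 mm, matching the value range mentioned in the comments).
std::vector<uint8_t> depthToDisplay(const std::vector<uint16_t>& depth16,
                                    double max_mm) {
    const double scale = 255.0 / max_mm;
    std::vector<uint8_t> out(depth16.size());
    std::transform(depth16.begin(), depth16.end(), out.begin(),
                   [scale](uint16_t d) {
                       // Saturate instead of wrapping around
                       return static_cast<uint8_t>(std::min(d * scale, 255.0));
                   });
    return out;
}
```

The raw 16-bit buffer stays untouched, so any measurements still use full precision; only the copy handed to imshow-style display is compressed.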

  • "When I convert 16 bit image to 8 bit with 255/1000 scale" -- That's a very odd scaling factor for that kind of conversion. What is the actual range of pixel intensities in your 16 bit image? If just dividing by ~4 gives you a "normal image", then it seems more like 10 bit values stored as 16 bit integers. – Dan Mašek Jul 14 '17 at 14:35
  • I took that from a blog entry. Values range from 4 to 7k (in my training set). The same scaling factor is used here - https://software.intel.com/en-us/articles/using-librealsense-and-opencv-to-stream-rgb-and-depth-data. I don't think the values are only 10 bit. Also, I get a clearer image when I convert to 8 bit with a smaller scaling factor (approx. 1/4, not exact). – Артем Лян Jul 22 '17 at 07:02
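As a side note on the comment thread: the right display scale depends on the actual range of the depth values, which is easy to check directly. A small SDK-free sketch (the function is mine; the buffer stands in for the depth plane, and I'm assuming the RealSense convention that 0 means "no depth measured"):

```cpp
#include <algorithm>
#include <cstdint>
#include <utility>
#include <vector>

// Return {min, max} over a 16-bit depth buffer, skipping zeros
// (zero is assumed to mean "no valid measurement").
std::pair<uint16_t, uint16_t> depthRange(const std::vector<uint16_t>& depth16) {
    uint16_t lo = UINT16_MAX, hi = 0;
    for (uint16_t d : depth16) {
        if (d == 0) continue;  // invalid pixel, ignore
        lo = std::min(lo, d);
        hi = std::max(hi, d);
    }
    return {lo, hi};
}
```

With a measured max around 7000, a display scale of 255/7000 uses the full 8-bit range, which matches the observation that roughly 1/4 looks clearer than 255/1000.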

1 Answer


Which RealSense device do you use - R200, LR200, SR300 or D410?

I suggest using the librealsense library/APIs, which are maintained on GitHub. Configure the depth format to RS2_FORMAT_Z16, then the code below gets the 16-bit depth data.

// Wait for the next frame from the librealsense frame queue
auto frame = queue.wait_for_frame();

// Wrap the raw Z16 data in a 16-bit OpenCV Mat without copying
Mat depth16(Size(640, 480), CV_16U, (void*)frame.get_data(), Mat::AUTO_STEP);
Freeman Lo