I'm developing a camera interface in C#.

I have an HDR camera that generates 24-bit HDR raw images.
The raw image buffer is a byte array (byte[]) with 3 bytes per Bayer tile.

I'm looking for a way to pass this buffer to the GPU as a texture using OpenTK (a C# wrapper of OpenGL), then demosaic the raw Bayer pixels into RGB (keeping the 24-bit precision per channel), and finally tonemap the result down to 8-bit-per-channel RGB.

I saw some example code that uses the Luminance pixel format:

GL.BindTexture(TextureTarget.Texture2D, this.handle);
GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Luminance, width, height, 0, PixelFormat.Luminance, PixelType.UnsignedByte, imageBuffer);
GL.BindTexture(TextureTarget.Texture2D, 0);

But I'm not sure how the fragment shader would sample this image buffer with this pixel format to produce a color pixel, or whether Luminance is even the right format for a 24-bit sample.
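One option (a sketch, not a tested solution) is to upload the same byte buffer as an RGB8 texture instead of Luminance, so the three bytes of each 24-bit sample land in one texel's r, g, and b channels, and reassemble the value in the fragment shader. The big-endian byte order and the Reinhard tonemap below are assumptions; the demosaic step is only indicated by a comment:

```glsl
#version 330 core

uniform sampler2D rawTex;  // RGB8 texture: r,g,b hold the 3 bytes of one 24-bit sample
in vec2 uv;
out vec4 fragColor;

void main()
{
    // Reassemble the 24-bit raw value from the three 8-bit channels.
    // Byte order (most significant byte in .r) is an assumption here.
    vec3 b = texture(rawTex, uv).rgb * 255.0;
    float raw = (b.r * 65536.0 + b.g * 256.0 + b.b) / 16777215.0; // normalize to 0..1

    // 'raw' is the intensity of a single Bayer photosite; a real demosaic
    // would gather neighboring texels here according to the Bayer pattern.

    // Placeholder tonemap (Reinhard) from HDR to displayable range.
    float mapped = raw / (raw + 1.0);
    fragColor = vec4(vec3(mapped), 1.0);
}
```

On the C# side this would mean uploading with PixelInternalFormat.Rgb8 / PixelFormat.Rgb instead of Luminance, with width equal to the image width in Bayer tiles.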

I also tried creating an integer array (int[]) from the byte buffer by combining every 3 bytes into an int on the CPU. But then I run into similar problems passing the integer array to the GPU as a texture — I'm not sure which pixel format applies there.
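For the integer-array route, a hedged sketch of what the upload could look like is below, using an unsigned-integer texture (GL_R32UI). The names rawBytes, width, height, and handle are assumptions standing in for the question's buffer and texture handle, and the byte order is a guess:

```csharp
// Pack each 24-bit sample into a uint (big-endian assumption; swap if needed).
uint[] packed = new uint[width * height];
for (int i = 0; i < packed.Length; i++)
{
    int o = i * 3;
    packed[i] = ((uint)rawBytes[o] << 16) | ((uint)rawBytes[o + 1] << 8) | rawBytes[o + 2];
}

GL.BindTexture(TextureTarget.Texture2D, handle);
// Integer textures must use nearest filtering; linear filtering is invalid for them.
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Nearest);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)TextureMagFilter.Nearest);
GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.R32ui, width, height, 0,
              PixelFormat.RedInteger, PixelType.UnsignedInt, packed);
GL.BindTexture(TextureTarget.Texture2D, 0);
```

The shader would then declare `uniform usampler2D rawTex;` and read the raw value with `uint v = texture(rawTex, uv).r;` before converting it to float for tonemapping.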

Would anyone be able to point me in the right direction?

  • *"But I'm not sure how the fragment shader would sample this image buffer with this pixel format into a color pixel."* - So why not implement your own fragment shader? – Rabbid76 Nov 30 '21 at 22:07
  • Simplest is to convert your 24-bit values to 32-bit floats, and create a texture with a `GL_R32F` internal format. – Yakov Galka Dec 01 '21 at 19:06
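The GL_R32F route suggested in the comment above could be sketched as follows; again, rawBytes, width, height, and handle are assumed names, and the byte order is a guess:

```csharp
// Convert each 24-bit sample to a normalized 32-bit float on the CPU.
float[] floats = new float[width * height];
for (int i = 0; i < floats.Length; i++)
{
    int o = i * 3;
    uint v = ((uint)rawBytes[o] << 16) | ((uint)rawBytes[o + 1] << 8) | rawBytes[o + 2];
    floats[i] = v / 16777215.0f;  // normalize by 2^24 - 1 to the 0..1 range
}

GL.BindTexture(TextureTarget.Texture2D, handle);
GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.R32f, width, height, 0,
              PixelFormat.Red, PixelType.Float, floats);
GL.BindTexture(TextureTarget.Texture2D, 0);
```

The fragment shader then samples an ordinary `sampler2D` and reads the HDR value from the `.r` channel, which also keeps regular linear filtering available, unlike the integer-texture approach.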

0 Answers