
I am trying to write depth to a texture. I would like to have the linear depth, so I tried using an R16F texture. I defined the texture like this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_R16F_EXT, g_bufferWidth, g_bufferHeight, 0, 
             GL_RED_EXT, GL_HALF_FLOAT_OES, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, g_texture, 0);    
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, 
                      g_bufferWidth, g_bufferHeight);

But when debugging with Xcode's frame capture on an iPhone 5, I get an "Unknown texture" in the color buffer, and nothing is written to the depth buffer.
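
For completeness, this is roughly what my full setup looks like around those calls (a minimal sketch; handle names like g_framebuffer and g_depthRenderbuffer are placeholders for my own variables, and the buffer size is just an example):

#include <OpenGLES/ES2/gl.h>
#include <OpenGLES/ES2/glext.h>

static const GLsizei g_bufferWidth = 1024, g_bufferHeight = 768;  /* example size */
static GLuint g_texture, g_framebuffer, g_depthRenderbuffer;

static void setupHalfFloatFBO(void)
{
    /* Color attachment: single-channel half-float texture */
    glGenTextures(1, &g_texture);
    glBindTexture(GL_TEXTURE_2D, g_texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_R16F_EXT, g_bufferWidth, g_bufferHeight, 0,
                 GL_RED_EXT, GL_HALF_FLOAT_OES, NULL);

    glGenFramebuffers(1, &g_framebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, g_framebuffer);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, g_texture, 0);

    /* Depth attachment: 16-bit depth renderbuffer */
    glGenRenderbuffers(1, &g_depthRenderbuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, g_depthRenderbuffer);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16,
                          g_bufferWidth, g_bufferHeight);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                              GL_RENDERBUFFER, g_depthRenderbuffer);

    /* Should be GL_FRAMEBUFFER_COMPLETE before rendering */
    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        /* handle incomplete FBO */
    }
}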

I've also tried just creating a depth texture:

glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, g_bufferWidth, g_bufferHeight, 0,
             GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                       GL_TEXTURE_2D, g_texture, 0);

But in this case, too, nothing seems to get written to the depth buffer.
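
This is the surrounding code for that second attempt, again as a rough sketch with placeholder names; the extension and completeness checks are only there to rule out an unsupported format (GL_OES_depth_texture is what I understand the depth-texture path requires on ES 2.0):

#include <OpenGLES/ES2/gl.h>
#include <OpenGLES/ES2/glext.h>
#include <string.h>

static void setupDepthTextureFBO(GLuint texture, GLuint framebuffer,
                                 GLsizei width, GLsizei height)
{
    /* Check that depth textures are reported by this context */
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    if (!ext || !strstr(ext, "GL_OES_depth_texture")) {
        /* depth textures not supported on this device/context */
    }

    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, width, height, 0,
                 GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);

    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                           GL_TEXTURE_2D, texture, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        /* handle incomplete FBO */
    }
}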

The only way I can get anything written to the depth buffer seems to be defining the first texture as RGBA32 (a rough sketch of that fallback is below)...
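
By RGBA32 I mean a plain 8-bit-per-channel RGBA texture; that working fallback defines the color target roughly like this, and the linear depth is then encoded into its channels from the fragment shader:

/* Fallback color target: ordinary 32-bit RGBA (8 bits per channel) */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, g_bufferWidth, g_bufferHeight, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, g_texture, 0);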

Aren't the EXT_color_buffer_half_float and depth texture extensions enabled in iOS 6?

endavid
  • I've run into the same situation now. Did you find any solution to your problem? – SMP Nov 20 '12 at 14:01
  • I couldn't get GL_R16F_EXT to work... So in the end I used an RGBA32 texture. I didn't encode the depth the usual way because I wanted to use linear interpolation on the texture, so I encoded 256 values per channel linearly; at most you can encode 1024 values that way... – endavid Nov 29 '12 at 00:32
  • Sounds good to me. Thank you for that workaround =) – SMP Nov 29 '12 at 14:51

0 Answers