
I have an OpenGL 3.2 core context set up on OS X 10.7.5 and am trying to render to a 3D texture using a layered rendering approach. The geometry shader feature "gl_Layer" is supported, but I cannot bind a GL_TEXTURE_3D to my framebuffer attachment: glCheckFramebufferStatus returns GL_FRAMEBUFFER_UNSUPPORTED.
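
For context, the geometry shader just routes each primitive to a slice via gl_Layer, roughly like this (simplified sketch; the "slice" uniform is a placeholder, not my actual code):

// Simplified layered-rendering geometry shader (GLSL 1.50);
// "slice" is a placeholder uniform selecting the target layer of the 3D texture.
const char* layeredGS = R"(
#version 150 core
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;
uniform int slice;
void main()
{
    for (int i = 0; i < 3; ++i)
    {
        gl_Layer    = slice;                 // route this primitive to layer 'slice'
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}
)";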

This is the card and driver version in my MBP:

AMD Radeon HD 6770M 1024 MB - OpenGL 3.2 CORE (ATI-7.32.12)

As far as I know this feature does not map directly to a specific extension. Does anybody know how to figure out whether this is unsupported by the driver or by the hardware? Thanks so much.
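
Something like the following at least confirms the limits the driver advertises, though none of these queries says anything about what an FBO will actually accept:

GLint max3d = 0, maxLayers = 0, maxAttachments = 0;
glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, &max3d);            // largest allowed 3D texture dimension
glGetIntegerv(GL_MAX_ARRAY_TEXTURE_LAYERS, &maxLayers);   // layer limit for 2D array textures
glGetIntegerv(GL_MAX_COLOR_ATTACHMENTS, &maxAttachments); // number of FBO color attachment points
printf("renderer: %s, version: %s\n",
       (const char*)glGetString(GL_RENDERER), (const char*)glGetString(GL_VERSION));
printf("max 3D size: %d, max array layers: %d, max color attachments: %d\n",
       max3d, maxLayers, maxAttachments);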

Below is the code to reproduce the issue. I use GLFW to set up the context:

// Initialize GLFW
if (!glfwInit())
    throw "Failed to initialize GLFW";

glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, 3);
glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, 2);
glfwOpenWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
glfwOpenWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);

// Open a window and create its OpenGL context
if (!glfwOpenWindow(720, 480, 8, 8, 8, 8, 24, 8, GLFW_WINDOW))
    throw "Failed to open GLFW window";

//
// ...
//

GLuint framebuffer, texture;
GLenum status;
glGenFramebuffers(1, &framebuffer);
// Set up the FBO with one texture attachment
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, framebuffer);
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_3D, texture);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8, 256, 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL); // 256^3 RGBA8 volume, no data uploaded
// Attach the whole 3D texture (level 0) as a layered color attachment
glFramebufferTexture(GL_DRAW_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, texture, 0);
status = glCheckFramebufferStatus(GL_DRAW_FRAMEBUFFER);
//
// status is GL_FRAMEBUFFER_UNSUPPORTED here !!!
//
if (status != GL_FRAMEBUFFER_COMPLETE)
    throw status;

glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
glDeleteTextures(1, &texture);
glDeleteFramebuffers(1, &framebuffer);
exit(1);
FHoenig

1 Answer


Does anybody know how to figure out whether this is unsupported by the driver or hardware?

It just told you. That's what GL_FRAMEBUFFER_UNSUPPORTED means: it's the driver exercising veto-power over any framebuffer attachments it doesn't like for any reason whatsoever.

There's not much you can do when this happens except try other things, such as rendering to a 2D array texture instead.
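
For example, a minimal sketch of swapping in a 2D array texture while keeping the rest of your FBO code the same (untested on your particular driver; gl_Layer in the geometry shader then selects the array layer):

GLuint arrayTex;
glGenTextures(1, &arrayTex);
glBindTexture(GL_TEXTURE_2D_ARRAY, arrayTex);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
// 256 layers of 256x256 instead of a 256^3 volume
glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_RGBA8, 256, 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
// Attach the whole array texture (level 0) as a layered color attachment
glFramebufferTexture(GL_DRAW_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, arrayTex, 0);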

Nicol Bolas
  • Looks like neither GL_TEXTURE_2D_ARRAY nor a cube map works; only a 2D texture does. This seems odd, because layered rendering is supported by the geometry shader functionality that is part of 3.2 core. What layered rendering target would be left, then? – FHoenig Oct 28 '12 at 06:24
  • @FHoenig: Cubemaps. Ultimately, the implementation is allowed to veto anything it doesn't like for arbitrary reasons. These things work on Windows and Linux (granted, that's not exactly helpful for you), so it's not a hardware issue. – Nicol Bolas Oct 28 '12 at 07:21
  • @FHoenig: Nothing. As I said, the implementation *does not have to* let you use anything it doesn't feel like letting you use. That's what `GL_FRAMEBUFFER_UNSUPPORTED` is for. Granted, it's not *supposed* to be used this way, but the OpenGL specification allows it. That being said, are you *certain* that you're getting unsupported and not some other enumerator? – Nicol Bolas Oct 28 '12 at 08:00
  • I get unsupported :-/ OpenCL also doesn't have cl_khr_3d_image_writes on this driver. Time to try Windows on this Mac. – FHoenig Oct 28 '12 at 09:38