I have an int buffer of intensity values that I want to display as a greyscale/colour-mapped image in OpenGL.
What is the best way to achieve this?
Standard Texture?
Can I do it via a standard GL texture, with something like:
gl.TexImage2D(OpenGL.GL_TEXTURE_2D, 0, OpenGL.GL_R32F, width, height, 0, OpenGL.GL_RED_INTEGER, OpenGL.GL_UNSIGNED_INT, pixels);
In the shader, I am under the impression that I would use it the same as any other texture, except with usampler2D instead of sampler2D, at which point I would get the true integer value (i.e. not the 0-1 range).
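To show what I mean, here is a rough fragment-shader sketch of how I imagine the usampler2D route working (this assumes the texture is actually uploaded with an integer internal format such as GL_R32UI, since from what I have read GL_RED_INTEGER/usampler2D pair with the integer formats rather than GL_R32F; the uniform names and the normalisation constant are just placeholders):

#version 330 core

uniform usampler2D u_intensity;   // integer texture (e.g. GL_R32UI); needs GL_NEAREST filtering
uniform float u_maxIntensity;     // placeholder, e.g. ~90000, used only for display

in vec2 tex_coord;
out vec4 frag_colour;

void main()
{
    // texture() on a usampler2D returns a uvec4 holding the raw integer value
    uint intensity = texture(u_intensity, tex_coord).r;

    // any integer-precision manipulation would happen here,
    // then normalise to 0-1 just for the final colour
    float grey = float(intensity) / u_maxIntensity;
    frag_colour = vec4(grey, grey, grey, 1.0);
}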
TBO?
Or would it be better to achieve this with a TBO (texture buffer object) and do something like:
gl.TexBuffer(OpenGL.GL_TEXTURE_BUFFER, OpenGL.GL_R32F, bufferID);
In terms of the shader I am actually quite confused. I have seen things like g = texelFetch(u_tbo_tex, offset + 1).r, so I am guessing I would have to translate the texture coordinates into an offset, something like:
int offset = tex_coord.s + (tex_coord.t * imageWidth);
but then texelFetch actually returns a vec4, so presumably I would use:
int intensity = texelFetch(buffer, offset).r;
But then, as tex_coord.s and tex_coord.t are in the 0-1 range, that would imply the need to scale them up to texel indices first, something like:
int offset = int(tex_coord.t * imageHeight) * imageWidth + int(tex_coord.s * imageWidth);
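Putting those guesses together, this is roughly the TBO fragment shader I have in mind (a sketch only, assuming a samplerBuffer bound to the GL_R32F buffer texture and the image dimensions passed in as uniforms; all the names are placeholders):

#version 330 core

uniform samplerBuffer u_tbo_tex;   // buffer texture created with GL_R32F
uniform int u_imageWidth;
uniform int u_imageHeight;

in vec2 tex_coord;
out vec4 frag_colour;

void main()
{
    // scale the 0-1 texture coordinates up to integer texel indices
    int x = int(tex_coord.s * float(u_imageWidth));
    int y = int(tex_coord.t * float(u_imageHeight));
    int offset = y * u_imageWidth + x;

    // texelFetch on a samplerBuffer returns a vec4; .r holds the intensity
    float intensity = texelFetch(u_tbo_tex, offset).r;
    frag_colour = vec4(vec3(intensity / 90000.0), 1.0);   // 90000.0 is just my current max value
}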
Other Buffer
I have very little experience with buffer objects. Really, all I am doing is using a buffer in GL, so I do feel like I am over-complicating it and missing the "penny drop".
Important Notes
- Why int?: In some cases I do some manipulation on the data before turning it into a colour, and would prefer to do this at 32-bit precision to avoid potential precision errors. Arguably it might not make a difference, as it eventually becomes a screen colour...
- Data update frequency: the intensity data is updated occasionally by user events, but certainly not multiple times per frame (so I am presuming STATIC is more appropriate than DYNAMIC in this case? See the buffer sketch after these notes).
- Use: The data is mainly for GL, so _DRAW. There is the possibility that the application could use GL to compute some values for it, but I would probably create a separate _READ buffer in that case.
- The highest intensity value I have seen so far is around 90,000, so I know the data goes beyond the 16-bit integer range.
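For the data-update point above, this is roughly the buffer setup I picture for the TBO route, with a STATIC_DRAW hint since the data only changes on occasional user events (a sketch only; textureID is assumed to come from gl.GenTextures elsewhere, and I have not checked which BufferData overloads or GL_TEXTURE_BUFFER constants SharpGL actually exposes, so the pinned-array/IntPtr overload may be needed instead):

uint[] buffers = new uint[1];
gl.GenBuffers(1, buffers);
uint bufferID = buffers[0];

// upload the intensity data once; STATIC_DRAW because it is rewritten
// only on occasional user events, never per frame
gl.BindBuffer(OpenGL.GL_TEXTURE_BUFFER, bufferID);
gl.BufferData(OpenGL.GL_TEXTURE_BUFFER, pixels, OpenGL.GL_STATIC_DRAW);

// attach the buffer to a buffer texture so the shader can texelFetch from it
gl.BindTexture(OpenGL.GL_TEXTURE_BUFFER, textureID);
gl.TexBuffer(OpenGL.GL_TEXTURE_BUFFER, OpenGL.GL_R32F, bufferID);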
Note: I am doing this through SharpGL, and I have been unable to test at the moment as it has no definition for GL_R32F, so I shall have to find the gl.h on my Windows platform (always fun) and add the correct const number.
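In case it helps, these are the values I believe the missing constants have in the standard GL headers (copied from glext.h as far as I can tell; corrections welcome):

const uint GL_R32F           = 0x822E;
const uint GL_R32UI          = 0x8236;   // integer internal format, for the usampler2D route
const uint GL_RED_INTEGER    = 0x8D94;
const uint GL_TEXTURE_BUFFER = 0x8C2A;   // for the TBO route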