
I have a 3D texture with an internal format of GL_R32UI. Writing to it works fine as long as I pretend it's a floating-point texture.

That is, if I bind it as

layout(binding = 0) uniform image3D Voxels;

and write to it with

imageStore(Voxels, coord.xyz, vec4(1));

everything works exactly as expected. However, binding it while specifying the correct format and type, as

layout(r32ui, binding = 0) uniform uimage3D Voxels;

and writing to it with

imageStore(Voxels, coord.xyz, uvec4(1));

doesn't seem to work; that is, nothing gets written to the texture. I'd like to get this working correctly so that I can then use the imageAtomic* operations. Does anyone have any idea what could be going on?
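
For reference, here is a minimal sketch of how I expect the host-side binding and synchronization to look (the texture handle and other names are placeholders rather than my exact code); the format passed to glBindImageTexture has to match the r32ui declaration, and atomics additionally require GL_READ_WRITE access:

// C / OpenGL side: bind level 0 of the whole 3D texture to image unit 0 as r32ui
glBindImageTexture(0, voxelTexture, 0, GL_TRUE, 0, GL_READ_WRITE, GL_R32UI);

// ... dispatch/draw that performs the imageStore calls goes here ...

// make the image writes visible to subsequent texture fetches
glMemoryBarrier(GL_SHADER_IMAGE_ACCESS_BARRIER_BIT | GL_TEXTURE_FETCH_BARRIER_BIT);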

ragnar
  • If you just want to clear the texture to a value, then clear the texture to a value. Also, how do you detect that "nothing gets written to the texture"? Are you using the proper barriers after writing to the texture? – Nicol Bolas Aug 26 '12 at 09:50
  • I don't want to clear the texture to a value. I want to be able to write whichever value I want wherever I want. For debugging I normally just look at a slice-wise view of the texture. – ragnar Aug 26 '12 at 17:13
  • Right, but *how* do you look at it? Are you using the proper memory barriers to ensure that the data written has been written before you try to read it? – Nicol Bolas Aug 26 '12 at 19:51
  • Well tossing a `glMemoryBarrier(GL_ALL_BARRIER_BITS)` between writing and rendering doesn't change anything, if that answers your question. – ragnar Aug 26 '12 at 22:07
  • Ah, having written to the texture as though it were an unsigned integer texture (which it is), it can no longer be displayed with the same shader code. That is, `sampler3D` had to be changed to `usampler3D` in the display code. That makes sense, but it's strange that it worked at all when treated as a floating-point texture, and it will unfortunately make my display code much less general purpose (a sketch of the adjusted display shader follows below). Thanks for the help. – ragnar Aug 26 '12 at 22:18
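
A minimal sketch of what the adjusted display shader might look like (the uniform and varying names are placeholders, not my actual code); for a GL_R32UI texture the sampler has to be usampler3D and texelFetch returns a uvec4:

#version 430
layout(binding = 0) uniform usampler3D Voxels;  // was sampler3D in the floating-point path
in vec3 texCoord;                               // normalized 3D coordinate from the slice quad
out vec4 FragColor;

void main()
{
    ivec3 size  = textureSize(Voxels, 0);
    uvec4 texel = texelFetch(Voxels, ivec3(texCoord * vec3(size)), 0);
    FragColor   = vec4(vec3(texel.r), 1.0);     // map the stored unsigned value to grey
}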

0 Answers