
Given:

std::vector<GLuint> cubeIndices;
struct FaceGroup {
    unsigned int face_index;
    unsigned int start_index;
    size_t length;
    // comparison operators omitted
};
std::set<FaceGroup>::iterator i;
GLuint attribIndex;

I was rendering each FaceGroup in the set by looping through each index in cubeIndices from start_index to start_index + length like so:

for (unsigned ix = 0; ix < i->length; ++ix) {
    glDisableVertexAttribArray (attribIndex);
    glVertexAttribI1ui (attribIndex, cubeIndices [i->start_index + ix]);
    glDrawElements (GL_TRIANGLE_STRIP, 4, GL_UNSIGNED_INT, (void*) (i->face_index * sizeof (GLuint)));
}

... which gives me the correct result. Now I want to render the same thing using instanced arrays. My reasoning tells me that the following code is equivalent to the loop above:

glEnableVertexAttribArray (attribIndex);
glVertexAttribDivisorARB (attribIndex, 1);
glVertexAttribPointer (attribIndex, 1, GL_UNSIGNED_INT, GL_FALSE, sizeof (GLuint), &cubeIndices [i->start_index]);
glDrawElementsInstancedARB (GL_TRIANGLE_STRIP, 4, GL_UNSIGNED_INT, (void*) (i->face_index * sizeof (GLuint)), i->length);

but it seems to render only the first face in each group (tentative analysis, I may be wrong). What am I doing wrong?

larvyde

1 Answer

glVertexAttribI1ui (attribIndex, cubeIndices [i->start_index + ix]);
glVertexAttribPointer (attribIndex

These do not do the same thing.

It's strange that you know about an esoteric function like glVertexAttribI*, but don't know that you need to use glVertexAttribIPointer when setting up integral attribute arrays. If you use glVertexAttribPointer, you're feeding the attribute floating-point values; any integer values you provide will be converted to floats (which is why glVertexAttribPointer has a parameter that says whether the values are normalized).
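For illustration, the shader side looks roughly like this (a minimal sketch; these input names are hypothetical, not taken from the question):

in uint cubeIndex;   // integer input: must be sourced with glVertexAttribI* / glVertexAttribIPointer
in float someWeight; // float input: sourced with glVertexAttrib* / glVertexAttribPointer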

So you should use glVertexAttribIPointer, unless you changed your shader to use a float instead of a uint for attribIndex's input.
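Applied to the code in the question, the setup would look something like this (a sketch, assuming the shader input really is an integer; note that glVertexAttribIPointer has no "normalized" parameter):

glEnableVertexAttribArray (attribIndex);
glVertexAttribDivisorARB (attribIndex, 1);
// Integral version: the data stays a GLuint instead of being converted to float.
glVertexAttribIPointer (attribIndex, 1, GL_UNSIGNED_INT, sizeof (GLuint), &cubeIndices [i->start_index]);
glDrawElementsInstancedARB (GL_TRIANGLE_STRIP, 4, GL_UNSIGNED_INT, (void*) (i->face_index * sizeof (GLuint)), i->length);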

Nicol Bolas
  • THANK YOU!!! It works now... :) In my defense, I only found out about `glVertexAttribI1ui` because I initially tried `glVertexAttrib1i`, which does not exist, which then led me to the `glVertexAttrib` man page, where I found the `glVertexAttribI*` functions. – larvyde Sep 18 '12 at 04:15