
I just spent a few hours isolating a behavior of WebGL that I don't understand.

Let's assume a vertex shader with integer attribute:

#version 300 es
precision highp int;
in vec4 vtx_pos;
in vec3 vtx_nrm;
in ivec4 vtx_int; // <- integer attribute

void main() {
   // blah blah...
}

With the following shader attributes binding:

var p = gl.createProgram();

gl.bindAttribLocation(p, 0, "vtx_pos");
gl.bindAttribLocation(p, 1, "vtx_nrm");
gl.bindAttribLocation(p, 2, "vtx_int"); // <- ivec4 on array #2

And finally the following AttribPointer configuration:

gl.vertexAttribPointer(0, 4, gl.FLOAT, false, 28, 0);
gl.enableVertexAttribArray(0);
gl.vertexAttribPointer(1, 3, gl.FLOAT, false, 28, 16);
gl.enableVertexAttribArray(1);
gl.disableVertexAttribArray(2); // <- here is the devil

With this configuration (vertex attrib array 2 disabled), browsers throw a type error for attribute 2:

Chrome: GL_INVALID_OPERATION : glDrawElements: vertexAttrib function must match shader attrib type

Firefox: drawElements: Vertex attrib 2 requires data of type INT, but is being supplied with type FLOAT.

What I understood is that when the vertex attrib array is not explicitly enabled with the proper vertexAttribIPointer, WebGL considers it to be "supplied as FLOAT" by default, and therefore throws a type error.

What I don't understand is: why does it check the supplied type of a disabled VertexAttribArray where, logically, nothing is supplied?

Short of enabling the VertexAttribArray as a dummy, is there some magic to avoid this error?

1 Answer


Quoting the OpenGL ES 3.0 specification, §2.8:

The resulting attribute values are undefined if the base type of the shader attribute at slot index is not floating-point (e.g. is signed or unsigned integer).

WebGL 2.0 eliminates the undefined behavior here and mandates that an error be produced.

So, by default, the value of a non-array attribute is indeed a floating-point vector. If the actual attribute in the shader is a signed or unsigned integer, you should specify its value manually via a call from the vertexAttribI4* family, e.g.:

gl.vertexAttribI4i(2, 0, 0, 0, 0);
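Alternatively, if the integer data actually lives in the vertex buffer, you can keep attribute 2 enabled and point at it with gl.vertexAttribIPointer (note the I), which is WebGL 2 only and keeps the values integral instead of converting them to float. A minimal sketch; the stride/offset values are assumptions about a hypothetical layout, and the stub `gl` object merely records calls so the snippet runs outside a browser (in real code `gl` is the actual WebGL2RenderingContext):

```javascript
// Stub standing in for a real WebGL2RenderingContext so this sketch
// is runnable outside a browser; in real code use the actual `gl`.
const calls = [];
const gl = {
  INT: 0x1404, // the GLenum value of gl.INT
  enableVertexAttribArray(index) {
    calls.push(["enable", index]);
  },
  vertexAttribIPointer(index, size, type, stride, offset) {
    calls.push(["ipointer", index, size, type, stride, offset]);
  },
};

// Keep attribute 2 enabled and source it from the bound buffer with the
// integer variant of the pointer call. Unlike vertexAttribPointer, there
// is no "normalized" argument. Stride/offset here are assumptions.
gl.enableVertexAttribArray(2);
gl.vertexAttribIPointer(2, 4, gl.INT, 16, 0);
```

Either way, the shader-side type (ivec4) and the supplied type then match, and the error goes away.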
Kirill Dmitrenko
  • Thanks! Now I wonder whether it is preferable to have an `ivec4` attribute which requires some ad-hoc processing, or if I should keep it as a standard `vec4` and then cast the float values to integers within the shader... (the ivec4 is used for array subscripts) –  Sep 27 '17 at 16:24
  • @Sedenion If precision isn't a problem, then I believe there's no technical reason to choose one instead of the other. However, the implementation with `ivec4` may be somewhat cleaner :) – Kirill Dmitrenko Sep 27 '17 at 16:27
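For reference, the vec4-plus-cast alternative discussed in the comments could look like the following sketch (attribute and variable names are illustrative, not from the original code):

```glsl
#version 300 es
precision highp int;
in vec4 vtx_pos;
in vec4 vtx_idx; // floats sourced via a plain vertexAttribPointer

void main() {
  // Constructor-style cast recovers the integer subscripts;
  // conversion truncates toward zero.
  ivec4 idx = ivec4(vtx_idx);
  // ... use idx.x etc. as array subscripts ...
  gl_Position = vtx_pos;
}
```

This avoids the integer-attribute setup entirely at the cost of a per-vertex cast, which is why precision (floats represent integers exactly only up to 2^24) is the main thing to watch.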