I've built a 2-D grid in WebGL. It has 5 columns, and 24 rows. Each "cell" in my grid is four vertices, and six indices:
So for each cell, I do:
var addCell = function(x, y, w, h, colorArray) {
len = vertices.length;
vertices.push(x, y, zIndex);
vertices.push(x, y - h, zIndex);
vertices.push(x + w, y - h, zIndex);
vertices.push(x + w, y, zIndex);
Array.prototype.push.apply(colors, colorArray);
Array.prototype.push.apply(colors, colorArray);
Array.prototype.push.apply(colors, colorArray);
Array.prototype.push.apply(colors, colorArray);
indices.push(len, len + 1, len + 2);
indices.push(len, len + 2, len + 3);
}
where x and y are the coordinates of the cell, and "colors" is a four-element array specifying rgba. So I create the first row from left to right, then the second row, and so on. When I create my index buffer, I do:
indexBuffer = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint8Array(indices), gl.STATIC_DRAW);
And when I draw my scene, I say:
gl.drawElements(gl.TRIANGLES, indices.length, gl.UNSIGNED_BYTE, 0);
When I do this, it only renders the first 12 3/4 rows (of 24). This is because each cell is 4 vertices (adjacent cells do not share vertices). So the first 12+ rows contain: 12.75 * 5 * 4 = 255 <- the max value of GL_UNSIGNED_BYTE.
So to me this was an obvious outcome given my type choice. I thought naturally the solution was to change to gl.UNSIGNED_SHORT and also change my creation of the buffer to:
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(indices), gl.STATIC_DRAW);
However this results in "glDrawElements: attempt to access out of range vertices in attribute 0".
I've stared at my index/vertex-building code for a while trying to see if I'm adding an index that points to a vertex element that doesn't exist. I don't see anything wrong with that code, which makes me think it has something to do with the type I'm choosing. Any direction would be greatly appreciated.