
I'm wondering why I can't assign to an array element using an integer index. In Shadertoy it seems to work, but it doesn't work when I use this pixel shader via three.js:

void main(void) {
    vec2 p[1];
    p[0] = vec2(0.0, 0.0); // works

    int i = 0;
    p[i] = vec2(0.0, 0.0); // doesn't work, the shader doesn't run

    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}

Any ideas?

Shai UI
  • In GLSL 1.00, the array index has to be a constant-index-expression or a loop index. See [OpenGL ES Shading Language 1.00, 5 Indexing of Arrays, Vectors and Matrices](https://www.khronos.org/registry/OpenGL/specs/es/2.0/GLSL_ES_Specification_1.00.pdf#page=115), page 109 – Rabbid76 Jun 10 '19 at 19:25
  • @Rabbid76 Thanks, but why does shadertoy.com work with this code then? Is there any way to make my GLSL a higher version using three.js? – Shai UI Jun 10 '19 at 20:07
  • I'm not familiar with shader toy. Possibly it uses [WebGL 2.0](https://www.khronos.org/registry/webgl/specs/latest/2.0/) and [GLSL ES 3.00](https://www.khronos.org/registry/OpenGL/specs/es/3.0/GLSL_ES_Specification_3.00.pdf#page=150) – Rabbid76 Jun 10 '19 at 20:12

1 Answer


The issue is that GLSL ES 1.00 only supports indexing arrays with constant-index-expressions: constant integer expressions, loop indices of loops whose bounds are constant integer expressions, or expressions composed of both.

See the [spec, Appendix A, section 5 "Indexing of Arrays, Vectors and Matrices"](https://www.khronos.org/registry/OpenGL/specs/es/2.0/GLSL_ES_Specification_1.00.pdf#page=115):

void main(void) {
    vec2 p[1];
    p[0] = vec2(0.0, 0.0); // works

    int i = 0;
    p[i] = vec2(0.0, 0.0); // doesn't work. i is not constant

    const int j = 0;
    p[j] = vec2(0.0, 0.0); // works

    vec2 q[2];
    for (int k = 0; k < 2; ++k) {  // 2 is a constant int so this works
       q[k] = vec2(0.0); // works: k is a loop index
    }

    gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);
}

Note that the rules are complex. For example, your code is OK in a vertex shader but not in a fragment shader: vertex shaders must support more forms of array indexing, for instance indexing a non-sampler uniform array with any integer expression. The exception is arrays of samplers: even in vertex shaders, their index must follow the same restricted rules.
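As a rough sketch of that difference (the attribute and uniform names here are made up for illustration), GLSL ES 1.00 mandates that a non-sampler uniform array in a vertex shader can be indexed with any integer expression:

attribute vec4 position;

uniform vec2 offsets[4]; // hypothetical uniform array
uniform int index;       // not a constant-index-expression

void main(void) {
    // legal in a vertex shader, but the same index would be
    // rejected in a GLSL ES 1.00 fragment shader
    gl_Position = position + vec4(offsets[index], 0.0, 0.0);
}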

WebGL2 supports GLSL ES 3.00, which allows non-constant integer expressions as array indices in more places.
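For example, here is a sketch of the question's fragment shader ported to GLSL ES 3.00 (note the mandatory #version 300 es first line and the user-declared output that replaces gl_FragColor):

#version 300 es
precision mediump float;

out vec4 fragColor; // GLSL ES 3.00 has no gl_FragColor

void main(void) {
    vec2 p[1];
    int i = 0;
    p[i] = vec2(0.0, 0.0); // works: a non-constant index is fine here

    fragColor = vec4(1.0, 1.0, 1.0, 1.0);
}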

Shadertoy optionally uses WebGL2, though it tries to do that auto-magically: you don't have to tell it your shader uses GLSL ES 3.00, it somehow guesses. Maybe it compiles the shader both ways and uses whichever one succeeds; I have no idea, I just know it supports both.

THREE.js has a WebGL2 version as well.

gman