I have a GLSL fragment shader that produces correct results on half of our machines and incorrect (but consistent) results on the other half.
Essentially, I am adding two matrices (flat arrays) element-wise. Both inputs are an identical 60 float32 values, and each is mapped to a 2D texture; the closest texture size my code finds is 7x9 (63 texels, so the last 3 go unused).
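For reference, this is the packing I am assuming, written out as a sketch (the helper name is mine, not from the actual code): element k of the flat array lives at texel (mod(k, 7), floor(k / 7)), row-major from the origin.

// Hypothetical helper illustrating the assumed row-major packing.
vec2 texelForOffset(int k) {
    float x = mod(float(k), 7.0);    // column, 0..6
    float y = floor(float(k) / 7.0); // row, 0..8
    return vec2(x, y);               // integer texel coordinates, not yet normalized
}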
The following shader code produces correct results on all machines (TexCoords is my varying):
gl_FragColor = texture2D(A, TexCoords) + texture2D(B, TexCoords);
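(For context, the complete passing shader is essentially the following; this is a sketch, with the precision qualifier assumed:)

precision mediump float;   // assuming GLSL ES
uniform sampler2D A;       // first 7x9 input texture
uniform sampler2D B;       // second 7x9 input texture
varying vec2 TexCoords;    // normalized coordinates from the vertex shader

void main() {
    gl_FragColor = texture2D(A, TexCoords) + texture2D(B, TexCoords);
}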
However, the following (simplified) version does not, which indicates that either my logic for mapping coordinates to offsets and back is incorrect, or something more mysterious is at play here:
const float xScale = 7.0; // x size of the texture
const float yScale = 9.0; // y size of the texture
float s = TexCoords.s * xScale; // denormalize x
float t = TexCoords.t * yScale; // denormalize y
int offset = int(t) * 7 + int(s); // flattened offset from 0
s = mod(float(offset), 7.0); // recalc x from offset
t = floor(float(offset) / 7.0); // recalc y from offset
vec2 coords = vec2(s / xScale, t / yScale); // normalize
gl_FragColor = texture2D(A, coords) + texture2D(B, coords);
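To sanity-check my own logic, here is the round trip for one texel, assuming TexCoords is interpolated to the texel center, i.e. ((x + 0.5) / 7.0, (y + 0.5) / 9.0). For column 3, row 2 that gives TexCoords = (0.5, 2.5 / 9.0):

// Hypothetical trace of the snippet above for TexCoords = (3.5/7, 2.5/9):
float s = 0.5 * 7.0;                  // = 3.5
float t = (2.5 / 9.0) * 9.0;          // = 2.5
int offset = int(2.5) * 7 + int(3.5); // = 2 * 7 + 3 = 17
// recalculated from the offset:
// s = mod(17.0, 7.0)  = 3.0
// t = floor(17.0/7.0) = 2.0
// coords = (3.0/7.0, 2.0/9.0)

The recomputed coords land on the texel's lower-left corner rather than its center, so the final texture2D lookup samples exactly on a texel boundary.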
Example of correct results (truncated):
EXPECTED: type=float32; dims=[3,4,5]; data=[1.0915919542312622,0.04060405492782593,0.16559171676635742
Example of an incorrect result; note that the first element is duplicated, shifting everything after it:
ACTUAL: type=float32; dims=[3,4,5]; data=[1.0915919542312622,1.0915919542312622,0.04060405492782593,...