
I have two (identical) shaders, one in hlsl and one in glsl. In the pixel shader, I am multiplying a vector by a matrix for normal transformations. The code is essentially:

HLSL

float3 v = ...;
float3x3 m = ...;
float3 n = mul(v, m);

GLSL

vec3 v = ...;
mat3 m = ...;
vec3 n = v * m;

This should perform a row-vector multiplication, yet in glsl it doesn't. If I explicitly type out the algorithm, it works in both. As far as I can tell, both the glsl and hlsl specs say they should do a row-vector multiply when the vector is on the left-hand side, which it is here.
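For reference, "typing out the algorithm" here means dotting v with each column of m. A plain-Python sketch of that row-vector multiply (made-up values standing in for the shader code):

```python
v = [1.0, 2.0, 3.0]          # the vector, e.g. a normal
m = [[0.0, 1.0, 2.0],        # the 3x3 matrix, stored as rows
     [3.0, 4.0, 5.0],
     [6.0, 7.0, 8.0]]

# Row-vector multiply: n[c] = dot(v, column c of m).
n = [sum(v[r] * m[r][c] for r in range(3)) for c in range(3)]
```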

The other confusing thing is that I multiply a vector by a matrix in the vertex shader with the vector on the left, yet that works fine in both glsl and hlsl. This leads me to guess that it is only an issue in the fragment/pixel shader.

I pass the matrix from the vertex shader to the fragment shader using:

out vec3 out_vs_TangentToWorldX;
out vec3 out_vs_TangentToWorldY;
out vec3 out_vs_TangentToWorldZ;

out_vs_TangentToWorldX = tangent * world3D;
out_vs_TangentToWorldY = binormal * world3D;
out_vs_TangentToWorldZ = normal * world3D;

and in the fragment shader I reconstruct it with:

in vec3 out_vs_TangentToWorldX;
in vec3 out_vs_TangentToWorldY;
in vec3 out_vs_TangentToWorldZ;

mat3 tangentToWorld;
tangentToWorld[0] = out_vs_TangentToWorldX;
tangentToWorld[1] = out_vs_TangentToWorldY;
tangentToWorld[2] = out_vs_TangentToWorldZ;
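Worth noting: in GLSL `m[i]` selects column i, while in HLSL `m[i]` selects row i, so these three assignments build the transpose of what the same-looking code builds in HLSL. A plain-Python sketch with made-up vectors (not the original shader code):

```python
# Hypothetical stand-ins for tangent / binormal / normal.
tangent  = [1.0, 2.0, 3.0]
binormal = [4.0, 5.0, 6.0]
normal   = [7.0, 8.0, 9.0]

# HLSL: m[0] = tangent;  ->  m[i] is ROW i.
hlsl_m = [tangent, binormal, normal]

# GLSL: m[0] = tangent;  ->  m[i] is COLUMN i.
# Stored here as rows for comparison, so column i holds the i-th vector.
glsl_m = [[tangent[r], binormal[r], normal[r]] for r in range(3)]

# The same assignments produce transposed matrices.
assert glsl_m == [[hlsl_m[c][r] for c in range(3)] for r in range(3)]
```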
Programmdude
  • How are you passing the matrix to the pixel shader? – CynicismRising Aug 11 '14 at 04:07
  • in/out attributes from the vertex shader – Programmdude Aug 11 '14 at 04:11
  • Is it a per vertex matrix? If each vertex outputs a different matrix the hw would try to interpolate it across each triangle, probably incorrectly. Is it possible to pass it as a uniform? – CynicismRising Aug 11 '14 at 04:21
  • It is a per-vertex matrix, but it's a per-vertex matrix in hlsl too. That wouldn't explain why writing out the row-vector multiplication algorithm works, yet the v * m expression itself doesn't. – Programmdude Aug 11 '14 at 07:07
  • Did you try swapping the operands in GLSL? I'm not sure anymore, but I think I read some time ago that HLSL works with row-major matrices and GLSL with column-major, or vice versa. – Gnietschow Aug 11 '14 at 07:52
  • That works, but it's strange because I have checked, and both hlsl and glsl are using column-major matrices. If I swap all the matrix multiplications it breaks; it only works if I swap just the pixel-shader mul. – Programmdude Aug 11 '14 at 08:22
  • How do you pass and build the matrix from the vertexshader to the pixelshader? – Gnietschow Aug 11 '14 at 09:12
  • I updated the question with the code that passes the matrix. – Programmdude Aug 11 '14 at 10:05
  • in glsl mat[0] represents the first column, in hlsl it represents the first row. http://www.opengl.org/wiki/Data_Type_(GLSL)#Matrices http://msdn.microsoft.com/en-us/library/windows/desktop/bb509634(v=vs.85).aspx#Matrix – CynicismRising Aug 12 '14 at 07:10

1 Answer


HLSL matrices are row-major; GLSL matrices are column-major. So if you pass your matrix into the GLSL shader using the same memory layout as you pass it into HLSL, your HLSL rows become GLSL columns, and you should put the vector on the right-hand side of the multiplication in your GLSL shader to get the same effect as in HLSL.

Just use

vec3 n = m * v;
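The identity behind this swap can be sketched in plain Python (made-up values, not the original shader code): multiplying a row vector by m is the same as multiplying the transpose of m by a column vector, so if the GLSL matrix arrives transposed relative to HLSL, m * v reproduces HLSL's mul(v, m).

```python
def mat_vec(m, v):
    # Column-vector multiply (GLSL m * v): dot each row of m with v.
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def vec_mat(v, m):
    # Row-vector multiply (HLSL mul(v, m)): dot v with each column of m.
    return [sum(v[r] * m[r][c] for r in range(3)) for c in range(3)]

def transpose(m):
    return [[m[c][r] for c in range(3)] for r in range(3)]

v = [1.0, 2.0, 3.0]
m = [[0.0, 1.0, 2.0],
     [3.0, 4.0, 5.0],
     [6.0, 7.0, 8.0]]

# Row-vector multiply by m equals column-vector multiply by m transposed.
assert vec_mat(v, m) == mat_vec(transpose(m), v)
```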
Ryhor Spivak
  • I was passing them in correctly, but I wasn't aware that hlsl's array accessor was still row major, even when compiling as column major. – Programmdude Aug 16 '14 at 08:12