
I am trying out WebGPU for the first time and have come across a problem. Normally in other rendering frameworks, I would multiply a vec4 by a matrix of all 1s and the output would be the same.

Here is what I expect with this code:

@vertex
fn vs_main(in: VertexInput) -> VertexOutput {
    var out: VertexOutput;

    let world_position: vec4f = vec4f(in.position, 0.0, 1.0);

    out.position = world_position;
    out.color = in.color;

    return out;
}

the square shows up

However when I try this:

@vertex
fn vs_main(in: VertexInput) -> VertexOutput {
    var out: VertexOutput;

    let world_position: vec4f = mat4x4f(
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0, 
        1.0, 1.0, 1.0, 1.0,
        1.0, 1.0, 1.0, 1.0
    ) * vec4f(in.position, 0.0, 1.0);

    out.position = world_position;
    out.color = in.color;

    return out;
}

nothing shows up.

Jove

1 Answer


Each value in the resulting vec4f of mat4x4f * vec4f is the dot product of a row of the mat4x4f with the vec4f (treated as a column vector).

With m as the mat4x4f and v as the vec4f (WGSL indexes a matrix as m[column][row]), that is like writing:

vec4f(
    m[0][0]*v[0] + m[1][0]*v[1] + m[2][0]*v[2] + m[3][0]*v[3],
    m[0][1]*v[0] + m[1][1]*v[1] + m[2][1]*v[2] + m[3][1]*v[3],
    m[0][2]*v[0] + m[1][2]*v[1] + m[2][2]*v[2] + m[3][2]*v[3],
    m[0][3]*v[0] + m[1][3]*v[1] + m[2][3]*v[2] + m[3][3]*v[3]
)
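
That also explains why nothing shows up with the all-1.0 matrix: every output component becomes the same sum of the input components. A minimal sketch of what your second shader computes, assuming in.position is the vec2f from your code:

// v = vec4f(in.position, 0.0, 1.0)
// With every matrix entry equal to 1.0, each component of the result is
// v.x + v.y + v.z + v.w = in.position.x + in.position.y + 0.0 + 1.0
let s = in.position.x + in.position.y + 1.0;
let world_position = vec4f(s, s, s, s);

After the perspective divide (x/w, y/w, z/w), every vertex of the square lands on the same point (1.0, 1.0, 1.0), so the triangles have zero area and nothing is rasterized.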

An identity matrix, which leaves the vector unchanged, would be:

let world_position: vec4f = mat4x4f(
    1.0, 0.0, 0.0, 0.0,
    0.0, 1.0, 0.0, 0.0,
    0.0, 0.0, 1.0, 0.0,
    0.0, 0.0, 0.0, 1.0
) * vec4f(in.position, 0.0, 1.0);
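
Also keep in mind that the mat4x4f constructor takes its arguments column by column. It makes no difference for the identity matrix, but it matters for a real transform. As an illustrative sketch (the offsets tx, ty, tz here are made-up values, not from your code), a translation matrix puts its offsets in the last column, i.e. the last four constructor arguments:

// Hypothetical translation offsets, for illustration only.
let tx = 0.25;
let ty = 0.0;
let tz = 0.0;
let translate = mat4x4f(
    1.0, 0.0, 0.0, 0.0,  // column 0
    0.0, 1.0, 0.0, 0.0,  // column 1
    0.0, 0.0, 1.0, 0.0,  // column 2
    tx,  ty,  tz,  1.0   // column 3 holds the translation
);
let world_position: vec4f = translate * vec4f(in.position, 0.0, 1.0);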
KompjoeFriek