I am working on a project for which I have to compute the screen coordinates of a given world point.
I read some material and organized this task into four subtasks:
- Multiply the world point by the inverse TRS matrix of my camera to obtain the point in camera coordinates.
- Multiply the point in camera coordinates by the projection matrix to obtain the point in clip coordinates. The projection matrix looks like this (with fov in degrees, n = near plane, f = far plane):

    1/tan(fov * 0.5 * PI/180)   0                           0               0
    0                           1/tan(fov * 0.5 * PI/180)   0               0
    0                           0                           -(f+n)/(f-n)    -2fn/(f-n)
    0                           0                           -1              0
- Multiply the point in clip coordinates by 1/w (the perspective divide) to obtain the point in NDC.
- Convert the NDC into screen coordinates.
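The four steps above can be sketched roughly as follows. This is a minimal illustration, not your actual code: the function and parameter names (`world_to_screen`, `inv_camera_trs`, etc.) are hypothetical, the aspect ratio is assumed to be 1 (as in the matrix above), and the screen-space y-axis is assumed to point down.

```python
import math

def perspective_projection(fov_deg, near, far):
    """The 4x4 projection matrix from the question (aspect ratio assumed 1)."""
    f = 1.0 / math.tan(math.radians(fov_deg) * 0.5)
    return [
        [f, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2.0 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def mat_vec(m, v):
    """Multiply a 4x4 matrix by a 4-component column vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def world_to_screen(world_point, inv_camera_trs, proj, width, height):
    # Step 1: world -> camera coordinates.
    camera = mat_vec(inv_camera_trs, [*world_point, 1.0])
    # Step 2: camera -> clip coordinates.
    x, y, z, w = mat_vec(proj, camera)
    # Step 3: perspective divide -> NDC (fails when w == 0).
    ndc = (x / w, y / w, z / w)
    # Step 4: NDC [-1, 1] -> screen pixels, y flipped so the origin is top-left.
    sx = (ndc[0] * 0.5 + 0.5) * width
    sy = (1.0 - (ndc[1] * 0.5 + 0.5)) * height
    return sx, sy
```

For example, with an identity camera matrix, fov = 90, n = 1, f = 100 and an 800x600 screen, the point (0, 0, -5) lands at the screen center (400, 300).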
This is working fine, but I came across some strange results and errors. It turns out that the w value of the point in clip coordinates becomes 0 whenever the z value of the point in camera coordinates is 0, which leads to a division by zero in step 3.
Did I overlook something, or is this normal? Is there a way to compute the screen position of points on the camera's z-axis, or should I just check for (z == 0) and return the middle of the screen?
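The observation can be made concrete in a few lines: the last row of the projection matrix above is (0, 0, -1, 0), so w_clip = -z_camera, and w is zero exactly when the point lies in the camera's z = 0 plane. The sketch below shows this together with a hypothetical guard (`safe_ndc` and its epsilon are my own names, not anything from the question) that skips the divide for such points.

```python
def clip_w(z_camera):
    # Last row of the projection matrix is (0, 0, -1, 0),
    # so the clip-space w depends only on the camera-space z.
    return -z_camera

def safe_ndc(clip, eps=1e-6):
    """Perspective divide with a guard: returns None for points at w ~ 0,
    i.e. points lying in the camera plane, which have no valid screen position."""
    x, y, z, w = clip
    if abs(w) < eps:
        return None
    return (x / w, y / w, z / w)
```

A point with z_camera = 0 gives w = 0 and is rejected, while any point with w != 0 divides through normally.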