
I have a plane whose origin (ABC) and surface normal are defined in terms of a standard Cartesian coordinate system, XYZ. The plane is also constrained such that the line connecting the origin of the plane's coordinate system and the origin of the XYZ reference frame shall be defined as the x-axis of the plane's coordinate system.

I have the 2D coordinates of a point on that plane (a, b). How do I compute the coordinates of that point in terms of the XYZ reference frame?

Thom DeCarlo

1 Answer


You just need two orthogonal unit vectors to define your 2D basis in the plane. You already have one: as you said, the plane's x-axis lies along the line connecting the plane's origin, P, and the XYZ origin. To get the other, take the cross product of the plane's normal with that one. Then offset the result from the plane's origin:

u = normalize(planePoint)
v = normalize(cross(planeNormal, u))
point = planePoint + u * a + v * b
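
For concreteness, here is a minimal sketch of the same idea in Python with NumPy. The function name plane_to_xyz is illustrative, and the projection step (which keeps the x-axis in the plane even if the line between the two origins has a component along the normal) is an addition, not part of the pseudocode above:

import numpy as np

def plane_to_xyz(plane_point, plane_normal, a, b):
    # plane_point: the plane's origin (ABC) expressed in XYZ
    # plane_normal: the plane's surface normal in XYZ
    # (a, b): 2D coordinates of the point in the plane's frame
    n = plane_normal / np.linalg.norm(plane_normal)
    # x-axis: along the line connecting the plane's origin and the XYZ
    # origin, projected into the plane in case that line is not exactly
    # perpendicular to the normal. Negate x_dir if your convention points
    # the axis toward the XYZ origin instead of away from it.
    x_dir = plane_point - np.dot(plane_point, n) * n
    u = x_dir / np.linalg.norm(x_dir)  # undefined if plane_point is parallel to n
    # y-axis: the cross product gives a vector in the plane, perpendicular to u
    v = np.cross(n, u)
    return plane_point + a * u + b * v

# Example: plane origin at (1, 0, 0), normal along +Z, point (a, b) = (2, 3)
print(plane_to_xyz(np.array([1.0, 0.0, 0.0]),
                   np.array([0.0, 0.0, 1.0]), 2.0, 3.0))
# -> [3. 3. 0.]
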
FogleBird
  • Nice! I'll try that. It's certainly a lot simpler than the rabbit hole of quaternions and Euler angles that I was stuck in. – Thom DeCarlo Mar 02 '17 at 18:03