
I have two rotation matrices. One is the rotation matrix of a real webcam, which I obtained by solving the PnP problem. I have a world coordinate frame, and I know the location of each and every point in world coordinates.

As far as I understand, a rotation matrix transforms points from world coordinates to camera-frame coordinates (not considering translation here). This means that R1 gives you the orientation of the world coordinate frame with respect to the camera coordinate frame.

My second rotation matrix is that of a sensor, which is also expressed in world coordinates; i.e., this rotation matrix gives you the orientation of the world coordinate frame with respect to the sensor coordinate frame.

I want to find the orientation of the real webcam coordinate frame with respect to the sensor coordinate frame.

Let's name the first rotation matrix Rw1 and the second rotation matrix Rw2, with the subscript w1 denoting world with respect to the real webcam and w2 denoting world with respect to the sensor (1 and 2 can be taken to denote the real webcam and the sensor, respectively).

So my aim is to find R12. (and not R21)

R12 = R1w * Rw2 = (Rw1)' * Rw2

I am assuming that this R12 remains constant throughout (in the subsequent frames of the video), because the sensor and webcam positions will not be disturbed relative to each other and they always move together. Is my assumption valid?

If it is valid, then my ultimate objective is to compute the rotation matrix of the real webcam in the subsequent frames. I can calculate the rotation matrix of the sensor in subsequent frames, which is Rw2 for those frames. I have to find Rw1, and I cannot use any PnP algorithms; I want to compute it from the currently available information.

Let's consider the second frame for now.

I know R12 (which I am assuming is constant, and which I computed in the first frame) and Rw2 (the sensor rotation matrix for the second frame). I have to find Rw1 for the second frame.

Rw1 = Rw2 * R21 = Rw2 * (R12)'

Is my method correct?

PS: (R)' means the transpose of R.

j1897
  • If you use row vectors multiplied on the left of your matrices, I think your formulas are correct. – BConic Sep 13 '14 at 16:32
  • I don't understand what you mean by row vectors in this context. R is a 3*3 matrix. All matrices are 3*3 in this context. – j1897 Sep 15 '14 at 09:27
  • Considering a rotation matrix `R` and a 3D point represented by the 1x3 row vector `M=[X, Y, Z]`, do you compute the rotated point as `M.R` or `R.M'` ? In the first case, your formulas are OK, otherwise you need to transpose everything. – BConic Sep 15 '14 at 09:41
  • I compute it as R.M' which will yield a column vector of size 3 – j1897 Sep 15 '14 at 13:07
  • Actually, transposing everything is not enough to correct your formulas. See my answer for a detailed explanation. – BConic Sep 15 '14 at 18:31

1 Answer


When working with rotation matrices, you have to be extra careful about the source coordinate frame and the destination coordinate frame.

Considering two coordinate frames R1 and R2, you can denote the rotation matrix transforming a point `M_R1`, expressed in R1, to the corresponding point `M_R2`, expressed in R2, by `R_R2<-R1`, such that:

`M_R2 = R_R2<-R1 * M_R1`

This notation is very useful and has two nice properties :

`(R_R2<-R1)^-1 = R_R1<-R2`

`R_R3<-R2 * R_R2<-R1 = R_R3<-R1`
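The two properties can be checked numerically. Below is a small sketch (not part of the original answer) using NumPy, with two arbitrary elementary rotations standing in for the frame-to-frame transforms:

```python
import numpy as np

def rot_z(theta):
    """Rotation about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_x(theta):
    """Rotation about the x-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

R_2_from_1 = rot_z(0.3)   # R_R2<-R1
R_3_from_2 = rot_x(-0.7)  # R_R3<-R2

# Property 1: inverting swaps source and destination frames.
# For a rotation matrix the inverse equals the transpose.
R_1_from_2 = R_2_from_1.T
assert np.allclose(R_1_from_2 @ R_2_from_1, np.eye(3))

# Property 2: composition chains when the inner frames match.
R_3_from_1 = R_3_from_2 @ R_2_from_1  # R_R3<-R2 * R_R2<-R1
M_R1 = np.array([1.0, 2.0, 3.0])
assert np.allclose(R_3_from_1 @ M_R1, R_3_from_2 @ (R_2_from_1 @ M_R1))
```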

Now, with this in mind you should get the answers to your question quite easily. Let's use the following notations regarding your particular problem:

`R0_1 = R0_cam<-world` : rotation matrix from world coordinates to camera coordinates at frame 0

`R0_2 = R0_sensor<-world` : rotation matrix from world coordinates to sensor coordinates at frame 0

`Rt_1 = Rt_cam<-world` : rotation matrix from world coordinates to camera coordinates at frame t

`Rt_2 = Rt_sensor<-world` : rotation matrix from world coordinates to sensor coordinates at frame t

First, you want to find the rotation matrix transforming points in the camera coordinate frame to points in the sensor coordinate frame (i.e. "orientation of real webcam coordinate frame with respect to sensor coordinate frame").

`R0_12 = R0_sensor<-cam = R0_sensor<-world * R0_world<-cam = R0_2 * (R0_1)^-1`
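As a numeric illustration (the matrices below are made-up examples, not from the question), the composition `R0_12 = R0_2 * (R0_1)^-1` can be written with NumPy; note the multiplication order: first undo cam<-world, then apply sensor<-world:

```python
import numpy as np

def rot_y(theta):
    """Rotation about the y-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

R0_1 = rot_y(0.4)      # cam    <- world at frame 0 (e.g. from PnP)
R0_2 = rot_y(1.1)      # sensor <- world at frame 0 (from the sensor)

# For a rotation matrix the inverse is the transpose.
R0_12 = R0_2 @ R0_1.T  # sensor <- cam

# Sanity check: R0_12 maps a camera-frame point to the sensor frame.
M_world = np.array([0.3, -1.2, 2.0])
M_cam = R0_1 @ M_world
M_sensor = R0_2 @ M_world
assert np.allclose(R0_12 @ M_cam, M_sensor)
```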

Then, you want to find the rotation matrix of the webcam in subsequent frames, knowing the rotation matrix of the sensor for each such frame and assuming `Rt_12 = R0_12`.

`Rt_1 = Rt_cam<-world = Rt_cam<-sensor * Rt_sensor<-world = (Rt_sensor<-cam)^-1 * Rt_sensor<-world = (R0_12)^-1 * Rt_2`
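The full recovery step can be sketched end to end. This is a self-contained toy example (all rotations are invented for illustration): the rig's frame-t pose is simulated, then the webcam pose is recovered from the sensor pose alone using the constant extrinsics:

```python
import numpy as np

def rot_z(theta):
    """Rotation about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_x(theta):
    """Rotation about the x-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

# Frame 0: both devices observed directly.
R0_1 = rot_z(0.2)         # cam    <- world (from PnP)
R0_2 = rot_x(0.5) @ R0_1  # sensor <- world (sensor is rigidly mounted)

# Extrinsics between camera and sensor, assumed constant over time.
R_12 = R0_2 @ R0_1.T      # sensor <- cam

# Frame t: the rig has rotated by some unknown amount.
R_motion = rot_z(-0.9)
Rt_1_true = R0_1 @ R_motion  # cam    <- world at frame t (unknown in practice)
Rt_2 = R_12 @ Rt_1_true      # sensor <- world at frame t (measured)

# Recover the webcam pose at frame t without PnP:
Rt_1 = R_12.T @ Rt_2         # (R0_12)^-1 * Rt_2
assert np.allclose(Rt_1, Rt_1_true)
```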

BConic
  • great answer. I find [this link](https://www.gamedev.net/forums/topic/654346-calculate-relative-rotation-matrix/) may be helpful w.r.t. the rotation of row/column vectors – zheyuanWang Mar 25 '21 at 03:45
  • Isn't this answer wrong? `R0_12` would only work for that specific rotation at frame 0. It would not work for other rotations, right? – Oroshimaru Jul 20 '21 at 01:43
  • Remember that `Rt_12 = Rt_sensor<-cam`, i.e. the transform from camera to sensor coordinates, in other words the extrinsics between two coordinate frames attached to the camera body. A priori, assuming this is constant w.r.t. t is reasonable. For `Rt_cam<-world`, this would indeed not be the case. – BConic Jul 21 '21 at 05:11
  • Anyway, the point of the answer is to recommend ultra clear notations when dealing with rotations / coordinates transforms, your question is additional proof that this is important. – BConic Jul 21 '21 at 05:25
  • @BConic Oh ok, thank you! One last question regarding the notation: if I know the orientation of the camera coordinate system with respect to the world, would that be a rotation matrix that rotates from camera to world, Rwc? – Oroshimaru Jul 22 '21 at 22:45