I have two rotation matrices. One is the rotation matrix of a real webcam, which I obtained by solving the PnP problem. I have a world coordinate frame, and I know the location of every point I use in world coordinates.
As far as I understand, a rotation matrix transforms points in world coordinates to camera frame coordinates (not considering translation here). This means that this first rotation matrix gives you the orientation of the world coordinate frame with respect to the camera coordinate frame.
My second rotation matrix is that of a sensor, also expressed relative to the same world coordinate frame; i.e., this rotation matrix gives you the orientation of the world coordinate frame with respect to the sensor coordinate frame.
I want to find the orientation of the real webcam coordinate frame with respect to the sensor coordinate frame.
Let's name the first rotation matrix Rw1 and the second rotation matrix Rw2, with the subscript w1 denoting world with respect to the real webcam and w2 denoting world with respect to the sensor (1 and 2 can be taken to denote the real webcam and the sensor respectively).
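For reference, this is roughly how I get Rw1 in the first frame. A minimal OpenCV/NumPy sketch, where the point correspondences, the intrinsic matrix K, and the distortion coefficients are placeholder values standing in for my real calibration/detection data:

```python
import cv2
import numpy as np

# Known 3D points in the world frame and their detected 2D projections
# in the webcam image (placeholder values; in my case these come from
# my known world points and feature detections).
object_points = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],
                          [0, 0, 1], [1, 0, 1]], dtype=np.float64)
image_points = np.array([[320, 240], [400, 235], [405, 310], [325, 315],
                         [318, 160], [398, 158]], dtype=np.float64)

K = np.array([[800.0, 0.0, 320.0],   # assumed pinhole intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)            # assuming no lens distortion

# solvePnP gives the pose of the world frame relative to the camera:
# X_cam = R * X_world + t
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist_coeffs)
Rw1, _ = cv2.Rodrigues(rvec)         # Rw1: world -> webcam rotation
```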
So my aim is to find R12 (and not R21).
R12 = R1w * Rw2 = (Rw1)' * Rw2
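In NumPy, that composition (exactly as written above) is just a matrix product; Rw2 below is a placeholder for the sensor rotation measured in the same first frame:

```python
# Sensor rotation for the same (first) frame, world -> sensor.
# Placeholder value; in reality this comes from the sensor itself.
Rw2 = np.eye(3)

# R12 = R1w * Rw2 = (Rw1)' * Rw2, as written above.
R12 = Rw1.T @ Rw2

# Sanity check: R12 should still be a proper rotation (orthonormal, det = +1).
assert np.allclose(R12 @ R12.T, np.eye(3), atol=1e-6)
assert np.isclose(np.linalg.det(R12), 1.0, atol=1e-6)
```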
I am assuming that this R12 remains constant throughout the subsequent frames of the video, because the sensor and the webcam are rigidly fixed relative to each other and always move together. Is my assumption valid?
If it is valid, then my ultimate objective is to compute the rotation matrix of the real webcam in the subsequent frames. I can calculate the rotation matrix of the sensor in the subsequent frames, which is Rw2 for those frames. I have to find Rw1, and I cannot use any PnP algorithm; I want to compute it from the currently available information.
Let's consider the second frame for now.
I know R12 (which I am assuming is constant and which I computed from the first frame) and Rw2 (the sensor rotation matrix for the second frame). I have to find Rw1 for the second frame.
Rw1 = Rw2 * R21 = Rw2 * (R12)'
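In code, using the matrices from above, this second-frame step would look like the following (the helper name and Rw2_frame2 are mine; Rw2_frame2 stands in for the sensor rotation measured in the second frame):

```python
def webcam_rotation_from_sensor(Rw2_frame, R12):
    # The relation above, applied to a new frame: Rw1 = Rw2 * (R12)'
    return Rw2_frame @ R12.T

# Placeholder: sensor rotation measured in the second frame.
Rw2_frame2 = np.eye(3)
Rw1_frame2 = webcam_rotation_from_sensor(Rw2_frame2, R12)
```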
Is my method correct?
PS: (R)' means the transpose of R.