
I am working on finding arm orientation with Azure Kinect body tracking, specifically the orientation angles of the shoulder. I built a rotation matrix from three vectors (shoulder to elbow, shoulder to clavicle, and their cross product) and then computed the corresponding quaternion and Euler angles:

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    # rotation matrix built from the two bone vectors (3D joint-position
    # differences from the Azure Kinect body tracking) and their cross product
    rotation_matrix_right_shoulder = np.array([
        vector_shoulder_to_elbow_right,
        vector_shoulder_to_clavicle_right,
        np.cross(vector_shoulder_to_elbow_right, vector_shoulder_to_clavicle_right)
    ])

    # calculating quaternions
    r = R.from_matrix(rotation_matrix_right_shoulder)
    quat = r.as_quat()

    # converting to euler angles
    euler = r.as_euler('yzx', degrees=True)

    alpha_right_shoulder_degrees = euler[0]
    beta_right_shoulder_degrees = euler[1]
    gamma_right_shoulder_degrees = euler[2]
    # print values

The problem is that the angles I get for this joint jump around and sometimes flip sign even when I move my arm only slightly. They do react to different kinds of arm movement, but I cannot get consistent, usable values. Please help me.

I first tried the classic vector-only approach, computing the angles from the three vectors and their projections onto different axes (roughly like the sketch below), but ended up with similar results. I then switched to computing quaternions, as shown above, but still did not get the desired results.
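For reference, the vector-only attempt looked roughly like the sketch below. This is a minimal reconstruction rather than my exact code; the reference axis and the example joint vector are illustrative assumptions.

    import numpy as np

    def angle_between(v1, v2):
        # angle in degrees between two 3D vectors
        cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

    # shoulder-to-elbow vector; in the real code this comes from the Kinect
    # joint positions, the numbers here are just placeholders
    vector_shoulder_to_elbow_right = np.array([0.10, -0.30, 0.05])

    # project the bone vector onto two coordinate planes of the camera frame
    proj_xy = np.array([vector_shoulder_to_elbow_right[0],
                        vector_shoulder_to_elbow_right[1], 0.0])
    proj_yz = np.array([0.0, vector_shoulder_to_elbow_right[1],
                        vector_shoulder_to_elbow_right[2]])

    # measure each projection against a fixed reference axis (here the -y axis)
    angle_in_xy_plane_deg = angle_between(proj_xy, np.array([0.0, -1.0, 0.0]))
    angle_in_yz_plane_deg = angle_between(proj_yz, np.array([0.0, -1.0, 0.0]))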
