
I am building a glove with an IMU unit on the tip of each finger and one IMU on the palm. What I get from each IMU is a quaternion orientation measurement.

Right now I'm facing two problems with the IMUs' measurements:

  1. The IMUs' reference frames are not the same. If you physically rotate them so that they all report the identity quaternion, their physical orientations end up slightly different.

  2. Each IMU's measurement drifts over time. Held in the same physical orientation, an IMU reports a slightly different quaternion after a minute of random physical rotations.

Both problems boil down to not knowing each IMU's reference frame, which also changes over time; in other words, not knowing which physical orientation corresponds to an IMU's identity quaternion.
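For reference, one approach I've been considering is a one-shot re-referencing step: ask the user to hold a known pose, snapshot each IMU's quaternion, and afterwards report every measurement relative to that snapshot. A minimal sketch (Python with scipy; scipy expects quaternions in [x, y, z, w] order, which may differ from the IMU's convention, and the function names are my own):

```python
from scipy.spatial.transform import Rotation

# Reference orientation of each IMU, captured while the user holds a known
# pose (e.g. hand flat on a table, fingers straight).
reference = {}

def calibrate(imu_id, raw_quat):
    """Snapshot the raw quaternion of one IMU at the calibration pose."""
    reference[imu_id] = Rotation.from_quat(raw_quat)  # [x, y, z, w]

def corrected(imu_id, raw_quat):
    """Orientation of the IMU relative to its calibration-pose orientation.

    q_rel = q_ref^-1 * q_raw: the IMU's unknown internal world frame cancels
    out, so the calibration pose maps to the identity quaternion for every
    sensor, whatever frame each one happened to boot up with.
    """
    q_now = Rotation.from_quat(raw_quat)
    return (reference[imu_id].inv() * q_now).as_quat()
```

This removes the mismatched reference frames at the moment of calibration, but it does nothing about the drift that accumulates afterwards, which is why I'm asking about continuous correction below.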

Can you point me to resources or methods that would help correct this drift?

Another thing that might be useful: I can ask the user to calibrate the IMUs by moving their fingers naturally. Since natural finger movement is restricted to a certain range, I suspect that constraint could help with calibration, but I don't know the relevant topic or how to integrate this information into a calibration procedure.

Do you know of any way to use the limits of finger movement to calibrate the IMUs continuously?
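To make the question concrete, this is the kind of constraint I imagine exploiting: the total rotation of a fingertip IMU relative to the palm IMU should stay within some anatomical limit, so an estimate far outside that range must be drift. A rough sketch of the check (the 120-degree limit and the function names are just my guesses):

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Hypothetical anatomical limit: a fingertip should not end up rotated more
# than roughly 120 degrees away from the palm, whatever the drift says.
MAX_FINGER_ANGLE_DEG = 120.0

def finger_angle_deg(q_palm, q_finger):
    """Total rotation angle between the palm IMU and a fingertip IMU.

    Quaternions are in scipy's [x, y, z, w] order. The relative rotation
    palm -> finger does not depend on a shared world frame.
    """
    r_rel = Rotation.from_quat(q_palm).inv() * Rotation.from_quat(q_finger)
    return np.degrees(r_rel.magnitude())

def drift_suspected(q_palm, q_finger):
    """Flag samples whose implied finger pose violates the assumed limit."""
    return finger_angle_deg(q_palm, q_finger) > MAX_FINGER_ANGLE_DEG
```

What I don't know is how to go from detecting such violations to actually correcting the per-IMU offsets continuously.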

off99555
  • Start by investigating "dead reckoning" for IMUs, along with "IMU fusion" and "sensor fusion". Some publications, like this one, estimate human trajectory with an IMU mounted on the foot: http://www.car.upm-csic.es/lopsi/static/publicaciones/Congreso%20Internacional/WISP2009Jimenez.pdf Basics of IMU fusion: https://arxiv.org/pdf/1704.06053.pdf And available systems with complete hand tracking: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.701.4107&rep=rep1&type=pdf – minorlogic Oct 19 '18 at 13:15
  • Thank you very much. I'm looking into it! – off99555 Oct 19 '18 at 13:27
  • Also, I'm more interested in rotation (yaw and pitch) than in position, since rotation can be applied directly to the finger joints. I just want to calibrate. – off99555 Oct 19 '18 at 13:38
  • Attitude estimation is an "old", solved problem in IMU fusion. It is often used to estimate a robot manipulator's position and rotation. Knowing the joints and the attitudes from IMU fusion, we can recover the full state of a manipulator or a hand. IMU fusion usually estimates rotation and compensates drift using the gravity direction from the accelerometer. – minorlogic Oct 22 '18 at 08:01
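As an illustration of the gravity-based drift compensation mentioned in the last comment: a complementary filter integrates the gyroscope for fast motion and slowly pulls roll and pitch back toward the tilt implied by the accelerometer's gravity reading. A minimal sketch, assuming raw gyroscope and accelerometer samples are available (which the IMUs in the question may not expose) and with an illustrative gain:

```python
import numpy as np

ALPHA = 0.98  # illustrative gain: trust the gyro on short timescales

def complementary_update(roll, pitch, gyro, accel, dt):
    """One filter step for roll/pitch in radians.

    gyro  -- body angular rates [gx, gy, gz] in rad/s
    accel -- measured specific force [ax, ay, az] in m/s^2
    The accelerometer term bounds roll/pitch drift; yaw cannot be corrected
    this way and needs a magnetometer or a constraint such as joint limits.
    """
    # Propagate with the gyro (small-angle step, ignores Euler-rate coupling).
    roll_g = roll + gyro[0] * dt
    pitch_g = pitch + gyro[1] * dt

    # Tilt implied by the gravity direction seen by the accelerometer.
    ax, ay, az = accel
    roll_a = np.arctan2(ay, az)
    pitch_a = np.arctan2(-ax, np.hypot(ay, az))

    # Blend: mostly gyro, with a slow correction toward the accelerometer.
    return (ALPHA * roll_g + (1.0 - ALPHA) * roll_a,
            ALPHA * pitch_g + (1.0 - ALPHA) * pitch_a)
```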

0 Answers