I have an IMU sensor that gives me raw data: orientation, angular velocity, and linear acceleration. I'm using ROS and running a Gazebo UUV simulation, and I want to get linear velocity from the raw IMU data. If I just integrate over time, error accumulates and the result becomes inaccurate, for example when the robot makes turns. So if I compensate for gravity like this:

// naive gravity compensation using only roll and pitch
acceleration_x = (msg->linear_acceleration.x + 9.81 * sin(pitch)) * cos(pitch);
acceleration_y = (msg->linear_acceleration.y - 9.81 * sin(roll)) * cos(roll);

and then integrate the linear acceleration,

Velocity_x = Velocity_old_x + acceleration_x * dt;

the result is still bad, because this integrates the acceleration without taking into account any rotation of the sensor, which means the estimate will probably be terrible if the sensor rotates at all. So I need a ROS package that takes all of these transformations into account and gives me the most accurate estimate of the linear velocity possible. Any help? Thanks.
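
(For reference, the "right transform" amounts to rotating the body-frame acceleration into the world frame with the orientation quaternion before removing gravity and integrating. Below is a minimal ROS 1 / C++ sketch of that idea; the class and member names are illustrative, not from any package, and it only fixes the rotation issue, the drift remains.)

    // Sketch: rotate body-frame acceleration into the world frame using the
    // IMU orientation quaternion, remove gravity there, then integrate.
    #include <ros/ros.h>
    #include <sensor_msgs/Imu.h>
    #include <tf2/LinearMath/Quaternion.h>
    #include <tf2/LinearMath/Vector3.h>
    #include <tf2_geometry_msgs/tf2_geometry_msgs.h>

    class VelocityIntegrator
    {
    public:
      void imuCallback(const sensor_msgs::Imu::ConstPtr& msg)
      {
        tf2::Quaternion q;
        tf2::fromMsg(msg->orientation, q);

        // Rotate the measured acceleration from the sensor frame to the world frame.
        tf2::Vector3 a_body(msg->linear_acceleration.x,
                            msg->linear_acceleration.y,
                            msg->linear_acceleration.z);
        tf2::Vector3 a_world = tf2::quatRotate(q, a_body);

        // At rest the accelerometer reads +9.81 m/s^2 along world z, so subtract it.
        a_world.setZ(a_world.z() - 9.81);

        const double t = msg->header.stamp.toSec();
        if (last_stamp_ > 0.0)
        {
          const double dt = t - last_stamp_;
          // Plain Euler integration: orientation error and accelerometer bias
          // still accumulate here, which is why a filter is the usual answer.
          vx_ += a_world.x() * dt;
          vy_ += a_world.y() * dt;
          vz_ += a_world.z() * dt;
        }
        last_stamp_ = t;
      }

    private:
      double vx_{0.0}, vy_{0.0}, vz_{0.0}, last_stamp_{0.0};
    };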

Bob9710
  • I would start from here: https://github.com/Wojtek120/IMU-velocity-and-displacement-measurements – IFeelGood Sep 22 '21 at 12:32
  • OK, but that's a complete device; I don't need an additional device at this stage. I already have an IMU that provides orientation (x, y, z), angular velocity (x, y, z), and linear acceleration (x, y, z). I need an algorithm to do the right transform and filtering to obtain the real linear velocity. – Bob9710 Sep 22 '21 at 12:44
  • What type of device is this on (robot arm, vehicle, helicopter, plane, etc.)? – Sneaky Polar Bear Sep 22 '21 at 14:16
  • It's a UUV, an underwater robot, though at the moment only in Gazebo simulation; later it will be a real one. So I already have a sensor unit on board the robot. I just need the algorithm, or better a whole package, to get the linear velocity using IMU data. – Bob9710 Sep 22 '21 at 14:41
  • Did you read the link to the end? There's a description of the algorithm to use for a single IMU. – IFeelGood Sep 22 '21 at 14:59
  • You mean this? "Block diagram: Below is a complete block diagram of the algorithm for obtaining speed and displacements of the device." – Bob9710 Sep 22 '21 at 15:17
  • The block diagram, right? And no code for that, right? – Bob9710 Sep 22 '21 at 15:17
  • I also found this one: https://robotics.stackexchange.com/questions/16757/what-is-the-algorithm-to-get-position-linear-displacement-and-linear-velocity. So I'm a bit confused: what's the difference between your link and my link? – Bob9710 Sep 22 '21 at 15:19
  • This is really what something like an EKF is for. You can feed just an IMU into the `robot_pose_ekf` ROS package. – BTables Sep 22 '21 at 15:34
  • Which one? And what is an EKF? – Bob9710 Sep 22 '21 at 15:42
  • I don't think robot_pose_ekf can do all the tf transformations and integration needed for obtaining linear velocity. I'm not sure an EKF can do steps 1 to 6 in this link: https://robotics.stackexchange.com/questions/16757/what-is-the-algorithm-to-get-position-linear-displacement-and-linear-velocity – Bob9710 Sep 22 '21 at 15:46
  • I can compare the linear velocity values coming from the EKF and from the algorithm in your link or mine and see the difference, although I don't have ground-truth values. – Bob9710 Sep 22 '21 at 15:58
  • The problem is that integration is needed to obtain velocity from acceleration, and with time that error will accumulate. – Bob9710 Sep 23 '21 at 17:02

1 Answer

I would first recommend that you try fitting your input sensor data into an EKF or UKF node from the robot_localization package. It is the most widely used and most optimized pose-estimation package in the ROS ecosystem.

It is designed to handle 3D sensor input, but you have to configure the parameters yourself (there are no real defaults; it is all configuration). Besides the configuration docs, the GitHub repository has good examples of YAML parameter configurations (Ex.) (you'll want a file separate from the launch file) and example launch files (Ex.).
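
As a rough starting point, a single-IMU config for ekf_localization_node might look like the minimal sketch below; the topic name, frame IDs, and rates are placeholder assumptions, not tested values.

    # Minimal single-IMU config sketch for ekf_localization_node (untested).
    frequency: 30
    sensor_timeout: 0.1
    two_d_mode: false            # a UUV moves in 3D

    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom            # fuse continuous data, publish odom->base_link

    imu0: /imu/data              # placeholder topic name
    # Order: x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az
    imu0_config: [false, false, false,
                  true,  true,  true,
                  false, false, false,
                  true,  true,  true,
                  true,  true,  true]
    imu0_differential: false
    imu0_remove_gravitational_acceleration: true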

If you're talking about minimizing accumulated error, feeding IMU or odometry-velocity data into an EKF/UKF will give you the odom->base_link frame transform, and that is the best you can do, by definition. Absolute pose error will creep in and accumulate unless you have a measurement in an absolute reference frame (e.g. GPS, or a position estimate processed from camera/lidar). Specific to velocity, which you asked about, the same holds one derivative down: unless you have an absolute-reference estimate of your velocity or pose, error will accumulate from just integrating your acceleration, and no algorithm can avoid that.
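
To put a number on it: a constant accelerometer bias of, say, 0.05 m/s² integrates into a velocity error that grows at 0.05 m/s every second, roughly 3 m/s after one minute, no matter how well the rotations are handled.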

If it's an underwater robot, you may be able to attach a velocity / water-flow-speed sensor (e.g. a DVL) to your vehicle. Or you may be able to use a camera / lidar / sonar with processing to get an absolute reference frame, or at least a position difference between execution cycles. Otherwise, your precision and results are limited to the sensors you have.

JWCS