
So I have an acceleration sensor that gives me acceleration data. The device is currently resting at a certain position, so the data looks like this (with some noise per axis):

ACCELX = 264
ACCELY = -43
ACCELZ = 964

Then there's a 3D model representing the device, and "all I want" is for this 3D model to reflect the real device's orientation. In my attempts to understand the use of quaternions in .NET, here's the code I've cobbled together:

/* globals */
Vector3D PrevVector = new Vector3D(0, 0, 0);
ModelVisual3D model; // initialized and model file loaded

private async void TimerEvent() 
{
    RotateTransform3D rot = new RotateTransform3D();
    QuaternionRotation3D q = new QuaternionRotation3D();

    double x = 0, y = 0, z = 0;

    List<Reading> results = await Device.ReadSensor();

    foreach (Reading r in results)
    {
        switch (r.Type)
        {
            case "RPF_SEN_ACCELX":
                x = r.Value;
                break;

            case "RPF_SEN_ACCELY":
                y = r.Value;
                break;

            case "RPF_SEN_ACCELZ":
                z = r.Value;
                break;
        }
    }

    double angle = Vector3D.AngleBetween(new Vector3D(x, y, z), PrevVector);

    q.Quaternion = new Quaternion(new Vector3D(x, y, z), angle);

    rot.Rotation = q;
    model.Transform = rot;

    PrevVector = new Vector3D(x, y, z); 
}

Moving my real device does yield changes in the reported values, but the model on the screen just twitches in seemingly random directions, barely more than the noise would account for and apparently unrelated to how I rotate the real device. I'm fairly sure I'm constructing and using the quaternion incorrectly. How would I do it right?

This is .NET with WPF. There's also HelixToolkit.WPF available, but I haven't seen any function to create quaternions from acceleration data in there. Higher level frameworks such as Unreal Engine or Unity are NOT available for this project.
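
From what I've read so far, I suspect the rotation axis should be something perpendicular to the measured vector (e.g. its cross product with a reference "up" axis) rather than the measured vector itself. Something like this untested sketch (using `System.Windows.Media.Media3D`; `FromGravity` is just a name I made up) is what I have in mind, though I don't know if it's the right direction:

private static Quaternion FromGravity(double x, double y, double z)
{
    // Untested sketch: build a quaternion that rotates the reference "up" axis
    // (0, 0, 1) onto the measured acceleration (gravity) vector (x, y, z).
    Vector3D measured = new Vector3D(x, y, z);
    Vector3D up = new Vector3D(0, 0, 1);

    Vector3D axis = Vector3D.CrossProduct(up, measured); // rotation axis, perpendicular to both
    double angle = Vector3D.AngleBetween(up, measured);  // WPF returns this in degrees

    // If the vectors are (anti)parallel the cross product vanishes and there is
    // no unique axis; fall back to no rotation in that case.
    if (axis.Length < 1e-9)
        return Quaternion.Identity;

    return new Quaternion(axis, angle);
}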

draconigen
  • I am not sure, but the first thing that came to my mind was: do you apply the new values to the model's original position/rotation every iteration, or do you apply them to the previous iteration's result and then the next values on top of that, and so on? To me it looks like the second one, since you update your `PrevVector`. – Martin Backasch Aug 03 '18 at 11:51
  • @MartinBackasch the line `model.Transform = rot;` supposedly sets the [transformation](https://msdn.microsoft.com/en-us/library/system.windows.media.media3d.modelvisual3d.transform.aspx) and updates the model immediately. I quickly verified that by removing this line and observing that the 3d model in my view ceases to twitch at all. – draconigen Aug 03 '18 at 12:08
  • I am not familiar with `ModelVisual3D`, but following my first thought: maybe you can check by trial and error whether `Transform` rotates your model when the quaternion stays the same, or whether it stands still. What are some sample results of `List<Reading> results = await Device.ReadSensor();`? Without moving the device, I guess there are no input values. – Martin Backasch Aug 03 '18 at 12:25

2 Answers


Is your sensor's output rotation absolute (accumulated) or a difference? Sometimes the output rotation is a difference, and you need the previous rotation value plus the difference to calculate the current one.

You can try saving the previous quaternion and combining the current quaternion with the previous one to get the new accumulated rotation.
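
In WPF's `System.Windows.Media.Media3D`, that combination is quaternion multiplication rather than addition. A minimal sketch (the `Accumulate` name and the field holding the previous value are just illustrations; the multiplication order may need to be swapped depending on whether the delta is expressed in world or local space):

// previous: the orientation accumulated so far (start with Quaternion.Identity)
// delta:    this reading's change in orientation
private static Quaternion Accumulate(Quaternion previous, Quaternion delta)
{
    // Rotations compose by multiplication, not addition.
    return previous * delta;
}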

Lance H
  • The output turned out to always be relative to a static position. In the end, the quaternion was indeed not the issue, but rather how the transform was applied to the model: there is a `Transform3DGroup` class that I apparently have to use, otherwise the changes won't apply. I haven't found proper documentation for this, I only noticed it being used in multiple tutorials and answers. – draconigen Aug 30 '18 at 07:46

Turns out my issue was of a completely different nature: I had to utilize a class called Transform3DGroup. This is how the code must be altered to enable a rotation around the Z axis:

/* globals */
ModelVisual3D model; // initialized and model file loaded
Transform3DGroup tg = new Transform3DGroup();

private async void TimerEvent() 
{
    RotateTransform3D rot = new RotateTransform3D();
    QuaternionRotation3D q = new QuaternionRotation3D();

    double x = 0, y = 0, z = 0, angle = 0; // 'angle' must be declared here as well

    List<Reading> results = await Device.ReadSensor();


    foreach (Reading r in results)
    {
        switch (r.Type)
        {
            case "RPF_SEN_ACCELX":
                x = r.Value;
                break;

            case "RPF_SEN_ACCELY":
                y = r.Value;
                break;

            case "RPF_SEN_ACCELZ":
                z = r.Value;
                // compute the heading in the XY plane and turn it into a rotation about the Z axis
                angle = GetAngle(x, y).ToDegrees();
                q.Quaternion = new Quaternion(new Vector3D(0, 0, 1), angle);
                break;
        }

        rot.Rotation = q;

        tg.Children.Clear();
        tg.Children.Add(rot);

        model.Transform = tg; // Use Transform3DGroup!
    }
}

I haven't found any documentation on the obligatory use of Transform3DGroup.

For the sake of completeness, here are the internals of GetAngle(), as derived by Bill Wallis on Math Overflow:

// Returns the angle of the vector (x, y) in radians, measured clockwise from the positive X axis.
private double GetAngle(double x, double y)
{
    if (x > 0)
    {
        return 2 * Math.PI - Math.Atan(y / x);
    }
    else if (x < 0)
    {
        return Math.PI - Math.Atan(y / x);
    }
    else // x == 0
    {
        return 2 * Math.PI - Math.Sign(y) * Math.PI / 2;
    }
}
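
Unless I'm mistaken, the same angle (modulo a full turn) can also be obtained with a single `Math.Atan2` call, which covers all quadrants and the x == 0 case at once; `GetAngleAtan2` is just an illustrative name:

private double GetAngleAtan2(double x, double y)
{
    double a = -Math.Atan2(y, x);         // clockwise angle, in [-π, π)
    return a < 0 ? a + 2 * Math.PI : a;   // normalize to [0, 2π)
}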

And here are the extension methods on the double type for converting between radians and degrees (defined outside of the class, within the namespace):

public static class NumericExtensions
{
    public static double ToRadians(this double val)
    {
        return (Math.PI / 180) * val;
    }
    public static double ToDegrees(this double val)
    {
        return (180 / Math.PI) * val;
    }
}
draconigen