I have an accelerometer that reports per-axis acceleration. The device is currently resting in a fixed position, so the data looks like this (with some noise on each axis):
ACCELX = 264
ACCELY = -43
ACCELZ = 964
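For context, the magnitude of that resting vector is roughly 1000, so the readings appear to be in milli-g and the vector is essentially gravity. From what I've read, gravity alone can give pitch and roll but never yaw (rotation about the vertical axis is invisible to an accelerometer at rest). Here's a standalone console sanity check I wrote (no WPF involved; the tilt formulas are the standard ones I found, assuming Z points up when the device lies flat):

```csharp
using System;

class TiltCheck
{
    static void Main()
    {
        // The resting readings from above.
        double ax = 264, ay = -43, az = 964;

        // Magnitude: comes out close to 1000, so 1 g == 1000 units.
        double norm = Math.Sqrt(ax * ax + ay * ay + az * az);
        Console.WriteLine($"|a| = {norm:F1}");

        // Standard tilt-from-gravity formulas (Z up at rest).
        // Yaw cannot be computed from gravity alone.
        double rollDeg  = Math.Atan2(ay, az) * 180.0 / Math.PI;
        double pitchDeg = Math.Atan2(-ax, Math.Sqrt(ay * ay + az * az)) * 180.0 / Math.PI;
        Console.WriteLine($"roll = {rollDeg:F1} deg, pitch = {pitchDeg:F1} deg");
    }
}
```

So at rest the device seems to be tilted roughly 15° about one axis and 2–3° about the other, which matches how it is physically sitting.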
Then there's a 3D model representing the device, and "all I want" is for this 3D model to mirror the real device's orientation. In my attempts to understand the usage of quaternions in .NET, here's the code I've cobbled together:
/* globals */
Vector3D PrevVector = new Vector3D(0, 0, 0);
ModelVisual3D model; // initialized and model file loaded

private async void TimerEvent()
{
    RotateTransform3D rot = new RotateTransform3D();
    QuaternionRotation3D q = new QuaternionRotation3D();
    double x = 0, y = 0, z = 0;

    List<Reading> results = await Device.ReadSensor();
    foreach (Reading r in results)
    {
        switch (r.Type)
        {
            case "RPF_SEN_ACCELX":
                x = r.Value;
                break;
            case "RPF_SEN_ACCELY":
                y = r.Value;
                break;
            case "RPF_SEN_ACCELZ":
                z = r.Value;
                break;
        }
    }

    double angle = Vector3D.AngleBetween(new Vector3D(x, y, z), PrevVector);
    q.Quaternion = new Quaternion(new Vector3D(x, y, z), angle);
    rot.Rotation = q;
    model.Transform = rot;
    PrevVector = new Vector3D(x, y, z);
}
Moving the real device does change the reported values, but the model on screen just twitches in seemingly random directions, barely more than the noise would explain and apparently unrelated to how I rotate the real device. I'm fairly sure I'm constructing and using the quaternions incorrectly. How would I do it right?
This is .NET with WPF. HelixToolkit.WPF is also available, but I haven't found any function in it that creates quaternions from acceleration data. Higher-level frameworks such as Unreal Engine or Unity are NOT available for this project.
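For what it's worth, here is my current best guess at a fix (untested, so I may well be misunderstanding something): rotate the model's resting "up" axis onto each new gravity vector as an absolute transform, instead of chaining a relative rotation per sensor read. Since the first argument of WPF's Quaternion(Vector3D, Double) constructor is the rotation axis, I'd pass the cross product of the two vectors rather than the measured vector itself. Variable names match my code above:

```csharp
// Guess at a fix (untested): treat the measured vector as gravity and
// rotate the model's resting up-axis onto it as an ABSOLUTE transform,
// instead of accumulating a relative rotation every timer tick.
Vector3D up = new Vector3D(0, 0, 1);           // model's axis when at rest
Vector3D g = new Vector3D(x, y, z);            // latest accelerometer vector
Vector3D axis = Vector3D.CrossProduct(up, g);  // rotation axis: perpendicular to both
double angle = Vector3D.AngleBetween(up, g);   // WPF returns degrees

if (axis.Length > 1e-9) // vectors (anti)parallel => rotation axis undefined
{
    Quaternion q = new Quaternion(axis, angle);
    model.Transform = new RotateTransform3D(new QuaternionRotation3D(q));
}
```

Is this the right general direction, or is there a better-established way to go from an accelerometer vector to a quaternion in WPF?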