I'm working on a project to recognize gestures with the CyberGlove II.
When I run the GetDataGlove demo (included in the SDK), I get results like this:
Glove:
0 0.38016 -0.13131 -0.12843
1 -0.09696 -0.20426 -0.0753
2 0.1725 0.01804 -0.12612
3 0.36897 -0.30396 0.01051
4 0.31597 -0.273 -0.2964
Tracker:
0 0 0
0 0 0 0
Glove:
0 0.38016 -0.13131 -0.12843
1 -0.09696 -0.20426 -0.0753
2 0.1725 0.01804 -0.12612
3 0.36897 -0.30396 0.01051
4 0.31597 -0.273 -0.2964
Tracker:
0 0 0
0 0 0 0
Glove:
0 0.38016 -0.13131 -0.12843
1 -0.09696 -0.20426 -0.0753
2 0.1725 0.01804 -0.12612
3 0.36897 -0.30396 0.01051
4 0.31597 -0.273 -0.2964
Tracker:
0 0 0
0 0 0 0
...
If I perform the same gesture again, the values are very similar but not identical.
Rows 0 to 4 correspond to the fingers, and the three values in each row correspond to the joints of that finger.
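To make the layout concrete, here is a minimal sketch of how I currently parse one Glove: block of this output into a flat frame of 15 values and compare two frames (the function names are my own, not from the SDK):

```python
import math

def parse_glove_block(lines):
    """Parse the 5 finger rows of one 'Glove:' block into a flat
    list of 15 floats (5 fingers x 3 joint values each)."""
    values = []
    for line in lines:
        parts = line.split()
        # parts[0] is the finger index (0..4); the rest are joint values
        values.extend(float(v) for v in parts[1:])
    return values

def distance(frame_a, frame_b):
    """Euclidean distance between two frames; a small distance
    means the two hand poses are similar."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(frame_a, frame_b)))

block = [
    "0 0.38016 -0.13131 -0.12843",
    "1 -0.09696 -0.20426 -0.0753",
    "2 0.1725 0.01804 -0.12612",
    "3 0.36897 -0.30396 0.01051",
    "4 0.31597 -0.273 -0.2964",
]
frame = parse_glove_block(block)
print(distance(frame, frame))  # 0.0 for identical frames
```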
But I have two questions:
- How do you interpret this data for gesture recognition?
- What are the maximum and minimum values that each sensor returns?
P.S.: What matters most to me is learning how to interpret the sensor data, ideally with a didactic example. I want to recognize gestures from Brazilian Sign Language (Libras).
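To show the kind of didactic sample I mean: a naive sketch of static-pose recognition by nearest-neighbor matching against recorded template frames, with per-sensor min-max normalization (all names and calibration values here are hypothetical, not from the SDK):

```python
import math

def normalize(frame, mins, maxs):
    """Min-max normalize each sensor value to [0, 1], using the
    smallest/largest readings observed during a calibration pass."""
    return [(v - lo) / (hi - lo) if hi > lo else 0.0
            for v, lo, hi in zip(frame, mins, maxs)]

def classify(frame, templates, mins, maxs, threshold=0.5):
    """Return the label of the recorded template frame closest to
    `frame` (Euclidean distance), or None if nothing is close enough."""
    norm = normalize(frame, mins, maxs)
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        t = normalize(template, mins, maxs)
        d = math.sqrt(sum((a - b) ** 2 for a, b in zip(norm, t)))
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label if best_dist <= threshold else None

# Hypothetical calibration ranges and one template frame per gesture:
mins = [-0.5] * 15
maxs = [0.5] * 15
templates = {"pose_a": [0.1] * 15, "pose_b": [-0.2] * 15}
print(classify([0.09] * 15, templates, mins, maxs))  # -> "pose_a"
```

For dynamic signs that involve movement over time, I assume this would have to be extended to work on sequences of frames (e.g., DTW or an HMM) rather than single poses.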