
I'm working on a project to recognize gestures with the CyberGlove II.

When I run the GetDataGlove demo (included in the SDK), I get results like these:

Glove:
0 0.38016 -0.13131 -0.12843
1 -0.09696 -0.20426 -0.0753
2 0.1725 0.01804 -012612
3 0.36897 -0.30396 0.01051
4 0.31597 -0.273 -0.2964
Tracker:
0 0 0
0 0 0 0

Glove:
0 0.38016 -0.13131 -0.12843
1 -0.09696 -0.20426 -0.0753
2 0.1725 0.01804 -012612
3 0.36897 -0.30396 0.01051
4 0.31597 -0.273 -0.2964
Tracker:
0 0 0
0 0 0 0

Glove:
0 0.38016 -0.13131 -0.12843
1 -0.09696 -0.20426 -0.0753
2 0.1725 0.01804 -012612
3 0.36897 -0.30396 0.01051
4 0.31597 -0.273 -0.2964
Tracker:
0 0 0
0 0 0 0

...
  • If I perform the same gesture again, the values above will be very similar, but not identical.

  • The rows 0 to 4 represent each finger, and the three values in each row represent the joints of that finger (see the parsing sketch after this list).
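
For illustration, here is a minimal sketch (in Python, independent of the CyberGlove SDK) of how one "Glove:" block like the output above could be turned into a flat feature vector. It assumes exactly the text layout shown above: five finger rows with three joint values each.

    # Minimal sketch: turn one "Glove:" block (5 fingers x 3 joints) into a
    # 15-element feature vector. Assumes the text layout shown above.
    def parse_glove_block(lines):
        """lines: the five finger rows, e.g. '0 0.38016 -0.13131 -0.12843'."""
        features = []
        for line in lines:
            parts = line.split()
            # parts[0] is the finger index (0..4); the rest are the joint values
            features.extend(float(v) for v in parts[1:4])
        return features  # 15 values: finger0_joint0, finger0_joint1, ...

    sample = [
        "0 0.38016 -0.13131 -0.12843",
        "1 -0.09696 -0.20426 -0.0753",
        "2 0.1725 0.01804 -0.12612",   # decimal point restored for finger 2
        "3 0.36897 -0.30396 0.01051",
        "4 0.31597 -0.273 -0.2964",
    ]
    print(parse_glove_block(sample))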

But I have two questions:

  1. How do you interpret this data for gesture recognition?
  2. What are the maximum and minimum values that each sensor returns?

P.S.: What matters most to me is learning how to interpret the sensor data, ideally with a didactic example. I want to recognize gestures from Brazilian Sign Language (Libras).

Haroldo Gondim

1 Answer


The CyberGlove API doesn't offer anything native for this kind of recognition. Since the data for the same gesture varies between repetitions, I had to use machine learning. I used Microsoft Azure Machine Learning, and the service communicates via JSON.
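
As a rough sketch of what calling such a service can look like (this is not the answerer's actual code): a published Azure ML Studio web service exposes a REST endpoint that accepts a JSON payload and an API key. The endpoint URL, API key, and column names below are placeholders, and the payload layout assumes the classic Azure ML Studio request/response format.

    import json
    import urllib.request

    # Placeholders: take these from the web service's "Consume" page.
    URL = "https://<region>.services.azureml.net/workspaces/<ws>/services/<id>/execute?api-version=2.0"
    API_KEY = "<your-api-key>"

    def classify(features):
        """Send one 15-value feature vector as JSON and return the raw reply.
        Payload layout assumes the classic Azure ML Studio schema."""
        body = {
            "Inputs": {
                "input1": {
                    "ColumnNames": ["f%d" % i for i in range(len(features))],
                    "Values": [[str(v) for v in features]],
                }
            },
            "GlobalParameters": {},
        }
        req = urllib.request.Request(
            URL,
            data=json.dumps(body).encode("utf-8"),
            headers={
                "Content-Type": "application/json",
                "Authorization": "Bearer " + API_KEY,
            },
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read().decode("utf-8"))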

First, I had to create a .csv file mapping the gestures I wanted to recognize. I recorded each gesture 100 times; this amount proved reasonable for recognition, and the performance was also quite good.
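
The training file can simply be one row per recorded sample, with the joint values followed by the gesture label. A minimal sketch (the file name, column order, and example label are mine, not from the original answer):

    import csv

    # Append one labeled sample (15 joint values + gesture name) to the training file.
    # File name and column order are illustrative.
    def append_sample(features, label, path="gestures.csv"):
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow(list(features) + [label])

    # Example: record one sample labeled "A" (e.g. a Libras letter).
    append_sample([0.38016, -0.13131, -0.12843,
                   -0.09696, -0.20426, -0.0753,
                   0.1725, 0.01804, -0.12612,
                   0.36897, -0.30396, 0.01051,
                   0.31597, -0.273, -0.2964], "A")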

Haroldo Gondim