
I would like to compare my poses obtained from a webcam to that of a pose obtained from an image. The base code for the pose estimation is from: https://github.com/opencv/opencv/blob/master/samples/dnn/openpose.py

How can I compare my own poses live-time with an image's pose, and return True if the two poses match within some threshold?

For instance, if I put my arms in a certain position to match an image of someone doing the same, how could I get a result of how close the match is?

What would be a way of doing this / where could I find more information on this?

Steak

1 Answer


[Image: keypoint diagram of the detected human pose, with joints indexed 0 to 17]

As you can see in the keypoint diagram, the joints of the detected human pose are indexed from 0 to 17.

You can use the L2 distance to measure the distance between each pair of corresponding joints in the two poses.

E.g., for the x-coordinate of the 0-th joint, with J0 and J1 being the keypoint arrays of the two poses, the squared difference is:

(J0[0] - J1[0])*(J0[0] - J1[0])
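A minimal sketch of this comparison, assuming each pose is a list of (x, y) joint coordinates indexed 0–17 (with `None` for undetected joints). The function names `pose_distance` and `poses_match` and the threshold value are illustrative, not part of `openpose.py`:

```python
import math

def pose_distance(pose_a, pose_b):
    """Average L2 distance over joints detected in both poses."""
    total = 0.0
    count = 0
    for j_a, j_b in zip(pose_a, pose_b):
        if j_a is None or j_b is None:
            continue  # skip joints missing in either pose
        total += math.hypot(j_a[0] - j_b[0], j_a[1] - j_b[1])
        count += 1
    return total / count if count else float("inf")

def poses_match(pose_a, pose_b, threshold=30.0):
    """True if the average per-joint distance is below the threshold (pixels)."""
    return pose_distance(pose_a, pose_b) < threshold
```

Since the webcam frame and the reference image will generally differ in scale and position, it usually helps to normalize both poses first (e.g., translate so a reference joint such as the neck is at the origin, and divide by torso length) before applying the threshold.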

More about the output of openpose.

Actually, openpose gives you not only (x, y) but also a confidence score from 0 to 1. You can factor this score into the comparison.

For example, in my project:

(J0[0] - J1[0])*(J0[0] - J1[0])*confidence
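A sketch of the confidence-weighted idea, extended from the single-coordinate term above to both coordinates. It assumes each joint is an (x, y, confidence) triple; weighting each squared distance by the product of the two confidences (so a joint uncertain in either pose counts less) is one reasonable choice, not the only one:

```python
def weighted_pose_distance(pose_a, pose_b):
    """Confidence-weighted mean of squared L2 distances between joints."""
    total = 0.0
    weight_sum = 0.0
    for (xa, ya, ca), (xb, yb, cb) in zip(pose_a, pose_b):
        w = ca * cb  # down-weight joints with low confidence in either pose
        dx, dy = xa - xb, ya - yb
        total += w * (dx * dx + dy * dy)  # squared L2, as in the answer
        weight_sum += w
    return total / weight_sum if weight_sum else float("inf")
```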
Frank
  • How did you get the confidence score? – Steak Mar 08 '21 at 00:09
  • @Steak Hi, you can print the shape of the output of openpose. It is (25, 3); the 3rd column is the confidence score. – Frank Mar 08 '21 at 00:30
  • I apologize - could you provide an example of where you print out the output of openpose? What parameter is that? I cannot seem to find it. – Steak Mar 09 '21 at 00:10