I am exploring the Mobile Vision APIs. I am working on the Face Tracker example and looking for a way to find out whether the mouth is open or not, e.g. when a person is yawning. There is no direct method like face.getIsLeftEyeOpenProbability();

So I am thinking that I need to find the x,y coordinates of both the left and right mouth landmarks, compute the difference, and figure out whether the mouth is open or not. I am not sure whether this will work.

But is there any other way to find out whether the mouth is open or closed?
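
Roughly, this is the idea I have in mind (just a sketch; I am not sure the corner positions alone are enough):

// Sketch: look up the mouth corner landmarks and compare their positions.
void logMouthCorners(Face face) {
    PointF leftMouth = null, rightMouth = null;
    for (Landmark landmark : face.getLandmarks()) {
        if (landmark.getType() == Landmark.LEFT_MOUTH) {
            leftMouth = landmark.getPosition();
        } else if (landmark.getType() == Landmark.RIGHT_MOUTH) {
            rightMouth = landmark.getPosition();
        }
    }
    if (leftMouth != null && rightMouth != null) {
        float dx = rightMouth.x - leftMouth.x;
        float dy = rightMouth.y - leftMouth.y;
        Log.i(TAG, "mouth corner difference: dx=" + dx + " dy=" + dy);
        // TODO: figure out what difference indicates an open mouth
    }
}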

Hardik Trivedi

5 Answers


The Mobile Vision API does not directly support mouth open/close detection, but the code below may help. I have tested it and it works like a charm on my device.

@Override
public void draw(Canvas canvas) {

    Face face = mFace;

    if (face == null) {
        return;
    }

    // Proceed only if all three mouth landmarks were detected on this frame.
    if (contains(face.getLandmarks(), Landmark.BOTTOM_MOUTH) != -1
            && contains(face.getLandmarks(), Landmark.LEFT_MOUTH) != -1
            && contains(face.getLandmarks(), Landmark.RIGHT_MOUTH) != -1) {

        Log.i(TAG, "draw: Mouth Open >> found all the points");

        // FaceTrackerActivity.mIsFrontFacing is a flag in the host activity
        // telling which camera is in use; translateX/translateY already handle
        // mirroring, so both cameras take the same path below.

        // Bottom mouth landmark, translated to view coordinates.
        PointF bottom = face.getLandmarks()
                .get(contains(face.getLandmarks(), Landmark.BOTTOM_MOUTH)).getPosition();
        int cBottomMouthX = (int) translateX(bottom.x);
        int cBottomMouthY = (int) translateY(bottom.y);
        Log.i(TAG, "draw: Bottom mouth >> x >> " + cBottomMouthX + "    y >> " + cBottomMouthY);
        canvas.drawCircle(cBottomMouthX, cBottomMouthY, 10, mPaint);

        // Left mouth corner.
        PointF left = face.getLandmarks()
                .get(contains(face.getLandmarks(), Landmark.LEFT_MOUTH)).getPosition();
        int cLeftMouthX = (int) translateX(left.x);
        int cLeftMouthY = (int) translateY(left.y);
        Log.i(TAG, "draw: Left mouth >> x >> " + cLeftMouthX + "    y >> " + cLeftMouthY);
        canvas.drawCircle(cLeftMouthX, cLeftMouthY, 10, mPaint);

        // Right mouth corner.
        PointF right = face.getLandmarks()
                .get(contains(face.getLandmarks(), Landmark.RIGHT_MOUTH)).getPosition();
        int cRightMouthX = (int) translateX(right.x);
        int cRightMouthY = (int) translateY(right.y);
        Log.i(TAG, "draw: Right mouth >> x >> " + cRightMouthX + "    y >> " + cRightMouthY);
        canvas.drawCircle(cRightMouthX, cRightMouthY, 10, mPaint);

        // Midpoint of the two mouth corners, nudged 20 px up toward the upper lip.
        float centerPointX = (cLeftMouthX + cRightMouthX) / 2f;
        float centerPointY = (cLeftMouthY + cRightMouthY) / 2f - 20;
        canvas.drawCircle(centerPointX, centerPointY, 10, mPaint);

        // How far the bottom lip sits below that midpoint.
        float differenceX = centerPointX - cBottomMouthX;
        float differenceY = centerPointY - cBottomMouthY;
        Log.i(TAG, "draw: difference X >> " + differenceX + "     Y >> " + differenceY);

        // The 60 px threshold was tuned empirically on one device;
        // adjust it for your preview resolution.
        if (differenceY < -60) {
            Log.i(TAG, "draw: difference - Mouth is OPENED");
        } else {
            Log.i(TAG, "draw: difference - Mouth is CLOSED");
        }
    }
}

And here is the helper method it uses:

// Returns the index of the landmark with the given type, or -1 if that
// landmark was not detected on this frame.
int contains(List<Landmark> list, int type) {
    for (int i = 0; i < list.size(); i++) {
        if (list.get(i).getType() == type) {
            return i;
        }
    }
    return -1;
}

P.S. - This code finds the center point between the left and right mouth corners and computes the difference between the bottom-mouth coordinates and that center point.
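
Condensed into a standalone helper, the geometric check above looks roughly like this (a sketch with the same empirically tuned offsets; isMouthOpen is my own name, not part of the sample):

// Sketch of the check above as a standalone helper. The arguments are the
// translated view coordinates of the three mouth landmarks.
boolean isMouthOpen(float leftX, float leftY, float rightX, float rightY,
                    float bottomX, float bottomY) {
    // Midpoint of the mouth corners, nudged 20 px up toward the upper lip.
    float centerX = (leftX + rightX) / 2f;
    float centerY = (leftY + rightY) / 2f - 20;
    // Open if the bottom lip sits more than 60 px below that midpoint
    // (screen Y grows downward).
    return (centerY - bottomY) < -60;
}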

Kuls
  • Hello @KulsDroid, may I know where you are getting this mIsFrontFacing field? I am getting an error on if (FaceTrackerActivity.mIsFrontFacing) { – Shadow Nov 06 '17 at 17:07

We can detect whether the mouth is open using the angle formed by the mouthLeftPosition, mouthRightPosition, and mouthBottomPosition landmarks.

Calculate the angle using the law of cosines, where AB, AC, and BC are the side lengths of the triangle formed by the three mouth landmarks:

// Law of cosines: the angle at vertex A of the triangle with side
// lengths AB, AC (the sides meeting at A) and BC (the side opposite A).
double ratio = (AB * AB + AC * AC - BC * BC) / (2 * AC * AB);
double degree = Math.acos(ratio) * (180 / Math.PI);

if (degree < 110) {
    System.out.println("Mouth is open");
} else {
    System.out.println("Mouth is closed");
}
Developer
  • I know this is super old, but.. WHAT? This looks like it'd be really helpful, but I have no idea what AB, AC, BC represent. If you see this it'd be awesome if you edited with some clarification. – Lenny Feb 17 '22 at 06:39
  • For anyone else finding this. This is a really good breakdown and I was able to get pretty reliable mouth detection by computing the angles on a triangle formed by the 3 points of the mouth. https://www.geeksforgeeks.org/find-angles-given-triangle/ – Lenny Feb 17 '22 at 07:32

Unfortunately, the Mobile Vision API doesn't support mouth open detection.

liuyl

The API does allow tracking facial landmarks for the left, right, and bottom of the mouth:

https://developers.google.com/android/reference/com/google/android/gms/vision/face/Landmark

But no, there isn't explicit mouth open detection in the API.
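
If you go this route, note that landmark detection must be enabled on the detector first; a minimal setup looks something like this (a sketch using the standard builder API):

// Enable landmark detection so the mouth landmarks are reported.
FaceDetector detector = new FaceDetector.Builder(context)
        .setTrackingEnabled(true)
        .setLandmarkType(FaceDetector.ALL_LANDMARKS)
        .build();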

pm0733464

Calculating the distance between the left and right mouth points will not work well in every case: when the user is far from the camera, that distance is shorter than when the user is close, so there is no constant threshold for detecting an open mouth. I think a better way is to calculate the angle between the left->bottom and right->bottom mouth lines; when that angle decreases, it indicates a mouth-open event.

I have been trying to achieve the same thing, and there is no simple built-in way to get it, so I came up with this solution and will implement it on my end as well.
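
A rough sketch of that angle computation (a helper of my own, using the vectors from the bottom-mouth landmark to each corner; since the angle is scale-invariant, it should not change with distance from the camera):

// Angle in degrees at the bottom-mouth point between the lines
// bottom->left and bottom->right. A smaller angle suggests an open mouth.
static double mouthOpenAngle(PointF left, PointF right, PointF bottom) {
    float v1x = left.x - bottom.x,  v1y = left.y - bottom.y;
    float v2x = right.x - bottom.x, v2y = right.y - bottom.y;
    double dot = v1x * v2x + v1y * v2y;
    double norms = Math.hypot(v1x, v1y) * Math.hypot(v2x, v2y);
    return Math.toDegrees(Math.acos(dot / norms));
}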