
I am trying to get the coordinates of the leftEye and rightEye and draw a line from leftEye to rightEye, but the coordinates returned are null. I am using a Nexus 4 to test the application. Does the Nexus 4 not support this feature? I can draw a rectangle around the detected face without any problem. For reference, I have attached my code for detecting the eye coordinates.

try {
    float x1 = detectedFaces[i].leftEye.x;
    float y1 = detectedFaces[i].leftEye.y;
    float x2 = detectedFaces[i].rightEye.x; // was rightEye.y, a typo that reused the y coordinate
    float y2 = detectedFaces[i].rightEye.y;

    // Convert from driver coordinates ([-1000, 1000]) to view coordinates
    float Xx1 = (x1 + 1000) * vWidth / 2000;
    float Yy1 = (y1 + 1000) * vHeight / 2000;
    float Xx2 = (x2 + 1000) * vWidth / 2000;
    float Yy2 = (y2 + 1000) * vHeight / 2000;

    canvas.drawLine(Xx1, Yy1, Xx2, Yy2, drawingPaint);
} catch (Exception e) {
    Log.e(TAG, "Error: " + e.getMessage());
}
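
The conversion above follows the [-1000, 1000] driver coordinate space that the Camera.Face documentation describes. For reference, the docs also sketch an equivalent Matrix-based mapping along the lines below; mirror and displayOrientation are assumed here to come from the camera facing and the rotation passed to setDisplayOrientation():

Matrix matrix = new Matrix();
// Mirror horizontally for a front-facing camera.
matrix.setScale(mirror ? -1 : 1, 1);
// Rotate by the same amount passed to setDisplayOrientation(int).
matrix.postRotate(displayOrientation);
// Driver coordinates run from (-1000, -1000) to (1000, 1000);
// UI coordinates run from (0, 0) to (vWidth, vHeight).
matrix.postScale(vWidth / 2000f, vHeight / 2000f);
matrix.postTranslate(vWidth / 2f, vHeight / 2f);

float[] line = { x1, y1, x2, y2 };
matrix.mapPoints(line);
canvas.drawLine(line[0], line[1], line[2], line[3], drawingPaint);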

Logcat

11-15 16:37:52.895: E/Take_Picture(1304): Error: null
11-15 16:37:53.115: E/Take_Picture(1304): Error: null
11-15 16:37:53.286: E/Take_Picture(1304): Error: null

1 Answer


Reading the Android reference for Camera.Face.leftEye: "This is an optional field, may not be supported on all devices. If not supported, the value will always be set to null."
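
In practice that means you must null-check the eye points before using them. A minimal sketch inside a Camera.FaceDetectionListener, where drawEyeLine() is a hypothetical helper standing in for your drawing code:

@Override
public void onFaceDetection(Camera.Face[] faces, Camera camera) {
    for (Camera.Face face : faces) {
        // face.rect is always populated, but leftEye, rightEye and mouth
        // are optional and stay null on devices without support.
        if (face.leftEye != null && face.rightEye != null) {
            drawEyeLine(face.leftEye, face.rightEye); // hypothetical helper
        } else {
            Log.w(TAG, "Eye coordinates not supported on this device");
        }
    }
}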

I've tested on several devices (Nexus 4 and 5, Samsung Galaxy S4 and S5, Sony Xperia, HTC One, etc.) and it never worked. This might be a good question for the Android dev team at Google.

One option you have would be to use the OpenCV library (http://opencv.org/platforms/android.html).
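
A minimal sketch of that route, assuming OpenCV 3.x Java bindings and a copy of the stock haarcascade_eye.xml cascade extracted to a readable path (the class name and path handling here are illustrative, not part of any existing API):

import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Point;
import org.opencv.core.Rect;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;
import org.opencv.objdetect.CascadeClassifier;

public class EyeLineDetector {
    private final CascadeClassifier eyeCascade;

    public EyeLineDetector(String cascadePath) {
        // e.g. haarcascade_eye.xml copied from the OpenCV SDK to app storage
        eyeCascade = new CascadeClassifier(cascadePath);
    }

    // Detects eyes in the frame and draws a line between the centers
    // of the first two found.
    public void drawEyeLine(Mat rgbaFrame) {
        Mat gray = new Mat();
        Imgproc.cvtColor(rgbaFrame, gray, Imgproc.COLOR_RGBA2GRAY);

        MatOfRect eyes = new MatOfRect();
        eyeCascade.detectMultiScale(gray, eyes);

        Rect[] found = eyes.toArray();
        if (found.length >= 2) {
            Imgproc.line(rgbaFrame, center(found[0]), center(found[1]),
                    new Scalar(0, 255, 0, 255), 3);
        }
    }

    private static Point center(Rect r) {
        return new Point(r.x + r.width / 2.0, r.y + r.height / 2.0);
    }
}

(In the older 2.4 bindings the drawing call is Core.line instead of Imgproc.line.)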

Another option would be to do the detection after taking the picture, using the android.media.FaceDetector class, instead of doing it live; I'm not sure if you're looking to do it live.

See an example:

import android.app.Activity;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.PointF;
import android.media.FaceDetector;
import android.os.Bundle;
import android.util.Log;
import android.view.ViewGroup.LayoutParams;

public class TutorialOnFaceDetect1 extends Activity {
    private MyImageView mIV; // custom ImageView that draws the feature points
    private Bitmap mFaceBitmap;
    private int mFaceWidth = 200;
    private int mFaceHeight = 200;
    private static final int MAX_FACES = 10;
    private static String TAG = "TutorialOnFaceDetect";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        mIV = new MyImageView(this);
        setContentView(mIV, new LayoutParams(LayoutParams.WRAP_CONTENT, LayoutParams.WRAP_CONTENT));

        // load the photo; FaceDetector requires an RGB_565 bitmap
        Bitmap b = BitmapFactory.decodeResource(getResources(), R.drawable.face3);
        mFaceBitmap = b.copy(Bitmap.Config.RGB_565, true);
        b.recycle();

        mFaceWidth = mFaceBitmap.getWidth();
        mFaceHeight = mFaceBitmap.getHeight();
        mIV.setImageBitmap(mFaceBitmap);

        // perform face detection and set the feature points
        setFace();

        mIV.invalidate();
    }

    public void setFace() {
        FaceDetector fd;
        FaceDetector.Face[] faces = new FaceDetector.Face[MAX_FACES];
        PointF midpoint = new PointF();
        int[] fpx = null;
        int[] fpy = null;
        int count = 0;

        try {
            fd = new FaceDetector(mFaceWidth, mFaceHeight, MAX_FACES);
            count = fd.findFaces(mFaceBitmap, faces);
        } catch (Exception e) {
            Log.e(TAG, "setFace(): " + e.toString());
            return;
        }

        // check if we detected any faces
        if (count > 0) {
            fpx = new int[count];
            fpy = new int[count];

            for (int i = 0; i < count; i++) {
                try {
                    // the point midway between the eyes
                    faces[i].getMidPoint(midpoint);

                    fpx[i] = (int) midpoint.x;
                    fpy[i] = (int) midpoint.y;
                } catch (Exception e) {
                    Log.e(TAG, "setFace(): face " + i + ": " + e.toString());
                }
            }
        }

        mIV.setDisplayPoints(fpx, fpy, count, 0);
    }
}
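
FaceDetector.Face only exposes getMidPoint() and eyesDistance(), so if you still want the eye-to-eye line, you can approximate the two endpoints from those. A rough sketch, assuming the eyes sit on a horizontal line (i.e. ignoring head tilt):

// Approximate the eye endpoints from the midpoint and eye distance,
// then draw the line between them. Assumes no head tilt.
PointF mid = new PointF();
faces[i].getMidPoint(mid);
float half = faces[i].eyesDistance() / 2f;

canvas.drawLine(mid.x - half, mid.y, mid.x + half, mid.y, drawingPaint);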
