
I'm a newbie at coding for Android (but not totally new to Java) and I'm trying to make a simple camera app for Google Glass. I want it to automatically take a photo every couple of minutes and process the input, but Google Glass forces you to "tap to accept" for every photo if you use the native Camera implementation. So I'm trying to use the Android Camera API instead to take photos so I can skip this "tap to accept".

However, while the preview shows, the PictureCallback is never called, so a NullPointerException is thrown when the result is sent back to the main Activity.

The current code is a jumble of all sorts of potential work-arounds on the web, sorry if it's messy!

My Camera Activity class:

package com.example.cerveau.blah;

import android.app.Activity;
import android.content.Intent;
import android.hardware.Camera;
import android.hardware.Camera.PictureCallback;
import android.net.Uri;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.widget.FrameLayout;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class CameraActivity extends Activity {

    private Camera mCamera;
    private CameraPreview mPreview;
    private Intent resultIntent;
    private PictureCallback mPicture;
    public static final int MEDIA_TYPE_IMAGE = 1;
    public static final int MEDIA_TYPE_VIDEO = 2;
    private static final String TAG = "CameraActivity";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.recognize_places);

        // Create an instance of Camera
        mCamera = getCameraInstance();

        // Make the callback
        mPicture = new PictureCallback() {

            private static final String TAG = "PictureCallback";

            @Override
            public void onPictureTaken(byte[] data, Camera camera) {

                File pictureFile = getOutputMediaFile(MEDIA_TYPE_IMAGE);
                if (pictureFile == null){
                    Log.d(TAG, "Error creating media file, check storage permissions: ");
                    return;
                }

                try {
                    FileOutputStream fos = new FileOutputStream(pictureFile);
                    fos.write(data);
                    fos.close();
                } catch (FileNotFoundException e) {
                    Log.d(TAG, "File not found: " + e.getMessage());
                } catch (IOException e) {
                    Log.d(TAG, "Error accessing file: " + e.getMessage());
                }
                Log.d(TAG, "Callback made and picture taken!");
            }
        };

        // Create our Preview view and set it as the content of our activity.
        mPreview = new CameraPreview(this, mCamera);
        FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
        preview.addView(mPreview);
        Log.d(TAG, "Preview made!");

        mCamera.startPreview();

        // have a delay so the camera can set up
        try {
            Thread.sleep(1000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }

        mCamera.takePicture(null, null, mPicture);
        setIntent(getOutputMediaFileUri(MEDIA_TYPE_IMAGE));
        releaseCamera();

    }


    public void setIntent(Uri photoURI){
        resultIntent = new Intent();
        resultIntent.setData(photoURI);
        setResult(Activity.RESULT_OK, resultIntent);
        finish();
    }

    public static Camera getCameraInstance(){
        Camera c = null;
        try {
            c = Camera.open(); // attempt to get a Camera instance
        }
        catch (Exception e){
            // Camera is not available (in use or does not exist)
        }

        // Parameters needed for Google Glass
        c.setDisplayOrientation(0);
        Camera.Parameters params = c.getParameters();
        params.setPreviewFpsRange(30000, 30000);
        params.setJpegQuality(90);
        // hard-coding is bad, but I'm a bit lazy
        params.setPictureSize(640, 480);
        params.setPreviewSize(640, 480);
        c.setParameters(params);

        return c; // returns null if camera is unavailable
    }


    @Override
    protected void onPause() {
        super.onPause();
        releaseCamera();              // release the camera immediately on pause event
    }

    private void releaseCamera(){
        if (mCamera != null){
            mCamera.release();        // release the camera for other applications
            mCamera = null;
        }
    }

    /** Create a file Uri for saving an image or video */
    private static Uri getOutputMediaFileUri(int type){
        return Uri.fromFile(getOutputMediaFile(type));
    }

    /** Create a File for saving an image or video */
    private static File getOutputMediaFile(int type){
        // To be safe, you should check that the SDCard is mounted
        // using Environment.getExternalStorageState() before doing this.

        File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(
                Environment.DIRECTORY_PICTURES), "MyCameraApp");
        // This location works best if you want the created images to be shared
        // between applications and persist after your app has been uninstalled.

        // Create the storage directory if it does not exist
        if (! mediaStorageDir.exists()){
            if (! mediaStorageDir.mkdirs()){
                Log.d("MyCameraApp", "failed to create directory");
                return null;
            }
        }

        // Create a media file name
        String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
        File mediaFile;
        if (type == MEDIA_TYPE_IMAGE){
            mediaFile = new File(mediaStorageDir.getPath() + File.separator +
                    "IMG_"+ timeStamp + ".jpg");
        } else if(type == MEDIA_TYPE_VIDEO) {
            mediaFile = new File(mediaStorageDir.getPath() + File.separator +
                    "VID_"+ timeStamp + ".mp4");
        } else {
            return null;
        }

        return mediaFile;
    }
}

I call it in the main Activity like this:

    Intent intent = new Intent(this, CameraActivity.class);
    startActivityForResult(intent, CAPTURE_IMAGE_ACTIVITY_REQUEST_CODE);

I already have all the necessarily permissions in my AndroidManifest:

<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="com.google.android.glass.permission.DEVELOPMENT" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-feature android:name="android.hardware.camera" android:required="true"/>

Here's the error log:

10-30 18:00:58.599  11361-11361/com.example.cerveau.recognizeplaces D/OpenGLRenderer﹕ Enabling debug mode 0
10-30 18:00:58.833  11361-11361/com.example.cerveau.recognizeplaces D/CameraActivity﹕ Preview made!
10-30 18:01:08.654  11361-11361/com.example.cerveau.recognizeplaces I/Choreographer﹕ Skipped 601 frames!  The application may be doing too much work on its main thread.
10-30 18:01:08.677  11361-11361/com.example.cerveau.recognizeplaces I/RecogPlaces﹕ Got to onActivity
10-30 18:01:08.677  11361-11361/com.example.cerveau.recognizeplaces I/RecogPlaces﹕ Request code: 100, Result code: -1, what it wants: -1
10-30 18:01:08.677  11361-11361/com.example.cerveau.recognizeplaces I/RecogPlaces﹕ Got inside the IF
10-30 18:01:08.685  11361-11361/com.example.cerveau.recognizeplaces D/AndroidRuntime﹕ Shutting down VM
10-30 18:01:08.685  11361-11361/com.example.cerveau.recognizeplaces W/dalvikvm﹕ threadid=1: thread exiting with uncaught exception (group=0x41600bd8)
10-30 18:01:08.685  11361-11361/com.example.cerveau.recognizeplaces E/AndroidRuntime﹕ FATAL EXCEPTION: main
    Process: com.example.cerveau.recognizeplaces, PID: 11361
    java.lang.RuntimeException: Failure delivering result ResultInfo{who=null, request=100, result=-1, data=Intent { dat=file:///storage/emulated/0/Pictures/MyCameraApp/IMG_20141030_180059.jpg }} to activity {com.example.cerveau.recognizeplaces/com.example.cerveau.recognizeplaces.LiveCardMenuActivity}: java.lang.NullPointerException
            at android.app.ActivityThread.deliverResults(ActivityThread.java:3391)
            at android.app.ActivityThread.handleSendResult(ActivityThread.java:3434)
            at android.app.ActivityThread.access$1300(ActivityThread.java:138)
            at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1284)
            at android.os.Handler.dispatchMessage(Handler.java:102)
            at android.os.Looper.loop(Looper.java:149)
            at android.app.ActivityThread.main(ActivityThread.java:5045)
            at java.lang.reflect.Method.invokeNative(Native Method)
            at java.lang.reflect.Method.invoke(Method.java:515)
            at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:786)
            at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:602)
            at dalvik.system.NativeStart.main(Native Method)
     Caused by: java.lang.NullPointerException
            at java.io.File.fixSlashes(File.java:185)
            at java.io.File.<init>(File.java:134)
            at com.example.cerveau.recognizeplaces.LiveCardMenuActivity.processPictureWhenReady(LiveCardMenuActivity.java:166)
            at com.example.cerveau.recognizeplaces.LiveCardMenuActivity.onActivityResult(LiveCardMenuActivity.java:157)
            at android.app.Activity.dispatchActivityResult(Activity.java:5430)
            at android.app.ActivityThread.deliverResults(ActivityThread.java:3387)
            at android.app.ActivityThread.handleSendResult(ActivityThread.java:3434)
            at android.app.ActivityThread.access$1300(ActivityThread.java:138)
            at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1284)
            at android.os.Handler.dispatchMessage(Handler.java:102)
            at android.os.Looper.loop(Looper.java:149)
            at android.app.ActivityThread.main(ActivityThread.java:5045)
            at java.lang.reflect.Method.invokeNative(Native Method)
            at java.lang.reflect.Method.invoke(Method.java:515)
            at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:786)
            at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:602)
            at dalvik.system.NativeStart.main(Native Method)
10-30 18:01:09.130  11361-11361/com.example.cerveau.recognizeplaces I/Process﹕ Sending signal. PID: 11361 SIG: 9

Thanks in advance!

choupettes

1 Answer


Your processPictureWhenReady() method in the LiveCardMenuActivity class is where the NullPointerException surfaces, but the real problem is how CameraActivity produces its result.

First, this is a very bad thing:

Thread.sleep(1000);

Never do this on the UI thread; it freezes rendering and event delivery for the whole second (note the "Skipped 601 frames" warning in your log).
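(If you ever do need a camera warm-up delay, the non-blocking way is Handler.postDelayed rather than Thread.sleep; a sketch, the 1-second delay is arbitrary:)

```java
// Schedules takePicture() on the main thread's message queue one second
// from now, without freezing the UI thread in the meantime.
new Handler().postDelayed(new Runnable() {
    @Override
    public void run() {
        mCamera.takePicture(null, null, mPicture);
    }
}, 1000);
```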
You should directly call

mCamera.startPreview();
mCamera.takePicture(null, null, mPicture);

And at the end of the onPictureTaken() callback:

setIntent(Uri.fromFile(pictureFile)); // your setIntent() takes a Uri, not a String path
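Putting the two suggestions together, the end of onCreate() and the callback might look like this (a sketch, untested on Glass; it reuses your existing setIntent(), getOutputMediaFile(), and releaseCamera() helpers):

```java
// takePicture() is asynchronous: it returns immediately and delivers the
// JPEG later via onPictureTaken(). Anything that depends on the picture
// (saving it, setResult()/finish(), releasing the camera) must therefore
// happen inside the callback, not right after takePicture() returns.
mPicture = new PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        File pictureFile = getOutputMediaFile(MEDIA_TYPE_IMAGE);
        if (pictureFile == null) {
            Log.d(TAG, "Error creating media file, check storage permissions");
            setResult(Activity.RESULT_CANCELED);
            finish();
            return;
        }
        try {
            FileOutputStream fos = new FileOutputStream(pictureFile);
            fos.write(data);
            fos.close();
        } catch (IOException e) {
            Log.d(TAG, "Error accessing file: " + e.getMessage());
        }
        releaseCamera();
        setIntent(Uri.fromFile(pictureFile)); // calls setResult() and finish()
    }
};

mCamera.startPreview();
mCamera.takePicture(null, null, mPicture); // no Thread.sleep(), no code after this
```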
Simon Marquis
  • Thanks for your suggestions... I tried them and nothing is working! I can't seem to find a Camera API implementation for me on Google Glass that will work... currently trying this one : http://stackoverflow.com/questions/23073180/glass-slow-camera-fileobserver-notification-xe12-using-action-image-capt But the camera preview won't come up and it won't take a picture. Am I possibly doing something wrong on the MainActivity end? I'm calling it as an Intent and then I startActivityForResult. Also this uses a SurfaceView and mine is just showing up as a big cloud with a frowny face on my app... – choupettes Nov 27 '14 at 06:18
  • Have you looked at the Glass developer guides? https://developers.google.com/glass/develop/gdk/camera I've used this to take pictures with the glass device. – Kurt Mueller Jan 11 '15 at 00:56
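The Glass camera guide linked in the comment above addresses exactly this timing problem: the picture file from ACTION_IMAGE_CAPTURE may not be fully written when onActivityResult() runs, so the guide watches the parent directory with a FileObserver. Roughly (a sketch based on that guide, so details may differ from the current docs; this would live inside your Activity):

```java
// Process the picture only once the camera service has finished writing it.
// If the file doesn't exist yet, watch its directory and retry on the UI
// thread when the file is closed after writing (or moved into place).
private void processPictureWhenReady(final String picturePath) {
    final File pictureFile = new File(picturePath);
    if (pictureFile.exists()) {
        // The picture is ready; process it here.
    } else {
        final File parentDirectory = pictureFile.getParentFile();
        FileObserver observer = new FileObserver(parentDirectory.getPath(),
                FileObserver.CLOSE_WRITE | FileObserver.MOVED_TO) {
            private boolean isFileWritten;

            @Override
            public void onEvent(int event, String path) {
                if (!isFileWritten) {
                    File affectedFile = new File(parentDirectory, path);
                    isFileWritten = affectedFile.equals(pictureFile);
                    if (isFileWritten) {
                        stopWatching();
                        // FileObserver events arrive on a background thread.
                        runOnUiThread(new Runnable() {
                            @Override
                            public void run() {
                                processPictureWhenReady(picturePath);
                            }
                        });
                    }
                }
            }
        };
        observer.startWatching();
    }
}
```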