
I'm trying to make an app that broadcasts video over the internet. Currently I am using the deprecated Camera API: I add a Camera.PreviewCallback to the Camera object and then send the byte array that arrives in the onPreviewFrame() method of Camera.PreviewCallback.

But now I want to try the new Camera2 API. I am looking at the Camera2Basic tutorial, and I think I need to create a CameraCaptureSession.CaptureCallback object to get the image byte array, something like what the tutorial does:

CameraCaptureSession.CaptureCallback CaptureCallback
                = new CameraCaptureSession.CaptureCallback() {

            @Override
            public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                           @NonNull CaptureRequest request,
                                           @NonNull TotalCaptureResult result) {
                showToast("Saved: " + mFile);
                Log.d(TAG, mFile.toString());
                unlockFocus();


            }
        };

And then add it to the CameraCaptureSession:

mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);

The problem is that I don't know how to retrieve each image's byte array from any of the parameters of onCaptureCompleted() in the CaptureCallback.

Any help?

svprdga
  • https://sites.google.com/site/averagelosercom/android/android-camera-api-v2-preview – Sree Oct 05 '15 at 11:44

2 Answers


You're partly right: you can't get the image data from the onCaptureCompleted() method. That callback only returns metadata about the exposure, for your own bookkeeping. The actual image data is sent to whatever Surface you specified in the exposure's CaptureRequest.
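To make that concrete, here is a minimal sketch of routing the image data to a readable Surface via an ImageReader. It assumes an already-open CameraDevice and follows the member names of the Camera2Basic sample (mCameraDevice, mBackgroundHandler); width, height, and readerListener are placeholders:

```java
// Sketch, not drop-in code: create a JPEG ImageReader whose Surface will
// receive the actual image bytes.
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, /* maxImages */ 2);
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);

CaptureRequest.Builder builder =
        mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
// This is the key step: the image data goes to this Surface,
// not to onCaptureCompleted().
builder.addTarget(reader.getSurface());
```

The readerListener's onImageAvailable() is then where you pull the bytes out, which is what the accepted approach below does.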

rcsumner
  • This is not a specific code issue, but rather an understanding of the basic framework/system of camera2. It is best outlined here: https://developer.android.com/reference/android/hardware/camera2/package-summary.html When it comes to an actual Surface where the image data will appear, you will probably want to use the Surface of a MediaCodec or MediaRecorder, and I am unfamiliar with those and so cannot provide code examples. – rcsumner Oct 07 '15 at 17:53
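Picking up on that comment, a rough, hypothetical sketch (untested, all variable names mine) of wiring a MediaCodec input Surface into the preview request could look like this:

```java
// Sketch: configure an H.264 encoder whose input Surface can be added as a
// target of the repeating camera request; encoded frames are then drained
// from the codec's output buffers for streaming.
MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// createInputSurface() must be called after configure() and before start().
Surface encoderSurface = encoder.createInputSurface();
encoder.start();

previewRequestBuilder.addTarget(encoderSurface);
```

This encodes a continuous video stream rather than per-frame JPEGs, which is generally the better fit for broadcasting.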

In the end I worked out how to do what I wanted. Starting from the Camera2Basic tutorial, I made the following changes to the Camera2BasicFragment class:

  1. Modify the captureStillPicture() method to remove things I determined were unnecessary for my broadcasting needs, and make sure this method does not stop the repeating preview:

    private void captureStillPicture() {
        try {
            final Activity activity = getActivity();
            if (null == activity || null == mCameraDevice) {
                return;
            }

            final CaptureRequest.Builder captureBuilder =
                    mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
            captureBuilder.addTarget(mImageReader.getSurface());

            CameraCaptureSession.CaptureCallback captureCallback
                    = new CameraCaptureSession.CaptureCallback() {

                @Override
                public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                               @NonNull CaptureRequest request,
                                               @NonNull TotalCaptureResult result) {
                }
            };

            mCaptureSession.capture(captureBuilder.build(), captureCallback, null);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }
    
  2. In the createCameraPreviewSession() method, disable the automatic flash:

    // When the session is ready, we start displaying the preview.
    mCaptureSession = cameraCaptureSession;
    try {
        // Auto focus should be continuous for camera preview.
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
        // Flash would be enabled automatically when necessary; disabled here.
        // mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE,
        //         CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);

        // Finally, we start displaying the camera preview.
        mPreviewRequest = mPreviewRequestBuilder.build();
        mCaptureSession.setRepeatingRequest(mPreviewRequest,
                mCaptureCallback, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
    
  3. I created a boolean to track whether an image is currently being processed, so that not every frame the camera captures gets queued, and another boolean to track whether a frame is currently being sent over the network:

    // True while a captured frame is being processed.
    private boolean mWorking = false;
    // True while a frame is being sent over the network.
    private boolean mNetworkWorking = false;
    
  4. Modify the CaptureCallback object so that it runs the captureStillPicture() method on every frame, but only if no other frame is being processed at that moment:

    case STATE_PREVIEW: {
        if (!mWorking) {
            Log.d(TAG, "capturing..");
            mWorking = true;

            mBackgroundHandler.post(new Runnable() {
                @Override
                public void run() {
                    captureStillPicture();
                }
            });
        } else {
            Log.d(TAG, "thread working, doing nothing");
        }
        break;
    }
  5. Finally, read the frame and send it; I achieved this by modifying the OnImageAvailableListener object:

    private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
            = new ImageReader.OnImageAvailableListener() {

        @Override
        public void onImageAvailable(final ImageReader reader) {
            // Process the image: copy the JPEG bytes out before closing it.
            Image image = reader.acquireNextImage();
            ByteBuffer buffer = image.getPlanes()[0].getBuffer();
            final byte[] bytes = new byte[buffer.remaining()];
            buffer.get(bytes);
            image.close();

            if (!mNetworkWorking) {
                Thread thread = new Thread() {
                    @Override
                    public void run() {
                        mNetworkWorking = true;

                        HttpResponse response = null;
                        HttpClient client = new DefaultHttpClient();
                        HttpPost post = new HttpPost(mBroadcastUrl);
                        post.setEntity(new ByteArrayEntity(bytes));

                        try {
                            response = client.execute(post);
                        } catch (ClientProtocolException e) {
                            if (BuildConfig.LOCAL_LOG)
                                Log.w(TAG, "ClientProtocolException: " + e.getMessage());
                        } catch (IOException e) {
                            if (BuildConfig.LOCAL_LOG)
                                Log.w(TAG, "IOException: " + e.getMessage());
                        }

                        mNetworkWorking = false;
                    }
                };

                thread.setName("networkThread");
                thread.setPriority(Thread.MAX_PRIORITY);
                thread.start();
            }

            mWorking = false;
        }
    };
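One caveat about steps 3-5: mWorking and mNetworkWorking are plain booleans touched from several threads (the capture callback, the background handler, and the network thread), so their updates are not guaranteed to be visible across threads. A hedged alternative sketch, using java.util.concurrent (the FrameGate class and its method names are my own, for illustration only):

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Hypothetical helper, not part of the Camera2Basic sample: an AtomicBoolean
// gate that lets exactly one frame be in flight at a time.
public class FrameGate {
    private final AtomicBoolean busy = new AtomicBoolean(false);

    // Returns true if the caller claimed the gate and may process a frame;
    // false if another frame is still being processed.
    public boolean tryBegin() {
        return busy.compareAndSet(false, true);
    }

    // Releases the gate once processing (or sending) finishes.
    public void end() {
        busy.set(false);
    }
}
```

With something like this, the STATE_PREVIEW branch would call tryBegin() instead of testing and setting mWorking in two separate steps, and onImageAvailable() would call end() when done.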
    

That's all.
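A side note on the upload: DefaultHttpClient was deprecated in API 22 and removed in API 23, so on newer Android versions the same POST can be done with HttpURLConnection from the standard library. A sketch under that assumption (the FrameUploader class and postFrame method are names I made up for illustration):

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Hypothetical replacement for the DefaultHttpClient upload above.
public class FrameUploader {
    // POSTs one JPEG frame to the given URL and returns the HTTP status code.
    public static int postFrame(String broadcastUrl, byte[] jpegBytes) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(broadcastUrl).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        // Known length up front lets the connection stream without buffering.
        conn.setFixedLengthStreamingMode(jpegBytes.length);
        conn.setRequestProperty("Content-Type", "image/jpeg");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(jpegBytes);
        }
        int code = conn.getResponseCode();
        conn.disconnect();
        return code;
    }
}
```

The network thread in step 5 would then just call FrameUploader.postFrame(mBroadcastUrl, bytes) inside its try/catch.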

svprdga