
I am trying to modify the android-Camera2Basic code to capture a burst of pictures. However, I can't get the delay between pictures any faster than 200-300ms on my Nexus 5, running Lollipop 5.0.1.

I've tried a bunch of things, but this is the most basic. This is the only part of the Camera2Basic code that I've modified. My preview TextureView is only 50x50dp, but that shouldn't matter, right?

For what it's worth, this code only has delays of around 50-100ms on my Nexus 6, with Lollipop 5.1.

private void captureStillPicture() {
    try {
        List<CaptureRequest> captureList = new ArrayList<CaptureRequest>();
        mPreviewRequestBuilder.addTarget(mImageReader.getSurface());

        for (int i = 0; i < 10; i++) {
            captureList.add(mPreviewRequestBuilder.build());
        }

        mCaptureSession.stopRepeating();
        mCaptureSession.captureBurst(captureList, cameraCaptureCallback, null);
        mPreviewRequestBuilder.removeTarget(mImageReader.getSurface());
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}

CameraCaptureSession.CaptureCallback cameraCaptureCallback = new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request,
            TotalCaptureResult result) {
        Log.d("camera","saved");
        mPictureCounter++;
        if (mPictureCounter >= 10)
            unlockFocus();
    }
};
LittlePanda
acheroncaptain

2 Answers


The issue you are running into is an artifact of the image output format you have requested. The JPEG encoding process puts a large stall time on the camera pipeline, so there is a lot of downtime between when one exposure ends and the next begins while this encoding happens.

The quoted 30fps rate can be achieved by setting the ImageReader's output format to YUV, since that is a more "native" output for the camera. You would store the images in that format as they are captured, and then do the JPEG encoding afterwards, separate from the camera's inline processing.

For example, on the Nexus 5 the output stall time for JPEG encoding is 243ms, which you have been observing. For YUV_420_888 output, it is 0ms. Likewise, because of their large size, RAW_SENSOR encoding introduces a stall time of 200ms.

Note also that even if you remove the stall time obstacle by choosing a "faster" format, there is still a minimum frame time, depending on the output image size. But for a Nexus 5's full resolution output, this is 33ms, which is what you were expecting.
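To see how these two numbers combine, here is a rough model (an approximation, not the exact pipeline behavior): for a stalling output format, the effective per-frame interval is roughly the minimum frame duration plus the stall duration. The durations below are the Nexus 5 figures quoted above, hard-coded rather than queried from a device.

```java
public class BurstTiming {
    // Rough model: effective interval per frame for a given output
    // format is approximately minFrameDuration + stallDuration.
    // All durations in milliseconds.
    static long effectiveFrameIntervalMs(long minFrameMs, long stallMs) {
        return minFrameMs + stallMs;
    }

    public static void main(String[] args) {
        // Nexus 5 numbers from above: 33ms min frame duration,
        // 243ms JPEG stall, 0ms YUV_420_888 stall.
        long jpeg = effectiveFrameIntervalMs(33, 243); // ~276ms, matching the observed 200-300ms
        long yuv = effectiveFrameIntervalMs(33, 0);    // 33ms, i.e. ~30fps
        System.out.println("JPEG: " + jpeg + " ms, YUV_420_888: " + yuv + " ms");
    }
}
```

This is why switching the ImageReader to YUV_420_888 removes the bottleneck: the stall term drops to zero and only the 33ms minimum frame time remains.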

The relevant information is in the camera metadata's StreamConfigurationMap object. Check out its getOutputStallDuration(int format, Size size) and getOutputMinFrameDuration(int format, Size size) methods for confirmation.

rcsumner
    YUV_420_888 definitely did it. Thanks for the help! – acheroncaptain Mar 30 '15 at 17:28
  • Actually, JPEG format may be faster than YUV (or RAW) on some devices. This depends on the firmware implementation of JPEG encoder, e.g. [DM3730](https://e2e.ti.com/support/dsp/davinci_digital_media_processors/f/537/t/431683) supports jpeg format streaming. Snapdragon 810 can produce 16 megapixel JPEG burst shots at 15 FPS [proof](https://www.qualcomm.com/media/documents/files/whitepaper-breakthrough-mobile-imaging-experiences.pdf): no way to have enough bandwidth for RAW or YUV. – Alex Cohn Oct 01 '17 at 10:04

Try setting the following capture request parameters:

// Use the still-capture template as the starting point
requestBuilder = camDevice
        .createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);

// Disable per-frame processing stages that can slow down each capture
requestBuilder.set(CaptureRequest.EDGE_MODE,
        CaptureRequest.EDGE_MODE_OFF);
requestBuilder.set(
        CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE,
        CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE_ON);
requestBuilder.set(
        CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE,
        CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE_OFF);
requestBuilder.set(CaptureRequest.NOISE_REDUCTION_MODE,
        CaptureRequest.NOISE_REDUCTION_MODE_OFF);
requestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
        CaptureRequest.CONTROL_AF_TRIGGER_CANCEL);

// Lock auto-exposure and auto-white-balance so they are not
// re-evaluated between frames
requestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
requestBuilder.set(CaptureRequest.CONTROL_AWB_LOCK, true);

I am not sure how quickly callbacks arrive at CameraCaptureSession.CaptureCallback. It does not carry image data, and it may be called before or after ImageReader.OnImageAvailableListener. Instead, try measuring the time between ImageReader.OnImageAvailableListener calls. And don't forget to acquire and release the images: new images stop arriving once the reader's buffer is full of unreleased images. For example:

private class ImageAvailableListener implements
        ImageReader.OnImageAvailableListener {
    @Override
    public void onImageAvailable(ImageReader ir) {
        Log.i(TAG, "Time = " + System.currentTimeMillis());
        // Acquire and close each image so the reader's buffer frees up
        Image im = ir.acquireNextImage();
        im.close();
    }
}

ImageReader mImageReader = ImageReader.newInstance(imageReaderWidth,
        imageReaderHeight, ImageFormat.YUV_420_888, 2);
mImageReader.setOnImageAvailableListener(
        new ImageAvailableListener(), null);
Maxim Metelskiy
  • I should've mentioned that I was using JPEG and not YUV. That was main difference in our code (which I just noticed in the second half of your answer). So, my bad. I appreciate the help though. – acheroncaptain Mar 30 '15 at 17:33
  • WORKING! reduce 100 ms on galaxy s3 - cyanogenmod 12.1 (lollipop) – Lucas Apr 29 '16 at 14:26