I am implementing an app that uses real-time image processing on live images from the camera. It was working, with limitations, using the now-deprecated android.hardware.Camera; for improved flexibility and performance I'd like to use the new android.hardware.camera2 API. However, I'm having trouble getting the raw image data for processing. This is on a Samsung Galaxy S5. (Unfortunately, I don't have another Lollipop device handy to test on other hardware.)

I got the overall framework working (with inspiration from the 'HdrViewFinder' and 'Camera2Basic' samples), and the live image is drawn on the screen via a SurfaceTexture and a GLSurfaceView. However, I also need to access the image data (grayscale only is fine, at least for now) for custom image processing. According to the documentation for StreamConfigurationMap.isOutputSupportedFor(Class), the recommended surface for obtaining image data directly would be an ImageReader (correct?).

So I've set up my capture requests as:

mSurfaceTexture.setDefaultBufferSize(640, 480);
mSurface = new Surface(mSurfaceTexture);
...
mImageReader = ImageReader.newInstance(640, 480, format, 2);
...
List<Surface> surfaces = new ArrayList<>();
surfaces.add(mSurface);
surfaces.add(mImageReader.getSurface());
...
mCameraDevice.createCaptureSession(surfaces, mCameraSessionListener, mCameraHandler);

and in the onImageAvailable callback for the ImageReader, I'm accessing the data as follows:

Image img = reader.acquireLatestImage();
ByteBuffer grayscalePixelsDirectByteBuffer = img.getPlanes()[0].getBuffer();
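
A side note on reading that buffer: with YUV_420_888 the Y plane's row stride may be larger than the image width (padding at the end of each row), so copying the buffer back-to-back can shear the image. A minimal stride-aware copy, sketched here in plain Java, with the buffer and stride standing in for what img.getPlanes()[0].getBuffer() and getRowStride() would return on-device:

```java
import java.nio.ByteBuffer;

class YPlanePacker {
    // Copies a possibly row-padded luminance plane into a tightly packed
    // width*height byte array. On a device, 'buf' and 'rowStride' would come
    // from img.getPlanes()[0].getBuffer() and .getRowStride().
    static byte[] packYPlane(ByteBuffer buf, int width, int height, int rowStride) {
        byte[] out = new byte[width * height];
        for (int y = 0; y < height; y++) {
            buf.position(y * rowStride);      // skip any per-row padding
            buf.get(out, y * width, width);   // copy only the visible pixels
        }
        return out;
    }
}
```

(This doesn't explain the problem described below, but it rules out row stride as a cause.)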

...but while (as said) the live image preview is working, there's something wrong with the data I get here (or with the way I get it). According to

mCameraInfo.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputFormats();

...the following ImageFormats should be supported: NV21, JPEG, YV12, YUV_420_888. I've tried them all (plugged in for 'format' above); all of them support the chosen resolution according to getOutputSizes(format), but none gives the desired result:

  • NV21: ImageReader.newInstance throws java.lang.IllegalArgumentException: NV21 format is not supported
  • JPEG: This does work, but it doesn't seem to make sense for a real-time application to go through JPEG encode and decode for each frame...
  • YV12 and YUV_420_888: this is the weirdest result -- I can get the grayscale image, but it is flipped vertically (yes, flipped, not rotated!) and significantly squished (compressed considerably horizontally, but not vertically).

What am I missing here? What causes the image to be flipped and squished? How can I get a geometrically correct grayscale buffer? Should I be using a different type of surface (instead of ImageReader)?

Any hints appreciated.

Steffen G.
  • When you add two targets to handle raw frames, do you get frames continuously? I am upgrading my app to the camera2 API as well, but adding two output targets freezes the app: mPreviewRequestBuilder.addTarget(surface); mPreviewRequestBuilder.addTarget(mImageReader.getSurface()); – user1154390 Dec 19 '15 at 14:23

2 Answers


I found an explanation (though not necessarily a satisfactory solution): it turns out that the sensor array's aspect ratio is 16:9 (found via mCameraInfo.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE)).

At least when requesting YV12/YUV_420_888, the streamer appears not to crop the image in any way, but instead to scale it non-uniformly to reach the requested frame size. The images have the correct proportions when requesting a 16:9 format (of which there are, unfortunately, only two higher-res ones). This seems a bit odd to me -- it doesn't appear to happen when requesting JPEG, with the equivalent old camera API functions, or for stills; and I'm not sure what the non-uniformly scaled frames would be good for.

I feel that this is not a really satisfactory solution, because it means that you can't rely on the list of output formats: instead you have to find the sensor size first, find formats with the same aspect ratio, and then downsample the image yourself (as needed)...
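
That workflow -- query the active-array size, then pick a supported output size with the same aspect ratio -- can be sketched as follows. This is a plain-Java stand-in: on-device, the size list would come from getOutputSizes(ImageFormat.YUV_420_888) and the sensor rectangle from SENSOR_INFO_ACTIVE_ARRAY_SIZE.

```java
class SizePicker {
    // Picks the largest of the supported output sizes (given as
    // {width, height} pairs) whose aspect ratio exactly matches the
    // sensor's active array, to avoid the non-uniform scaling described
    // above. Returns null if no size matches.
    static int[] pickMatchingSize(int[][] sizes, int sensorW, int sensorH) {
        int[] best = null;
        for (int[] s : sizes) {
            // Compare ratios by cross-multiplication to avoid float rounding.
            if (s[0] * sensorH == s[1] * sensorW
                    && (best == null || s[0] * s[1] > best[0] * best[1])) {
                best = s;
            }
        }
        return best;
    }
}
```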

I don't know if this is the expected outcome here or a 'feature' of the S5. Comments or suggestions still welcome.

Steffen G.
  • Same thing happens on the Moto X 2014. I'm still looking at it, but I'm guessing this is the behavior for any device that uses the legacy mode of Camera2. – Tommy Visic Mar 30 '15 at 17:24
  • Good to know, thanks. Can anybody confirm that it _doesn't_ happen with, for example, a Nexus 6 ... ? – Steffen G. Apr 04 '15 at 15:30
  • It works fine on a Nexus 5, which has the "limited" support level for camera2. – Tommy Visic Apr 06 '15 at 16:00
  • Hello Steffen, could you please provide a code sample? We are trying very hard to make NV21 work. Did you manage to make it work? If not, how did you proceed? – Demian Flavius May 18 '15 at 07:58
  • @DemianFlavius No, I'm not using NV21; YUV_420_888 is working for me as described (when requesting an image size that matches the sensor's aspect ratio, and by manually flipping the image, i.e., reading the scanlines in reverse order)... – Steffen G. May 20 '15 at 01:42
  • I can confirm that the vertical flipping doesn't happen on a Nexus 6 running Marshmallow. – samgak Dec 08 '15 at 06:33
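
For reference, the manual vertical flip mentioned in the comments (reading the scanlines in reverse order) can be sketched like this for a tightly packed grayscale buffer:

```java
class VerticalFlip {
    // Flips a packed width*height grayscale buffer vertically by copying
    // scanlines in reverse order into a new buffer.
    static byte[] flipVertically(byte[] gray, int width, int height) {
        byte[] out = new byte[gray.length];
        for (int y = 0; y < height; y++) {
            System.arraycopy(gray, (height - 1 - y) * width, out, y * width, width);
        }
        return out;
    }
}
```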

I had the same problem and found a solution. The first part of the problem is setting the size of the surface buffer:

    // We configure the size of default buffer to be the size of camera preview we want.
    //texture.setDefaultBufferSize(width, height);

This is where the image gets skewed, not in the camera. You should comment that line out, and instead up-scale the image when displaying it:

            int[] rgba = new int[width * height];
            nativeLoader.convertImage(width, height, data, rgba); // raw frame -> RGBA

            Bitmap bmp = mBitmap;
            bmp.setPixels(rgba, 0, width, 0, 0, width, height);

            Canvas canvas = mTextureView.lockCanvas();
            if (canvas != null) {
                // Up-scale while drawing: the 320x240 source frame is
                // stretched to fill a larger destination rectangle.
                canvas.drawBitmap(bmp, new Rect(0, 0, 320, 240),
                        new Rect(0, 0, 640 * 2, 480 * 2), null);
                mTextureView.unlockCanvasAndPost(canvas);
            }

            image.close();

You can play around with the values to fine tune the solution for your problem.

Lyubomir Dinchev