
My app manages the camera preview using the camera2 API. The problem is that my device is a Nexus 5X, which has the flipped sensor and the well-known reverse-landscape "issue". I have read that the camera2 API handles this automatically, but I believe that is only true as long as you target the surface of a SurfaceView when setting up your capture session. Instead, I am targeting a Surface built on top of a SurfaceTexture that I then use to render the preview as a stereoscopic view, and with this approach the problem persists: I get upside-down frames. Here is the code, which follows the conventional camera2 workflow.

private void openCamera() {
    CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
    Log.e(TAG, "is camera open");
    try {
        cameraId = manager.getCameraIdList()[CAMERA_SOURCE];
        CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
        StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
        assert map != null;
        imageDimension = map.getOutputSizes(SurfaceTexture.class)[CAMERA_SOURCE];
        // Add permission for camera and let user grant the permission
        if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED || ActivityCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions(MainActivity.this, new String[]{Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE}, REQUEST_CAMERA_PERMISSION);
            return;
        }
        manager.openCamera(cameraId, stateCallback, null);

    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
    Log.e(TAG, "openCamera X");
}

private final CameraDevice.StateCallback stateCallback = new CameraDevice.StateCallback() {
    @Override
    public void onOpened(CameraDevice camera) {
        //This is called when the camera is open
        Log.e(TAG, "onOpened");
        cameraDevice = camera;
        createCameraPreview();
    }
    @Override
    public void onDisconnected(CameraDevice camera) {
        cameraDevice.close();
    }
    @Override
    public void onError(CameraDevice camera, int error) {
        cameraDevice.close();
        cameraDevice = null;
    }
};

protected void createCameraPreview() {
    try {

        // Create ImageReader Surface
        int max = 2;
        mReader = ImageReader.newInstance(mWidth, mHeight, ImageFormat.YUV_420_888, max);
        ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader mReader) {
                Image image = mReader.acquireLatestImage();
                if (image == null) {
                    return;
                }

                byte[] bytes = convertYUV420ToNV21(image);

                nativeVideoFrame(bytes);
                image.close();   
            }
        };      

        mReader.setOnImageAvailableListener(readerListener, mBackgroundHandler);

        // Create Texture Surface
        texture = createTexture();
        mSurfaceTexture = new SurfaceTexture(texture);
        mSurfaceTexture.setOnFrameAvailableListener(this);
        mSurfaceTexture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
        mSurface = new Surface(mSurfaceTexture);

        //Attach surfaces to CaptureRequest
        List<Surface> outputSurfaces = new ArrayList<Surface>(2);
        outputSurfaces.add(mReader.getSurface());
        outputSurfaces.add(mSurface);
        captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        captureRequestBuilder.addTarget(mSurface);
        captureRequestBuilder.addTarget(mReader.getSurface());

        //Define the capture request
        cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback(){
                    @Override
                    public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                        //The camera is already closed
                        if (null == cameraDevice) {
                            return;
                        }
                        // When the session is ready, we start displaying the preview.
                        cameraCaptureSessions = cameraCaptureSession;
                        updatePreview();
                    }
                    @Override
                    public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
                        Toast.makeText(MainActivity.this, "Configuration failed", Toast.LENGTH_SHORT).show();
                    }
                }, null);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}

protected void updatePreview() {
    if (null == cameraDevice) {
        Log.e(TAG, "updatePreview error, return");
        return;
    }
    captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
    try {
        cameraCaptureSessions.setRepeatingRequest(captureRequestBuilder.build(), null, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}

My question is: what should I do to fix the reverse-landscape problem myself? Which lines of code should I add, and where?

Thanks,

JM


1 Answer


Since you are using a SurfaceTexture and an ImageReader, you will have to handle the rotation yourself. The camera2 API handles rotation automatically only when used in conjunction with a SurfaceView or a TextureView.

That said, you can manually rotate the frame bytes once you receive them via the ImageReader.OnImageAvailableListener callback, or, better still, do it directly on the GPU by applying the transform to the OpenGL texture.
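For the CPU path, a minimal sketch of a 180° rotation of an NV21 buffer (the format your convertYUV420ToNV21 already produces) could be applied right before nativeVideoFrame(bytes). The helper name rotateNV21By180 is illustrative, not an existing API:

// Rotate an NV21 frame (Y plane followed by interleaved VU pairs) by 180°.
static byte[] rotateNV21By180(byte[] input, int width, int height) {
    byte[] output = new byte[input.length];
    int frameSize = width * height;
    // Reverse the Y plane: the last luma sample becomes the first.
    for (int i = 0; i < frameSize; i++) {
        output[i] = input[frameSize - 1 - i];
    }
    // Reverse the interleaved VU plane two bytes at a time so each
    // V/U pair keeps its V-then-U order.
    for (int i = 0; i < frameSize / 2; i += 2) {
        output[frameSize + i]     = input[input.length - 2 - i];
        output[frameSize + i + 1] = input[input.length - 1 - i];
    }
    return output;
}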

Note that a 180° rotation is equivalent to flipping the image once vertically and once horizontally, which in OpenGL means you can do any of the following (a UV sketch follows the list):

  • Rotate the plane that draws the camera texture by 180°
  • Scale the x and y of that plane by -1
  • Change the UVs of the camera plane
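As an illustration of the last option, mirroring the texture coordinates on both axes gives the 180° result. The vertex layout below is only a sketch; it depends on how your stereoscopic renderer builds its quad:

// Upright UVs for a quad drawn as a triangle strip
// (bottom-left, bottom-right, top-left, top-right).
float[] texCoordsUpright = {
        0f, 0f,
        1f, 0f,
        0f, 1f,
        1f, 1f,
};

// Flipping both axes is the 180° rotation: each corner now samples
// the diagonally opposite corner of the camera image.
float[] texCoordsRotated180 = {
        1f, 1f,
        0f, 1f,
        1f, 0f,
        0f, 0f,
};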
Josef Grunig
    With a SurfaceTexture, you need to use the value from https://developer.android.com/reference/android/graphics/SurfaceTexture.html#getTransformMatrix(float[]) to read from the GL texture; then you'll get the right orientation (this is how TextureView gets it right) with camera2. With ImageReader, you need to look at the camera device's sensor orientation https://developer.android.com/reference/android/hardware/camera2/CameraCharacteristics.html#SENSOR_ORIENTATION and access the pixels in the right order. – Eddy Talvala Dec 15 '17 at 22:56