
I want to work with the byte array of each incoming frame, basically tapping into the YUV data received from the camera sensor for each frame and doing some processing on it. I'm new to Java/Android and learning as I go, so some of my questions are pretty basic, but I couldn't find any answers that suit my needs.

Q1: How do I get the byte array of each frame received by the camera sensor? (How do I save the YUV byte stream for further use?)
Q2: How do I arrange for each newly received frame to deliver a new data array for processing?
Q3: Do I have to set up a preview to do that, or can I tap straight into a buffer holding the raw data from the open camera?
Q4: Will a preview slow down the process of receiving new frames?

Some further explanation, if needed: the idea is to create one-way communication between a flickering LED light and a smartphone. By pointing the phone's camera at the LED, a real-time process will register the slight changes and decode them back to the originally sent data. To do so, I plan to receive the YUV data for each frame, strip it down to the Y component, and decide for each frame whether the light is on or off.
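The per-frame on/off decision described above can be sketched in plain Java. This is only an illustration: the class name, the fixed threshold of 128, and the assumption that the frame arrives as an NV21 byte array (Y plane first) are mine, and a real app would calibrate the threshold against ambient light.

```java
public class LedDecoder {
    // Hypothetical fixed threshold; calibrate against ambient light in practice.
    static final int THRESHOLD = 128;

    // Average the Y (luminance) plane and decide whether the LED is on.
    // In NV21, the first width*height bytes of the frame are the Y plane.
    static boolean isLedOn(byte[] frame, int width, int height) {
        long sum = 0;
        int pixels = width * height;
        for (int i = 0; i < pixels; i++) {
            sum += frame[i] & 0xFF; // bytes are signed in Java; mask to 0..255
        }
        return (sum / pixels) > THRESHOLD;
    }
}
```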

Alex Cohn
Voly

2 Answers


Yes, that's the Camera API. API level 21 (Android 5.0) and newer support the camera2 API, which can give you faster response, but this depends on the device. I would still recommend the deprecated older API if your goal is maximum reach.

Usually, the Android camera produces the NV21 format, from which it is very easy to extract the 8bpp luminance plane.
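Extracting the luminance is cheap because of how NV21 is laid out: the first width*height bytes are the Y plane, followed by width*height/2 bytes of interleaved V/U chroma. A minimal sketch (the class and method names are mine):

```java
import java.util.Arrays;

public class Nv21 {
    // NV21 layout: width*height bytes of Y (8bpp luminance),
    // followed by width*height/2 bytes of interleaved V/U chroma.
    // The Y plane is therefore just a prefix copy of the frame.
    static byte[] extractLuminance(byte[] nv21, int width, int height) {
        return Arrays.copyOf(nv21, width * height);
    }
}
```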

Android requires a live preview if you want to capture camera frames. There are quite a few workarounds to keep the preview hidden from the end user, but this is not supported, and any such trick may fail on the next device or on the next system upgrade. But don't worry: a live preview does not delay your processing at all, because it is handled in a separate hardware channel.

All in all, you can expect to receive 30 fps on an average device when you use Camera.setPreviewCallbackWithBuffer() and do everything correctly. High-end devices with a full camera2 implementation may deliver higher frame rates. Samsung has published its own camera SDK; use it if you need special features of Samsung devices.
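setPreviewCallbackWithBuffer() avoids per-frame allocation, but you must hand the camera correctly sized buffers: NV21 uses 12 bits per pixel (8 for Y plus 4 shared chroma bits), so each buffer needs width*height*3/2 bytes. A sketch of the sizing, with the Android wiring shown only in comments since it cannot run off-device (class and method names are mine):

```java
public class PreviewBuffers {
    // NV21 is 12 bits per pixel, so each preview buffer
    // needs width * height * 3 / 2 bytes.
    static int nv21BufferSize(int width, int height) {
        return width * height * 3 / 2;
    }

    // Typical wiring on Android (illustrative, not runnable here):
    //   byte[] buffer = new byte[nv21BufferSize(w, h)];
    //   camera.addCallbackBuffer(buffer);
    //   camera.setPreviewCallbackWithBuffer(callback);
    //   // Inside onPreviewFrame: process "data", then return it with
    //   // camera.addCallbackBuffer(data) so the camera can reuse it.
}
```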

On a multi-core device, you can offload image processing to a thread pool, but the frame rate will still most likely be limited by the camera hardware.
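One way to do that offloading is an ExecutorService sized to the core count. The key detail is copying the frame before handing it off, because with the buffered callback the camera reuses the same byte array. This sketch (all names mine) computes the mean luminance on a worker thread:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class FramePipeline {
    // One worker per core; on a real device, also consider dropping
    // frames when the pool is saturated to keep latency low.
    private final ExecutorService pool =
            Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());

    // Copy the frame (the camera reuses its buffer) and compute the
    // mean luminance of the Y plane on a worker thread.
    Future<Integer> submit(byte[] frame, int width, int height) {
        final byte[] copy = frame.clone();
        final int pixels = width * height;
        return pool.submit(() -> {
            long sum = 0;
            for (int i = 0; i < pixels; i++) sum += copy[i] & 0xFF;
            return (int) (sum / pixels);
        });
    }

    void shutdown() {
        pool.shutdown();
    }
}
```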

Note that you can perform some limited image processing on the GPU by applying shaders to the texture acquired from the camera.

Alex Cohn

Assuming you have done the basics and have a camera preview and a Camera object, you can create a callback:

Camera.PreviewCallback callback = new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        //Do your processing here... Use the byte[] called "data"
    }
};

And then:

mCamera.setPreviewCallback(callback);

If Camera.Parameters.setPreviewFormat() is never called, the default preview format is NV21.

Endre Börcsök