
I'm writing an Android application in C++, and want to get a GraphicBuffer from a block of memory that contains a YUV420sp image. Specifically, I have an IMemory from the camera's dataCallbackTimestamp callback function that gives me a block of memory with a video frame's image, and I want to add it to a BufferQueue without doing a memcpy. I'm using the legacy camera HAL, and do not have HAL3 or Camera2 available on my client's hardware (either of which would make this trivial).

In particular, how do I create a zero-copy ANativeWindowBuffer or a GraphicBuffer out of a void*? I need to be able to map 30 such GraphicBuffers per second for 4K video. I've scoured the internet and examples, but cannot figure out how to do this without a memcpy (which kills my framerate).

I can handle pixel formats, etc., but just need help creating the actual GraphicBuffer from memory.

capitalr

2 Answers


The short answer is you can't. The memory behind a GraphicBuffer must be allocated by the OS based on the requirements of the hardware units (GPU, camera, video codec, display, etc.) that need to access it, must be securely shareable across processes, etc. A void* doesn't meet those requirements.
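
For contrast, here is roughly what the normal allocation path looks like: the GraphicBuffer constructor asks the gralloc allocator for memory that satisfies the requested usage flags, and there is no variant that adopts an existing pointer. This is the internal libui API, so treat it as a sketch only; headers and signatures drift between releases.

```cpp
#include <ui/GraphicBuffer.h>
#include <utils/Errors.h>
#include <system/graphics.h>

using namespace android;

sp<GraphicBuffer> allocateFrame(uint32_t w, uint32_t h) {
    // Usage flags describe which hardware units need access; gralloc picks a
    // backing store (and layout) that satisfies all of them.
    uint32_t usage = GraphicBuffer::USAGE_HW_TEXTURE |
                     GraphicBuffer::USAGE_SW_WRITE_RARELY;

    // Allocation happens inside the constructor, via the gralloc HAL.
    sp<GraphicBuffer> gb = new GraphicBuffer(
            w, h, HAL_PIXEL_FORMAT_YCrCb_420_SP, usage);
    if (gb->initCheck() != NO_ERROR) {
        return NULL;  // gralloc refused the size/format/usage combination
    }
    // CPU access (when the usage flags allow it) goes through lock()/unlock();
    // the backing memory itself is owned by gralloc, never by your void*.
    return gb;
}
```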

I'm not familiar with the camera APIs, but what you want to do is obtain a Surface from whatever system will consume the buffers, and provide that Surface to the camera APIs so they can produce frames into it. Underneath, this allocates a set of GraphicBuffers compatible with both the producer (camera) and the consumer, and streams frames through them with zero copies if the hardware is capable of that.
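
If you're staying in native code, the sketch below is roughly what that looks like with the (non-public) camera client as it stood around KitKat. Method names changed across releases (setPreviewTarget was setPreviewTexture before 4.4), and the camera id, package name, and where the IGraphicBufferProducer comes from (a SurfaceTexture's queue, a MediaCodec input surface, a SurfaceView) are placeholders here:

```cpp
#include <camera/Camera.h>
#include <gui/IGraphicBufferProducer.h>
#include <utils/Errors.h>
#include <utils/String16.h>

using namespace android;

status_t streamCameraInto(const sp<IGraphicBufferProducer>& producer) {
    // Open the camera through the camera service, not the raw HAL.
    sp<Camera> camera = Camera::connect(
            0 /* cameraId */,
            String16("com.example.client"),
            -1 /* ICameraService::USE_CALLING_UID */);
    if (camera == NULL) return NO_INIT;

    // Hand the consumer's producer interface to the camera. The camera
    // service and gralloc then allocate GraphicBuffers that both ends can
    // use, and frames flow through the BufferQueue without an app-side copy.
    status_t err = camera->setPreviewTarget(producer);
    if (err != NO_ERROR) return err;

    return camera->startPreview();
}
```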

Jesse Hall

Adding to what Jesse says, IMemory is a different kind of shared-memory buffer (ashmem-based) from GraphicBuffer, and the two are not directly compatible.

Also, by hooking directly into dataCallbackTimestamp, you're stepping outside the public camera API into implementation details. Those are not guaranteed to remain the same from release to release, so you risk breaking your application on future (or past) OS releases by relying on them.

Since you're using the deprecated camera API (well, the internals of it) and the old HAL, there's not much you can do here without a memcpy of some sort.

You can try passing a GPU-backed SurfaceTexture to the camera API as the preview target, and then drawing the 4K frames with EGL into a Surface obtained from a MediaCodec encoder (assuming you're trying to encode video, anyway; it's not clear where your GraphicBuffers are headed).
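
A rough sketch of the EGL half of that pipeline is below. It assumes you already have the encoder's input Surface as an ANativeWindow (for example via MediaCodec.createInputSurface() on the Java side plus ANativeWindow_fromSurface()); the SurfaceTexture setup and the GLES draw call (drawFrame here) are placeholders.

```cpp
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <android/native_window.h>

#ifndef EGL_RECORDABLE_ANDROID
#define EGL_RECORDABLE_ANDROID 0x3142  // EGL_ANDROID_recordable
#endif

// Create a GLES2 context and a window surface on the encoder's input Surface.
// Every eglSwapBuffers() then queues a GraphicBuffer to MediaCodec, zero-copy.
static EGLSurface makeEncoderSurface(EGLDisplay dpy, ANativeWindow* encoderWindow,
                                     EGLContext* outCtx) {
    const EGLint cfgAttribs[] = {
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
        EGL_RECORDABLE_ANDROID, 1,  // pick a config the video encoder accepts
        EGL_NONE
    };
    EGLConfig cfg;
    EGLint numCfg = 0;
    if (!eglChooseConfig(dpy, cfgAttribs, &cfg, 1, &numCfg) || numCfg < 1)
        return EGL_NO_SURFACE;

    const EGLint ctxAttribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    *outCtx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctxAttribs);

    return eglCreateWindowSurface(dpy, cfg, encoderWindow, NULL);
}

// Per frame, once the camera has produced into the SurfaceTexture:
//   surfaceTexture->updateTexImage();      // latch the frame as an OES texture
//   drawFrame(oesTextureId);               // your GLES2 blit of the 4K frame
//   eglSwapBuffers(dpy, encoderSurface);   // queue the frame to the encoder
```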

Eddy Talvala