
I am using MediaCodec to encode frames coming from the camera, and rendering them with a GLSurfaceView.

My onDrawFrame looks like this:

public void onDrawFrame(GL10 unused) {
    float[] mtx = new float[16];
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    surface.updateTexImage();
    surface.getTransformMatrix(mtx);

    // First draw: render the video frame to the on-screen surface.
    mDirectVideo.draw(surface);
    saveRenderState();

    // Second draw: switch to the encoder's input surface and render again.
    delegate.mInputSurface.makeCurrent();
    mDirectVideo.draw(surface);
    delegate.swapBuffers();
    restoreRenderState();
}

So here we are calling draw(surface) two times, which renders to a surface twice. This adds overhead to the system. Is there any way I can do the draw once only? Running the shaders two times is a costly operation. Is there any way we can share the surface between the renderer and the encoder?


1 Answer


If your draw() function is expensive -- you're rendering a complex scene in addition to blitting the video frame -- you can render to a texture using an FBO, and then just blit that texture twice. If your draw() is primarily just the video texture blit, then you can't make it any faster.
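The render-to-FBO approach could be sketched roughly as follows. This is a minimal, untested outline assuming a GLES 2.0 context is current; the helpers `drawScene()` and `blitTexture()`, and the fields `delegate`, `mInputSurface`, `saveRenderState()`, and `restoreRenderState()` are placeholders standing in for your own rendering and EGL-management code:

```java
// Sketch: render the expensive scene once into an FBO-backed texture,
// then blit that texture twice (display + encoder surface).
private int mFramebuffer;
private int mOffscreenTexture;

private void prepareFramebuffer(int width, int height) {
    int[] values = new int[1];

    // Create the color texture the scene will be rendered into.
    GLES20.glGenTextures(1, values, 0);
    mOffscreenTexture = values[0];
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, mOffscreenTexture);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height,
            0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
            GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
            GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

    // Create the FBO and attach the texture as its color buffer.
    GLES20.glGenFramebuffers(1, values, 0);
    mFramebuffer = values[0];
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, mFramebuffer);
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER,
            GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D,
            mOffscreenTexture, 0);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
}

public void onDrawFrame(GL10 unused) {
    // Render the (expensive) scene once, into the offscreen texture.
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, mFramebuffer);
    drawScene();  // hypothetical: your complex rendering
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);

    // Cheap blit #1: offscreen texture -> display surface.
    blitTexture(mOffscreenTexture);  // hypothetical: simple textured quad

    // Cheap blit #2: offscreen texture -> encoder input surface.
    saveRenderState();
    delegate.mInputSurface.makeCurrent();
    blitTexture(mOffscreenTexture);
    delegate.swapBuffers();
    restoreRenderState();
}
```

The win only materializes when `drawScene()` costs noticeably more than a textured-quad blit; otherwise you've just added an extra copy.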

The bottom line is that you're rendering to two different surfaces, and there's currently (Android 4.4) no way to send the same buffer to two different consumers. The hardware on recent devices should have no trouble keeping up.

(Rendering the screen and the encoded video from the same draw call is somewhat limiting anyway, unless you're recording the screen and so want the display size and video size to be exactly the same. It's usually convenient to have the on-screen display fit into the UI, while the encoded video matches what's coming out of the camera.)

BTW, watch out for this issue.

Update: Grafika now includes an example of drawing + recording using both methods (draw twice, draw to FBO and blit). See RecordFBOActivity.

  • Thanks fadden... My draw code is the same as the drawFrame of the Bigflake MediaCodec example. I am not clear on how to use an FBO but I will try to figure it out... If you can provide some pointers it would be great. – Nehal Shah Dec 12 '13 at 16:23
  • If your shaders and scene are that simple, there's no advantage to drawing off-screen. For the FBO approach you render to an off-screen texture, and then blit that texture twice; if your rendering is little more than the video frame blit, then you're not gaining much. There's a good example of FBO use (GLES 1.x) here: http://alvinalexander.com/java/jwarehouse/android-examples/samples/android-9/ApiDemos/src/com/example/android/apis/graphics/FrameBufferObjectActivity.java.shtml – fadden Dec 12 '13 at 18:48
  • Thanks fadden for the valuable input. I think it makes sense for me not to use an FBO, as it is not a complex frame. – Nehal Shah Jan 07 '14 at 14:48