
I am working on simultaneous camera streaming and recording using the MediaCodec API. I want to merge the frames from both cameras and pass the result both to the display for rendering and to MediaCodec (via its input surface) for recording. I do not want to create multiple EGLContexts; the same one should be used throughout. I am using the Bigflake MediaCodec examples as a reference, but I am not clear whether this is possible. Also, how do I bind multiple textures, since we need two textures for the two cameras? Your valuable input will help me progress further. Currently I am stuck and not sure what to do next.

regards Nehal

  • You shouldn't need multiple EGL contexts. I think you just need two SurfaceTextures with different texture names, and then either render in two steps (bind texture, draw, bind other texture, draw) or use multitexturing. I'm not sufficiently familiar with Camera to know how well recording both cameras at once will work. – fadden Dec 04 '13 at 20:26
  • The "continuous capture" activity in Grafika (https://github.com/google/grafika) demonstrates simultaneous display and recording with a single EGL context. Opening front/back cameras simultaneously appears to be possible only on specific devices. – fadden May 20 '14 at 20:03
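
The two-step rendering approach from the first comment could be sketched roughly as below. This is a sketch under assumptions, not a tested implementation: `DualCameraRenderer`, `drawQuad()`, and the side-by-side layout are hypothetical, and the per-camera `setPreviewTexture` hookup is only indicated in a comment. It assumes one shared EGL context is current on the calling thread, as the question requires.

```java
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

public class DualCameraRenderer {
    private final int[] mTextureIds = new int[2];          // one GL texture name per camera
    private final SurfaceTexture[] mSurfaceTextures = new SurfaceTexture[2];

    // Call with the single shared EGL context current.
    public void createTextures() {
        GLES20.glGenTextures(2, mTextureIds, 0);           // two distinct texture names
        for (int i = 0; i < 2; i++) {
            GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureIds[i]);
            GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                    GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                    GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
            mSurfaceTextures[i] = new SurfaceTexture(mTextureIds[i]);
            // camera[i].setPreviewTexture(mSurfaceTextures[i]);  // per-camera hookup
        }
    }

    // One frame: latch both camera frames, then draw each texture into its
    // half of whichever EGL surface is current (display or encoder input).
    public void drawFrame(int width, int height) {
        for (int i = 0; i < 2; i++) {
            mSurfaceTextures[i].updateTexImage();          // latch latest camera frame
        }
        for (int i = 0; i < 2; i++) {
            GLES20.glViewport(i * width / 2, 0, width / 2, height); // side by side
            GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureIds[i]);
            drawQuad();  // hypothetical helper: full-viewport quad with an OES shader
        }
    }

    private void drawQuad() { /* hypothetical: textured-quad draw call */ }
}
```

The same composition can be recorded without a second EGLContext by making the MediaCodec input surface's EGLSurface current, calling `drawFrame()` again, and swapping buffers, which is the pattern Grafika's "continuous capture" activity demonstrates for a single camera.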

0 Answers