Problem: I have a video stream from a GoPro camera in the .m3u8 (HLS) format. I need to display the content of the stream in my application and simultaneously stream the video out. For the outgoing stream I have a library that works with MediaCodec and ffmpeg; I should be able to feed it the output buffers of MediaCodec.

I can play the video with MediaPlayer on a SurfaceView, no problem. The real struggle comes with MediaCodec. I can initialize the MediaCodec, get its input surface and hand it to the MediaPlayer - the stream plays and I can hear the sound.

Now the questions:

  1. How do I display the stream on the SurfaceView - or any other view that MediaPlayer and MediaCodec can both work with? What is the best way to get data from a Surface onto a SurfaceView/TextureView?

  2. I am not able to get the data out of MediaCodec. I only need to support 5.0+ devices, so I tried to use MediaCodec.Callback, but I get an error every time and I haven't found what it means. What can I do to make the Callback work?

  3. Is there any "easy" way of doing this, or do I have to go deep into the buffers and surfaces as in Google's Grafika?

Here are the errors I get:

E/ACodec﹕ [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -2147483648

 1877-1904/com.example.marek.goprorecorder E/ACodec﹕ [OMX.qcom.video.encoder.avc] ERROR(0x80001009)
07-15 15:54:47.411    1877-1904/com.example.marek.goprorecorder E/ACodec﹕ signalError(omxError 0x80001009, internalError -2147483648)
07-15 15:54:47.411    1877-1903/com.example.marek.goprorecorder E/MediaCodec﹕ Codec reported err 0x80001009, actionCode 0, while in state 6

Errror mess: android.media.MediaCodec$CodecException: Error 0x80001009
07-15 15:54:47.454    1877-1877/com.example.marek.goprorecorder D/MediaCodecCallback﹕ Errror diag: android.media.MediaCodec.error_neg_2147479543
07-15 15:54:47.454    1877-1877/com.example.marek.goprorecorder D/MediaCodecCallback﹕ Errror rec: false
07-15 15:54:47.454    1877-1877/com.example.marek.goprorecorder D/MediaCodecCallback﹕ Errror tra: false

And here is my Activity:

public class MainActivity extends ActionBarActivity implements MediaPlayer.OnPreparedListener{
    private final MediaPlayer mp = new MediaPlayer();
    private Surface encoderSurface;
    private static final String MIME_TYPE = "video/avc";    // H.264 Advanced Video Coding
    private static final int FRAME_RATE = 30;               // 30fps
    private static final int IFRAME_INTERVAL = 0;           // 0 = every frame is a sync (I) frame
    private static final File OUTPUT_DIR = Environment.getExternalStorageDirectory();
    MediaCodec encoder = null;
    MediaCodec.BufferInfo mBufferInfo;
    TextureView surfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        int width = 432;
        int height = 240;
        prepareEncoder(width, height, 600000);
    }

    private void prepareEncoder(int width, int height, int bitRate) {
        mBufferInfo = new MediaCodec.BufferInfo();
        MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);

        try {
            encoder = MediaCodec.createEncoderByType(MIME_TYPE);
        } catch (IOException e) {
            // without an encoder there is nothing to do below
            throw new RuntimeException("failed to create encoder for " + MIME_TYPE, e);
        }
        CodecCallback callback = new CodecCallback();
        encoder.setCallback(callback);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        encoderSurface = encoder.createInputSurface();
        preparePlayer(encoderSurface);
        encoder.start();
    }

    void preparePlayer(Surface s) {
        try {
            //mp.setDataSource("http://10.5.5.9:8080/live/amba.m3u8");
            mp.setDataSource("http://playertest.longtailvideo.com/adaptive/oceans_aes/oceans_aes.m3u8");
            mp.setSurface(s);
            mp.setOnPreparedListener(this);
            mp.prepare();
        } catch (IllegalArgumentException | IllegalStateException | IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start();
    }

    private class CodecCallback extends MediaCodec.Callback {

        @Override
        public void onInputBufferAvailable(MediaCodec codec, int index) {
            Log.d("MediaCodecCallback","InputBufferAvailable "+index);
        }

        @Override
        public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
            ByteBuffer outBuffer = codec.getOutputBuffer(index);
            Log.d("MediaCodecCallback", "OutputBuffer Position: " + outBuffer.position());
            Log.d("MediaCodecCallback", "OutputBuffer Limit: " + outBuffer.limit());
            Log.d("MediaCodecCallback", "OutputBuffer Capacity: " + outBuffer.capacity());
            Log.d("MediaCodecCallback","OutputBuffer: "+outBuffer.toString());
            encoder.releaseOutputBuffer(index, false);
        }

        @Override
        public void onError(MediaCodec codec, MediaCodec.CodecException e) {
            Log.d("MediaCodecCallback","Errror mess: " + e);
            Log.d("MediaCodecCallback","Errror diag: " + e.getDiagnosticInfo());
            Log.d("MediaCodecCallback","Errror rec: " + e.isRecoverable());
            Log.d("MediaCodecCallback","Errror tra: " + e.isTransient());
        }

        @Override
        public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) {
            Log.d("MediaCodecCallback","OutputFormatChanged: " + format.toString());
        }
    }


}
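To sanity-check the encoder output before wiring in the ffmpeg library, I would first dump it to an MP4. This is only a sketch: MediaMuxer stands in for my streaming library, and the file name is made up:

// Sketch only: MediaMuxer in place of the ffmpeg library.
private MediaMuxer mMuxer;
private int mTrackIndex = -1;
private boolean mMuxerStarted;

// Call once from prepareEncoder(), before encoder.start().
private void prepareMuxer() throws IOException {
    mMuxer = new MediaMuxer(new File(OUTPUT_DIR, "test.mp4").toString(),
            MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
}

// Inside CodecCallback:
@Override
public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) {
    // The muxer must be given the format the encoder reports here,
    // not the one that was passed to configure().
    mTrackIndex = mMuxer.addTrack(format);
    mMuxer.start();
    mMuxerStarted = true;
}

@Override
public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
    ByteBuffer outBuffer = codec.getOutputBuffer(index);
    if (mMuxerStarted && (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
        // info.offset/info.size delimit the valid encoded data.
        outBuffer.position(info.offset);
        outBuffer.limit(info.offset + info.size);
        mMuxer.writeSampleData(mTrackIndex, outBuffer, info);
    }
    codec.releaseOutputBuffer(index, false);
}

(mMuxer.stop() and mMuxer.release() on end-of-stream are omitted for brevity.)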

Edit 1:

Is this the right way to attach the output to the SurfaceTexture?

@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
    Surface s = new Surface(surface);
    preparePlayer(s); // adds the Surface to MediaPlayer and starts playing the stream
}

And can I use the SurfaceTexture I get from onSurfaceTextureUpdated?

@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {

}
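For completeness, the two remaining callbacks that TextureView.SurfaceTextureListener requires; the comments are my assumptions about what they should do here:

@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
    // View size changed; nothing to do for this use case.
}

@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
    mp.release();  // stop feeding frames into a dead surface
    return true;   // true = the TextureView releases the SurfaceTexture itself
}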
  • another question: do you have an example of how to render the SurfaceTexture to a Surface? Thank you! – kvgr
    Frames sent to SurfaceView are gone -- they're consumed by SurfaceFlinger, in a different process. If you want to send frames to two different destinations, i.e. the on-screen viewer and the encoder, then you have to send the output to a SurfaceTexture and render that on the two different Surfaces. (Internally, 5.x BufferQueues can multiplex, but I don't believe that's exposed in the API yet.) – fadden Jul 15 '15 at 18:08
  • Thank you. But why are the frames consumed by SurfaceFlinger? They are not displayed in any view. They are just in surface that is produced by MediaCodec. – kvgr Jul 16 '15 at 07:20
  • Frames sent to SurfaceView are consumed whether the Surface is visible or not. Bear in mind that a "Surface" is a queue of system-allocated buffers with a producer-consumer interface, and frames are passed around by reference... it's not just a buffer in memory that you can read and write from. If you give the SurfaceView's Surface to MediaCodec, they're going to SurfaceFlinger. If you give a SurfaceTexture's Surface to MediaCodec, they will stay in-process, because SurfaceTexture acts as the consumer (converting the frames to GLES textures). – fadden Jul 16 '15 at 15:39
  • Thank you! It took me a while to understand it, but things started working! – kvgr Jul 23 '15 at 13:39
  • @fadden, I hope you are writing a book or two. – Dominic Cerisano Dec 14 '15 at 05:37
  • @kvgr/@fadden: Any chance you can provide the solution? I am running into the same problem. I can play the video to a surface but I cannot play the video to the encoder's surface (to encode it and view it at the same time). I understand I'll need to rely on a SurfaceTexture but how would I render it to two different surfaces? – John Smith May 14 '19 at 22:09
  • It seems I can use a SurfaceTexture-backed Surface for MediaPlayer. Then somehow get those frames and push them to the encoder's Surface (obtained via createInputSurface). My question is specifically, how do I get the frames from the SurfaceTexture and push them to the encoder's Surface? – John Smith May 14 '19 at 22:36
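
Edit 2:

Based on fadden's comments, this is my current understanding of the pipeline, written down as an untested sketch. EglCore, WindowSurface, FullFrameRect and Texture2dProgram are the helper classes from Google's Grafika project; the method names and the single-threaded setup are my own simplifications, so treat it as a starting point rather than working code:

// Untested sketch: render each decoded frame twice, once to the screen and
// once to the encoder's input surface. Runs on one thread with a Looper.
private EglCore mEglCore;
private WindowSurface mDisplaySurface;   // wraps the TextureView's Surface
private WindowSurface mEncoderSurface;   // wraps encoder.createInputSurface()
private FullFrameRect mFullFrame;
private int mTextureId;
private SurfaceTexture mPlayerTexture;   // MediaPlayer decodes into this
private final float[] mTmpMatrix = new float[16];

// Called from onSurfaceTextureAvailable() in place of preparePlayer();
// prepareEncoder() must have run already so encoderSurface exists.
private void setUpPipeline(SurfaceTexture displayTexture) {
    mEglCore = new EglCore(null, EglCore.FLAG_RECORDABLE);
    mDisplaySurface = new WindowSurface(mEglCore, new Surface(displayTexture), true);
    mDisplaySurface.makeCurrent();

    mFullFrame = new FullFrameRect(
            new Texture2dProgram(Texture2dProgram.ProgramType.TEXTURE_EXT));
    mTextureId = mFullFrame.createTextureObject();

    // MediaPlayer renders into this SurfaceTexture, which acts as the
    // in-process consumer instead of SurfaceFlinger.
    mPlayerTexture = new SurfaceTexture(mTextureId);
    mPlayerTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
        @Override
        public void onFrameAvailable(SurfaceTexture st) {
            drawFrame();
        }
    });
    preparePlayer(new Surface(mPlayerTexture));

    mEncoderSurface = new WindowSurface(mEglCore, encoderSurface, true);
}

private void drawFrame() {
    // Latch the newest decoded frame into the GLES external texture.
    mDisplaySurface.makeCurrent();
    mPlayerTexture.updateTexImage();
    mPlayerTexture.getTransformMatrix(mTmpMatrix);

    // Pass 1: render the frame to the screen.
    GLES20.glViewport(0, 0, mDisplaySurface.getWidth(), mDisplaySurface.getHeight());
    mFullFrame.drawFrame(mTextureId, mTmpMatrix);
    mDisplaySurface.swapBuffers();

    // Pass 2: render the same texture to the encoder's input surface.
    mEncoderSurface.makeCurrent();
    GLES20.glViewport(0, 0, 432, 240);   // must match the encoder's size
    mFullFrame.drawFrame(mTextureId, mTmpMatrix);
    mEncoderSurface.setPresentationTime(mPlayerTexture.getTimestamp());
    mEncoderSurface.swapBuffers();
}

Grafika's "Show + capture camera" activity demonstrates the same two-destination rendering, with the camera in place of MediaPlayer.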
