I am trying to develop a video chat application that uses the H.264 encoder and the MediaCodec library. The video should be shown on both the client and the server. I need a tutorial to learn how to do this: how can I show my camera's video on my device, and how can I send that video to an IP address (peer-to-peer chat)?

Thanks

mori
  • Welcome to stackoverflow.com! What have you tried so far? – Sankar V Feb 21 '14 at 10:23
  • I'm not familiar with MediaCodec. I created a class as in this link (http://stackoverflow.com/questions/21232206/raw-h-264-stream-output-by-mediacodec-not-playble) – mori Feb 21 '14 at 13:04

1 Answer

You can use the standard Camera API to show frames on a SurfaceView, grab frames from the camera, encode them, and send them over the network. You can find all the details about it on Stack Overflow; below are the main ideas.

Open the camera:

     c = Camera.open(index);

Set the camera parameters and set the surface to show the preview on screen through a SurfaceHolder:

     /*
        init parameters first, e.g. preview size, fps range, color format
     */

    surfaceHolder = surfaceView.getHolder();
    surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    surfaceHolder.addCallback(this);    

    c.setPreviewDisplay(surfaceHolder);
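
As a rough sketch of the parameter init mentioned in the comment above (the preview size, format, and fps range here are example values; pick them from the lists the device reports as supported):

    // example values only; query the supported sizes/formats/ranges before choosing
    Camera.Parameters params = c.getParameters();
    params.setPreviewSize(640, 480);                      // must be a supported preview size
    params.setPreviewFormat(ImageFormat.NV21);            // NV21 is the default preview format
    List<int[]> fpsRanges = params.getSupportedPreviewFpsRange();
    int[] range = fpsRanges.get(fpsRanges.size() - 1);    // e.g. the highest supported range
    params.setPreviewFpsRange(range[0], range[1]);
    c.setParameters(params);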

Receive frames from the camera:

    private Camera.PreviewCallback previewListener = new Camera.PreviewCallback()
    {
        @Override
        public void onPreviewFrame(byte[] data, Camera camera)
        {
            // put **data** into the encoder
        }
    };

Put the data into the encoder as described in http://developer.android.com/reference/android/media/MediaCodec.html.
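
A minimal sketch of that step, assuming an H.264 encoder created by type; the width, height, bitrate, and color format below are example values and must match what the camera actually delivers (the NV21 preview data may need converting to the encoder's input layout):

    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);

    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    encoder.start();

    // inside onPreviewFrame(byte[] data, Camera camera):
    long presentationTimeUs = System.nanoTime() / 1000;
    ByteBuffer[] inputBuffers = encoder.getInputBuffers();
    int inputBufferIndex = encoder.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(data);   // NV21 preview data; convert if the encoder expects a different layout
        encoder.queueInputBuffer(inputBufferIndex, 0, data.length, presentationTimeUs, 0);
    }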

Use ffmpeg for streaming: initialize an RTSP connection and send the encoded frames to the ffmpeg muxer/streamer.

    /*
           outputBufferIndex is the index of an output buffer the encoder has filled
    */
    ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
    byte[] outData = new byte[bufferInfo.size];
    outputBuffer.get(outData);

    // something that sends the data to the ffmpeg muxer / network
    writeVideoFrame(outData, outData.length, bufferInfo.presentationTimeUs);

The one tricky thing is building ffmpeg for Android and creating the JNI layer to pass data from the Java level to the native level. I remember there were some prebuilt ffmpeg binaries for Android, and there are also instructions on how to build it.
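
The Java side of that JNI bridge might be declared roughly like this; the class, library, and openStream/closeStream names are just illustrative (only writeVideoFrame matches the call used above), and the native side has to be implemented in C on top of ffmpeg's muxing API:

    public class FfmpegStreamer {
        static {
            // your own .so built from ffmpeg plus a thin C wrapper (the name is an example)
            System.loadLibrary("ffmpegbridge");
        }

        // opens the RTSP output; implemented natively on top of ffmpeg
        public static native boolean openStream(String url, int width, int height);

        // hands one encoded H.264 frame to the native muxer/streamer
        public static native void writeVideoFrame(byte[] data, int length, long presentationTimeUs);

        public static native void closeStream();
    }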

Marlon
  • My problem is how to put data into the MediaCodec lib. Can you give me a full sample? – mori Feb 21 '14 at 19:28
  • Hey, the MediaCodec page http://developer.android.com/reference/android/media/MediaCodec.html has very good sample code :) Simply change createDecoderByType to the encoder's creation and it should work. Load the input frame from the camera and send it to the muxer as I wrote above. – Marlon Feb 22 '14 at 09:09
  • Also, if you target your app for Android 4.3 and above, you can check fadden's page; it has a lot of samples for the encoder and other MediaCodec stuff: http://bigflake.com/mediacodec/ – Marlon Feb 22 '14 at 09:12
  • I recommend against using the `MediaCodec` doc as a template. It's lacking in several ways (too few parameters to `dequeueOutputBuffer()`, no handling of end-of-stream -- the loop will never exit, omits important details about buffer handling, etc). bigflake has small self-contained examples and an FAQ list, Grafika (https://github.com/google/grafika) has some slightly broader examples. None of it really covers audio though, which is a separate barrel of fun, especially when you get into A/V sync during playback. – fadden Feb 22 '14 at 20:03