Problem: I have a video stream from a GoPro camera in .m3u8 format. I need to display the content of the stream in my application and then stream the video out. For streaming, I have a library that works with MediaCodec and ffmpeg; I should be able to stream the outputBuffer of MediaCodec.
I can play the stream with MediaPlayer on a SurfaceView, no problem. The real struggle comes with using MediaCodec. I can initialize the MediaCodec, get its input surface and hand it to the MediaPlayer - the stream plays - I can hear the sound.
Now the questions:
1. How do I display the stream on the SurfaceView - or on any other view that both MediaPlayer and MediaCodec can work with? What is the best way to get the data from a Surface onto a SurfaceView/TextureView?
2. I am not able to get the data out of MediaCodec. I only need to support 5.0+ devices, so I tried to use MediaCodec.Callback, but I get an error every time and I haven't found what the error means. What can I do to make the Callback work?
3. Is there any "easy" way of doing this, or do I have to go deep into the buffers and surfaces as in Google Grafika?
Here are the errors I get:
E/ACodec﹕ [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -2147483648
1877-1904/com.example.marek.goprorecorder E/ACodec﹕ [OMX.qcom.video.encoder.avc] ERROR(0x80001009)
07-15 15:54:47.411 1877-1904/com.example.marek.goprorecorder E/ACodec﹕ signalError(omxError 0x80001009, internalError -2147483648)
07-15 15:54:47.411 1877-1903/com.example.marek.goprorecorder E/MediaCodec﹕ Codec reported err 0x80001009, actionCode 0, while in state 6
Error mess: android.media.MediaCodec$CodecException: Error 0x80001009
07-15 15:54:47.454 1877-1877/com.example.marek.goprorecorder D/MediaCodecCallback﹕ Error diag: android.media.MediaCodec.error_neg_2147479543
07-15 15:54:47.454 1877-1877/com.example.marek.goprorecorder D/MediaCodecCallback﹕ Error rec: false
07-15 15:54:47.454 1877-1877/com.example.marek.goprorecorder D/MediaCodecCallback﹕ Error tra: false
And here is my Activity:
public class MainActivity extends ActionBarActivity implements MediaPlayer.OnPreparedListener {

    private final MediaPlayer mp = new MediaPlayer();
    private Surface encoderSurface;

    private static final String MIME_TYPE = "video/avc"; // H.264 Advanced Video Coding
    private static final int FRAME_RATE = 30;             // 30 fps
    private static final int IFRAME_INTERVAL = 0;         // seconds between I-frames; 0 asks for key frames as often as possible
    private static final File OUTPUT_DIR = Environment.getExternalStorageDirectory();

    MediaCodec encoder = null;
    MediaCodec.BufferInfo mBufferInfo;
    TextureView surfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        int width = 432;
        int height = 240;
        prepareEncoder(width, height, 600000);
    }

    private void prepareEncoder(int width, int height, int bitRate) {
        mBufferInfo = new MediaCodec.BufferInfo();

        MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, bitRate);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);

        try {
            encoder = MediaCodec.createEncoderByType(MIME_TYPE);
        } catch (IOException e) {
            e.printStackTrace();
        }

        CodecCallback callback = new CodecCallback();
        encoder.setCallback(callback);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // createInputSurface() must be called after configure() and before start()
        encoderSurface = encoder.createInputSurface();
        preparePlayer(encoderSurface);
        encoder.start();
    }
    void preparePlayer(Surface s) {
        try {
            //mp.setDataSource("http://10.5.5.9:8080/live/amba.m3u8");
            mp.setDataSource("http://playertest.longtailvideo.com/adaptive/oceans_aes/oceans_aes.m3u8");
            mp.setSurface(s);
            // register the listener before preparing so the prepared callback cannot be missed
            mp.setOnPreparedListener(this);
            mp.prepare();
        } catch (IllegalArgumentException e) {
            e.printStackTrace();
        } catch (IllegalStateException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onPrepared(MediaPlayer mp) {
        mp.start();
    }
    private class CodecCallback extends MediaCodec.Callback {
        @Override
        public void onInputBufferAvailable(MediaCodec codec, int index) {
            // Not used: the encoder is fed through its input Surface, not through input buffers.
            Log.d("MediaCodecCallback", "InputBufferAvailable " + index);
        }

        @Override
        public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
            // This is where the encoded data should eventually go to the streaming library.
            ByteBuffer outBuffer = codec.getOutputBuffer(index);
            Log.d("MediaCodecCallback", "OutputBuffer Position: " + outBuffer.position());
            Log.d("MediaCodecCallback", "OutputBuffer Limit: " + outBuffer.limit());
            Log.d("MediaCodecCallback", "OutputBuffer Capacity: " + outBuffer.capacity());
            Log.d("MediaCodecCallback", "OutputBuffer: " + outBuffer.toString());
            codec.releaseOutputBuffer(index, false);
        }

        @Override
        public void onError(MediaCodec codec, MediaCodec.CodecException e) {
            Log.d("MediaCodecCallback", "Error mess: " + e);
            Log.d("MediaCodecCallback", "Error diag: " + e.getDiagnosticInfo());
            Log.d("MediaCodecCallback", "Error rec: " + e.isRecoverable());
            Log.d("MediaCodecCallback", "Error tra: " + e.isTransient());
        }

        @Override
        public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) {
            Log.d("MediaCodecCallback", "OutputFormatChanged: " + format.toString());
        }
    }
}
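For context, in onOutputBufferAvailable I eventually want to hand the encoded frames to my streaming library. Since I can't show that library's API, here is a rough sketch of what I mean, writing the frames to a MediaMuxer instead; mMuxer, mTrackIndex, mMuxerStarted and the "encoded.mp4" file name are placeholders I made up, and the two methods would replace the logging stubs in CodecCallback above:

    private MediaMuxer mMuxer;               // stand-in for the streaming library
    private int mTrackIndex = -1;
    private boolean mMuxerStarted = false;

    @Override
    public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) {
        try {
            // The encoder's real output format (with csd-0/csd-1) arrives here; start muxing.
            mMuxer = new MediaMuxer(new File(OUTPUT_DIR, "encoded.mp4").getAbsolutePath(),
                    MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
            mTrackIndex = mMuxer.addTrack(format);
            mMuxer.start();
            mMuxerStarted = true;
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
        ByteBuffer encodedData = codec.getOutputBuffer(index);
        if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
            // Codec config data is already carried in the MediaFormat above; skip it here.
            info.size = 0;
        }
        if (info.size > 0 && mMuxerStarted) {
            encodedData.position(info.offset);
            encodedData.limit(info.offset + info.size);
            mMuxer.writeSampleData(mTrackIndex, encodedData, info);
        }
        codec.releaseOutputBuffer(index, false);
    }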
Edit 1:
Is this the right way to attach the MediaPlayer output to the TextureView's SurfaceTexture?
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
    Surface s = new Surface(surface);
    preparePlayer(s); // hands the Surface to MediaPlayer and starts playing the stream
}
And can I use the SurfaceTexture I get from onSurfaceTextureUpdated?
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
}
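For completeness, this is where I would attach the listener (in onCreate(), assuming activity_main contains a TextureView; R.id.textureView and the clean-up in onSurfaceTextureDestroyed are just my guesses at the wiring):

    surfaceView = (TextureView) findViewById(R.id.textureView);
    surfaceView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture st, int width, int height) {
            preparePlayer(new Surface(st));
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture st, int width, int height) {
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture st) {
            mp.release();   // stop playback when the view goes away
            return true;    // let the TextureView release the SurfaceTexture itself
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture st) {
        }
    });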
- One more question: do you have an example of how to render a SurfaceTexture to a Surface? Thank you!
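To clarify what I mean by rendering the SurfaceTexture to a Surface, this is the kind of Grafika-style plumbing I think would be needed (based on my reading of Grafika's ContinuousCaptureActivity). It assumes Grafika's gles helper classes (EglCore, WindowSurface, FullFrameRect, Texture2dProgram) are available; the field names, setupGl()/drawFrame() and the two Surface parameters are my own placeholders, and I have not actually gotten this running:

    // Sketch only. All GL/EGL calls must happen on the one thread that owns the EGL
    // context (Grafika posts to a Handler for this); the sketch glosses over that.
    private EglCore mEglCore;
    private WindowSurface mDisplaySurface;   // wraps the TextureView's Surface
    private WindowSurface mEncoderSurface;   // wraps encoder.createInputSurface()
    private FullFrameRect mFullFrameBlit;
    private int mTextureId;
    private SurfaceTexture mPlayerTexture;   // what MediaPlayer actually renders into
    private final float[] mTmpMatrix = new float[16];

    private void setupGl(Surface displaySurface, Surface encoderInputSurface) {
        mEglCore = new EglCore(null, EglCore.FLAG_RECORDABLE);
        mDisplaySurface = new WindowSurface(mEglCore, displaySurface, false);
        mEncoderSurface = new WindowSurface(mEglCore, encoderInputSurface, true);

        mDisplaySurface.makeCurrent();
        mFullFrameBlit = new FullFrameRect(
                new Texture2dProgram(Texture2dProgram.ProgramType.TEXTURE_EXT));
        mTextureId = mFullFrameBlit.createTextureObject();
        mPlayerTexture = new SurfaceTexture(mTextureId);
        mPlayerTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture surfaceTexture) {
                drawFrame();
            }
        });
        // MediaPlayer draws into this SurfaceTexture instead of directly into a view:
        mp.setSurface(new Surface(mPlayerTexture));
    }

    private void drawFrame() {
        // Latch the newest frame from the player and fetch its transform matrix.
        mDisplaySurface.makeCurrent();
        mPlayerTexture.updateTexImage();
        mPlayerTexture.getTransformMatrix(mTmpMatrix);

        // Draw it to the screen...
        GLES20.glViewport(0, 0, mDisplaySurface.getWidth(), mDisplaySurface.getHeight());
        mFullFrameBlit.drawFrame(mTextureId, mTmpMatrix);
        mDisplaySurface.swapBuffers();

        // ...and draw the same frame to the encoder's input surface.
        mEncoderSurface.makeCurrent();
        GLES20.glViewport(0, 0, mEncoderSurface.getWidth(), mEncoderSurface.getHeight());
        mFullFrameBlit.drawFrame(mTextureId, mTmpMatrix);
        mEncoderSurface.swapBuffers();
    }

Is this roughly the amount of plumbing required, or is there a shortcut?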