In general, Android has all the requisite functions built into its framework, and they can be accessed from both the native and Java layers. All the component names below have a Java abstraction and a native implementation, which you can consider using in your project as appropriate. For your questions:
1) I would recommend using MediaPlayer directly, as found in this JNI implementation; the actual implementation can be found here. However, should you need to build your own pipeline, I would recommend employing the available building blocks: MediaExtractor for demuxing into separate tracks, OMXCodec for the codecs, and I presume you would manage the audio routing and the Surface handling for video yourself in your implementation. A good reference for you would be the SimplePlayer example implementation.
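To make the demux step concrete, here is a minimal Java-layer sketch using MediaExtractor (the Java abstraction of the native extractor mentioned above). The class name and file path are placeholders of mine, and the decode loop that would drain the selected tracks (typically one decoder per track) is omitted:

```java
import android.media.MediaExtractor;
import android.media.MediaFormat;

import java.io.IOException;

// Illustrative sketch only: open an MP4/AVI file with MediaExtractor and
// select the first audio and video tracks for demuxing. Error handling is
// kept minimal; the path is a placeholder.
public final class TrackSelector {

    public static void selectAvTracks(MediaExtractor extractor, String path)
            throws IOException {
        extractor.setDataSource(path);
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            String mime = format.getString(MediaFormat.KEY_MIME);
            if (mime != null
                    && (mime.startsWith("audio/") || mime.startsWith("video/"))) {
                // Selected tracks are then drained in a loop via
                // readSampleData()/advance() and fed to the codecs.
                extractor.selectTrack(i);
            }
        }
    }
}
```

From there, each selected track's compressed samples would be pushed into the corresponding codec, with the video codec rendering to your Surface.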
2) If you wish to employ a demuxer from FFMPEG, you will have to integrate it into the list of MediaExtractors supported by the system. Since your primary interest is MP4 and AVI, both of which are already supported, I feel you can avoid this effort. If you are still interested in FFMPEG integration, please raise it as a separate question, as the answer is quite elaborate.
3) I feel this statement may not be entirely true. The NDK can support stagefright if required, but the philosophy is different here. For playback, the NDK supports an OpenMAX AL player application, as found in this html file along with an example here. However, please note that only MPEG-2 TS is supported in the OpenMAX AL implementation, which may not suit your requirements.
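For reference, the engine bring-up of such an OpenMAX AL player looks roughly like the following C sketch, modeled on the NDK's native-media sample. This is not a complete player: CreateMediaPlayer and the Android buffer-queue data source that carries the MPEG-2 TS stream are elided, and error handling is reduced to asserts.

```c
#include <assert.h>
#include <OMXAL/OpenMAXAL.h>
#include <OMXAL/OpenMAXAL_Android.h>

// Illustrative sketch: create and realize the OpenMAX AL engine from native
// code, then fetch the engine interface that is later used to create the
// media player object.
static XAObjectItf engineObject = NULL;
static XAEngineItf engineItf = NULL;

void createEngine(void) {
    XAresult res = xaCreateEngine(&engineObject, 0, NULL, 0, NULL, NULL);
    assert(XA_RESULT_SUCCESS == res);

    // Realize the engine synchronously before any interface is queried.
    res = (*engineObject)->Realize(engineObject, XA_BOOLEAN_FALSE);
    assert(XA_RESULT_SUCCESS == res);

    res = (*engineObject)->GetInterface(engineObject, XA_IID_ENGINE,
                                        &engineItf);
    assert(XA_RESULT_SUCCESS == res);

    // (*engineItf)->CreateMediaPlayer(...) would follow, with an
    // XADataLocator_AndroidBufferQueue source supplying MPEG-2 TS packets.
}
```

Note again that this path only accepts MPEG-2 TS input on Android, which is why it may not fit an MP4/AVI use case.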