
I am developing an H264 H/W accelerated video decoder for Android. So far, I've come across some libraries: MediaCodec, Stagefright, OpenMax IL, OpenMax AL and FFmpeg. After a bit of research, I've found the following:

  1. I found a great resource on using Stagefright with FFmpeg, but I cannot use FFmpeg because of its license; it is quite restrictive for distributed software. (Or is it possible to drop FFmpeg from this approach?)

  2. I cannot use MediaCodec because it is a Java API and I would have to call it via JNI from the C++ layer, which is relatively slow and which I am not allowed to do.

  3. I cannot use OpenMax AL as it only supports decoding of an MPEG-2 transport stream via a buffer queue. This rules out passing raw H264 NALUs or other media formats for that matter.

  4. That leaves only Stagefright and OpenMax IL. I came to know that Stagefright uses the OpenMax (OMX) interface internally. So should I go with Stagefright or OpenMax IL? Which will be more promising?

Also, I came to know that the Android H/W accelerated decoder is vendor specific and every vendor has its own OMX interfacing API. Is that true? If so, do I need to write a H/W vendor specific implementation in the case of OpenMax IL? What about Stagefright? Is it hardware agnostic or hardware dependent? If there is no way to get a H/W independent implementation using Stagefright or OpenMax IL, I need to support at least Qualcomm's Snapdragon, Samsung's Exynos and Tegra 4.

Note that I need to decode an H264 Annex B stream and expect decoded data as output, which I will send to my video rendering pipeline. So basically, I only need the decoder module.

I am really confused. Please point me in the right direction. Thanks in advance!

EDIT

My software is for commercial purposes and the source code is private as well. I am also restricted from using FFmpeg by the client. :)

Kaidul
  • Possible duplicate of [How to use hardware accelerated video decoding on Android?](http://stackoverflow.com/questions/11321825/how-to-use-hardware-accelerated-video-decoding-on-android) – Ciro Santilli OurBigBook.com Feb 22 '16 at 20:42

2 Answers


You really should go for MediaCodec. Calling Java methods via JNI does have some overhead, but you should keep in mind what order of magnitude that overhead is. If you were calling a function per pixel, the JNI call overhead might be problematic. But for using MediaCodec, you only make a few function calls per frame, and the overhead there is negligible.
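To give a sense of what "a few function calls per frame" means in practice, here is a rough sketch of a MediaCodec decode loop in Java. The `FrameSource`/`FrameSink` interfaces and their methods are hypothetical placeholders for your own I/O, and `getInputBuffer`/`getOutputBuffer` require API 21+; on older versions you would use the `getInputBuffers()`/`getOutputBuffers()` arrays instead.

```java
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaFormat;

public final class AvcDecodeLoop {

    /** Source/sink callbacks supplied by the caller; hypothetical, not part of the SDK. */
    public interface FrameSource { byte[] readAccessUnit(); long presentationTimeUs(); }
    public interface FrameSink { void onFrame(ByteBuffer data, MediaCodec.BufferInfo info); }

    public static void run(FrameSource source, FrameSink sink, int width, int height)
            throws Exception {
        // One decoder instance for the whole stream; "video/avc" selects an H264 decoder,
        // typically the hardware one on real devices.
        MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
        codec.configure(MediaFormat.createVideoFormat("video/avc", width, height),
                null /* surface */, null /* crypto */, 0 /* flags */);
        codec.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;
        boolean outputDone = false;
        while (!outputDone) {
            // Input side: at most one dequeue/queue pair per access unit.
            if (!inputDone) {
                int inIndex = codec.dequeueInputBuffer(10_000 /* us */);
                if (inIndex >= 0) {
                    byte[] au = source.readAccessUnit();
                    if (au == null) {
                        codec.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        ByteBuffer in = codec.getInputBuffer(inIndex); // API 21+
                        in.clear();
                        in.put(au);
                        codec.queueInputBuffer(inIndex, 0, au.length,
                                source.presentationTimeUs(), 0);
                    }
                }
            }
            // Output side: one dequeue/release pair per decoded frame.
            int outIndex = codec.dequeueOutputBuffer(info, 10_000 /* us */);
            if (outIndex >= 0) {
                sink.onFrame(codec.getOutputBuffer(outIndex), info); // API 21+
                codec.releaseOutputBuffer(outIndex, false /* do not render to a surface */);
                outputDone = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
            }
        }
        codec.stop();
        codec.release();
    }
}
```

That is the entire per-frame surface area you would be crossing JNI for, regardless of whether you drive it from Java or from C++.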

See e.g. http://git.videolan.org/?p=vlc.git;a=blob;f=modules/codec/omxil/mediacodec_jni.c;h=57df9889c97706436823a4960206e323565e221c;hb=b31df501269b56c65327be181cdca3df48946fb1 as an example of using MediaCodec from C code via JNI. Since others have gone this way as well, I can assure you that the JNI overhead is not a reason to consider other APIs than MediaCodec.

Using stagefright or OMX directly is problematic; the ABI differs between platform versions (so you can either target only one version, or compile multiple times targeting different versions and package it all up in one package), and you'd have to deal with a lot of device-specific quirks, while MediaCodec should (and on modern versions does) work the same across all devices.

mstorsjo
  • Thanks for your answer! I was suggested MediaCodec in several threads but I avoided the advice. Now I am going to do a quick test and see how the performance is. – Kaidul Sep 07 '15 at 11:21
  • I saw that the VLC MediaCodec JNI wrapper uses OMX-specific headers; do I need them just to call MediaCodec Java-layer methods from the native layer? – Kaidul Oct 13 '15 at 05:37
  • You don't need the OMX specific headers, but in VLC they are used as a convenience since there is some shared code. The only thing shared is the handling of interpreting a MediaCodec color format, which happens to be the same as for OMX. (That is, `MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar` is equivalent to `OMX_COLOR_FormatYUV420SemiPlanar`; there's shared code for copying data from MediaCodec/OMX buffers into VLC's own pictures, with different codepaths for different color formats.) – mstorsjo Oct 13 '15 at 06:07
  • Thanks! There is a function called `MediaCodec_getName()` http://git.videolan.org/?p=vlc.git;a=blob;f=modules/codec/omxil/mediacodec_jni.c;h=57df9889c97706436823a4960206e323565e221c;hb=b31df501269b56c65327be181cdca3df48946fb1#l285 which contains all the `OMX*` stuff. Can I skip it by just calling the `MediaCodec.getName()` Java API? – Kaidul Oct 13 '15 at 09:38
  • The `MediaCodec_getName()` function you linked to is a more elaborate version of doing `MediaCodec.createDecoderByType`, by manually iterating over `MediaCodecList`. This allows it to skip some decoders that are known not to work for VLC's use case. – mstorsjo Oct 13 '15 at 11:24
  • Thank you! One last question: can I wrap all the `MediaCodec` methods with a custom Java method and call that custom method via JNI? I mean, all the MediaCodec functions would then be called inside that one function. Is there any difference between this approach and calling the MediaCodec functions one by one using JNI? – Kaidul Oct 14 '15 at 03:28
  • That should be just as good; there's really no difference. The overhead of JNI calls isn't big enough to matter, but wrapping them can make your code a bit cleaner and easier to read (see the wrapper sketch below). – mstorsjo Oct 14 '15 at 07:08
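A minimal sketch of such a wrapper, purely as an illustration (the class name `NativeDecoderBridge` and its method names are hypothetical, not taken from VLC or the Android SDK). The native C++ side would hold a single global reference to an instance and cross JNI once per access unit via `decodeFrame()`:

```java
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaFormat;

/**
 * Hypothetical Java-side wrapper: native code keeps one global reference to an instance
 * and crosses JNI only once per access unit (decodeFrame), instead of once per
 * individual MediaCodec call.
 */
public final class NativeDecoderBridge {
    private final MediaCodec codec;
    private final MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();

    public NativeDecoderBridge(int width, int height) throws Exception {
        codec = MediaCodec.createDecoderByType("video/avc");
        codec.configure(MediaFormat.createVideoFormat("video/avc", width, height),
                null, null, 0);
        codec.start();
    }

    /** Called from native code via JNI: feed one access unit, return a decoded frame or null. */
    public byte[] decodeFrame(byte[] accessUnit, long ptsUs) {
        int inIndex = codec.dequeueInputBuffer(10_000);
        if (inIndex >= 0) {
            ByteBuffer in = codec.getInputBuffer(inIndex); // API 21+
            in.clear();
            in.put(accessUnit);
            codec.queueInputBuffer(inIndex, 0, accessUnit.length, ptsUs, 0);
        }
        int outIndex = codec.dequeueOutputBuffer(info, 10_000);
        if (outIndex < 0) {
            return null; // no decoded frame available yet
        }
        // Copy the frame out before releasing the codec-owned buffer.
        ByteBuffer out = codec.getOutputBuffer(outIndex);
        byte[] frame = new byte[info.size];
        out.position(info.offset);
        out.get(frame, 0, info.size);
        codec.releaseOutputBuffer(outIndex, false);
        return frame;
    }

    public void release() {
        codec.stop();
        codec.release();
    }
}
```

Whether you expose one coarse method like this or call the individual MediaCodec methods from native code is purely a matter of code organization; the performance is effectively the same either way.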

> I found a great resource on using Stagefright with FFmpeg, but I cannot use FFmpeg because of its license; it is quite restrictive for distributed software. (Or is it possible to drop FFmpeg from this approach?)

That's not true. FFmpeg is LGPL, so you can just use it in your commercially redistributable application.

However, you might be using GPL-licensed components, e.g. an FFmpeg build with libx264 enabled. In that case, your program must be GPL-compliant.

But even that is not bad for distributed software: it just means that you need to give your customers (who should be kings anyway) access to the source code of the application they are paying for, and that you are not allowed to restrict their freedoms. Not a bad deal, IMHO.

> Also, I came to know that the Android H/W accelerated decoder is vendor specific and every vendor has its own OMX interfacing API. Is that true?

Obviously, yes. If you need hardware acceleration, someone has to write a program that makes your specific hardware accelerate something.
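If you want to see that vendor specificity for yourself, here is a small sketch that lists the H264 decoder component names available on a device via `MediaCodecList`. The vendor-specific names mentioned in the comment are only illustrative examples; the actual strings differ per vendor and device.

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

public final class ListAvcDecoders {

    /**
     * Prints the name of every H264 decoder on the device. On most devices the hardware
     * decoder appears under a vendor-specific OMX component name (for example something
     * like "OMX.qcom.video.decoder.avc" on Snapdragon or "OMX.Exynos.avc.dec" on Exynos;
     * the exact strings vary per vendor and are shown here only as illustrations).
     */
    public static void printAvcDecoders() {
        int count = MediaCodecList.getCodecCount();
        for (int i = 0; i < count; i++) {
            MediaCodecInfo codecInfo = MediaCodecList.getCodecInfoAt(i);
            if (codecInfo.isEncoder()) {
                continue;
            }
            for (String type : codecInfo.getSupportedTypes()) {
                if (type.equalsIgnoreCase("video/avc")) {
                    System.out.println(codecInfo.getName());
                }
            }
        }
    }
}
```

MediaCodec hides those vendor components behind one API; going through OMX IL directly means dealing with each of those components (and their quirks) yourself.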

Marcus Müller