
I managed to compile the ffmpeg libraries for Android and I am able to load them in my JNI Android app. I have now started calling the ffmpeg functions, following the online tutorials. I read there that I also need the SDL port for Android in order to render audio/video.

I have looked at libsdl and it seems quite complex to use on Android, especially since I just need sequential rendering of audio/video media samples. Is there some simpler solution (with examples) for rendering the decoded buffers? Should I pass the decoded media buffers from ffmpeg back to Java for rendering?

If I have to use SDL, is there some tutorial on how to easily integrate it with ffmpeg on Android?

genpfault
Sasha Nikolic

2 Answers


Some suggestions:

ffmpeg decodes video to the pix_fmt yuv420p; you need to convert it to BGR32 or RGB565 if you plan to use a Bitmap and SurfaceView to render the video frames.
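For reference, the per-pixel math behind that conversion looks roughly like this. This is a self-contained sketch using the common BT.601 integer approximation; in native code you would normally let libswscale (sws_getContext/sws_scale) convert the whole frame instead of doing it by hand:

```java
// Sketch: converting one yuv420p pixel value to a packed RGB565 short.
// Uses the BT.601 integer approximation (an assumption; exact coefficients
// depend on the source's color space).
public class Yuv420ToRgb565 {

    // Clamp an intermediate result to the valid 8-bit channel range.
    static int clamp(int v) {
        return v < 0 ? 0 : (v > 255 ? 255 : v);
    }

    public static int pixelToRgb565(int y, int u, int v) {
        int c = y - 16, d = u - 128, e = v - 128;
        int r = clamp((298 * c + 409 * e + 128) >> 8);
        int g = clamp((298 * c - 100 * d - 208 * e + 128) >> 8);
        int b = clamp((298 * c + 516 * d + 128) >> 8);
        // Pack the 8-bit channels into 5-6-5 bits.
        return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3);
    }
}
```

Remember that in yuv420p the U and V planes are subsampled (one chroma sample per 2x2 block of luma samples), so each U/V value is shared by four pixels.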

On Android 2.2+, there are C functions for copying pixel data into a Bitmap: get a pointer with AndroidBitmap_lockPixels, write the frame, and release it with AndroidBitmap_unlockPixels.

If your device doesn't support that, you can put the binary data into a ByteBuffer and use Bitmap.copyPixelsFromBuffer().
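A rough sketch of that ByteBuffer route, assuming the native side hands back a byte[] of RGB565 pixels (2 bytes per pixel). Bitmap and SurfaceView are Android classes, so the final calls are shown as comments here:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Sketch: wrap a decoded frame in a ByteBuffer for Bitmap.copyPixelsFromBuffer().
public class FrameBufferBridge {

    public static ByteBuffer wrapFrame(byte[] rgb565Pixels) {
        // A direct buffer avoids an extra copy when the data comes from JNI.
        ByteBuffer buf = ByteBuffer.allocateDirect(rgb565Pixels.length)
                                   .order(ByteOrder.nativeOrder());
        buf.put(rgb565Pixels);
        buf.rewind(); // copyPixelsFromBuffer reads from the current position
        return buf;
    }
}

// On Android, per decoded frame:
//   Bitmap bmp = Bitmap.createBitmap(w, h, Bitmap.Config.RGB_565);
//   bmp.copyPixelsFromBuffer(FrameBufferBridge.wrapFrame(pixels));
//   then draw bmp onto the SurfaceView's Canvas.
```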

If your SurfaceView will be scaled up, you should handle the anti-aliasing problem: How to anti alias for SurfaceView in android?

If OpenGL is the preferred solution, note that a continuously running GLThread may consume more CPU time; draw-on-dirty is better.
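The draw-on-dirty idea can be sketched like this: the render loop only redraws when the decoder has posted a new frame. On Android's GLSurfaceView the same effect comes from setRenderMode(RENDERMODE_WHEN_DIRTY) plus calling requestRender() after each decoded frame; the class below is a plain-Java illustration of the pattern, not Android API:

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Sketch: render only when a new frame is pending, instead of continuously.
public class DirtyRenderLoop {
    private final AtomicBoolean dirty = new AtomicBoolean(false);
    private int framesDrawn = 0;

    // Called from the decoder thread after each decoded frame.
    public void requestRender() {
        dirty.set(true);
    }

    // One iteration of the render loop: draw only if a frame is pending.
    public boolean step() {
        if (dirty.compareAndSet(true, false)) {
            framesDrawn++;   // here: actually draw the frame
            return true;
        }
        return false;        // nothing new: skip the redraw, save CPU
    }

    public int framesDrawn() {
        return framesDrawn;
    }
}
```

Note that multiple requestRender() calls between redraws coalesce into one draw, which is exactly what you want when decoding runs ahead of rendering.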

The topic is so interesting. :)

qrtt1

I think dolphin-player (http://code.google.com/p/dolphin-player/) for Android uses libSDL.

hamlet