I managed to compile the ffmpeg libraries for Android, and I am able to load them in my JNI Android app. Now I have started calling the ffmpeg functions, following the online tutorials. I read there that I also need the SDL port for Android in order to render audio/video.
I have looked at libsdl, and it seems quite complex to use on Android, especially since I just need sequential rendering of audio/video media samples. Is there a simpler solution (with examples) for rendering the decoded buffers? Should I pass the decoded media buffers from ffmpeg back to Java for rendering?
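To make the second option concrete, here is a minimal sketch of the hand-off pattern I have in mind, with hypothetical names (`MediaBridge`, `onAudioFrame`, `decodeLoopStub` are my own, not from any tutorial). In the real app, the native ffmpeg decode loop would call back into `onAudioFrame()` through JNI, and the Java side would write each PCM buffer to an `android.media.AudioTrack` configured in `MODE_STREAM`; here a plain Java stub stands in for the native side so the pattern itself is visible:

```java
import java.util.ArrayList;
import java.util.List;

public class MediaBridge {
    // Java side: collect the decoded PCM buffers as they arrive.
    // On Android this is where audioTrack.write(pcm, 0, pcm.length)
    // would go, after setting up the AudioTrack in MODE_STREAM.
    private final List<byte[]> rendered = new ArrayList<>();

    // Called (via JNI in the real app) once per decoded audio frame.
    public void onAudioFrame(byte[] pcm) {
        rendered.add(pcm); // real app: audioTrack.write(pcm, 0, pcm.length)
    }

    public int framesReceived() {
        return rendered.size();
    }

    // Stand-in for the native decode loop; in the real app this would be
    // declared `public native void decodeLoop(...)` and implemented in C
    // against libavcodec.
    public void decodeLoopStub(int frames) {
        for (int i = 0; i < frames; i++) {
            onAudioFrame(new byte[4096]); // pretend ffmpeg produced a PCM chunk
        }
    }

    public static void main(String[] args) {
        MediaBridge bridge = new MediaBridge();
        bridge.decodeLoopStub(3);
        System.out.println(bridge.framesReceived());
    }
}
```

Video would presumably follow the same shape, with the callback delivering raw frames to a `SurfaceView` (or the native code writing into an `ANativeWindow` directly). I don't know if the JNI round-trip per buffer is fast enough in practice, which is partly what I'm asking.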
If I do have to use SDL, is there a tutorial on how to integrate it with ffmpeg on Android easily?