Lately I have been working with MediaRecorder to capture video and process the output. As it turns out, however, there are security restrictions that prevented me from getting hold of the output stream from MediaRecorder (the problem is described in the question linked below):
"Seekable" file descriptor to use with MediaRecorder Android 6.0 (API 23)
So I had to come up with another solution and decided to use the Camera API and grab the stream there. The first approach was to work with onPreviewFrame, capture the frames, and convert colors and formats myself before encoding with MediaCodec. Luckily, the color-conversion problem can be avoided by feeding the video through a SurfaceTexture instead, as described for example in the bigflake project (see my sketch after the link below):
https://bigflake.com/mediacodec/CameraToMpegTest.java.txt
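As far as I understand it so far, the trick is to configure the encoder with COLOR_FormatSurface and let it create its own input Surface, so the camera frames rendered onto that Surface never have to be color-converted by hand. Here is a minimal sketch of that part; the resolution, bitrate and frame-rate values are just placeholders of mine, and the class and field names are made up, not bigflake's:

```java
import java.io.IOException;

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;

public class SurfaceEncoderSketch {
    private MediaCodec encoder;
    private Surface inputSurface;   // camera frames get drawn onto this via GLES

    public void prepareEncoder(int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        // Input comes from a Surface, not from YUV ByteBuffers,
        // so no manual color conversion is needed.
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // Must be called after configure() and before start().
        inputSurface = encoder.createInputSurface();
        encoder.start();
    }
}
```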
I am not a total newbie in Android/Java, but this is really overwhelming me. I don't want a ready-made recipe, and I am perfectly okay with sitting down for the whole next week and cracking that code, but my questions are: first, how did you come to understand MediaCodec taking video from e.g. a SurfaceTexture and then feeding it into MediaMuxer, and second, could you recommend some tutorials that start with the simplest possible project on this topic and then gradually expand the code?
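To make the first question more concrete, this is roughly how I currently understand the MediaCodec-to-MediaMuxer half of the pipeline after staring at the bigflake code; the class, method and field names are mine, and I may well have gotten details wrong, so please correct me:

```java
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaMuxer;

public class EncoderDrainSketch {
    private int trackIndex = -1;
    private boolean muxerStarted = false;

    // Pulls encoded frames out of the encoder and writes them to the muxer.
    // Assumes the encoder is started and fed via its input Surface, and that
    // the muxer was created with OutputFormat.MUXER_OUTPUT_MPEG_4.
    public void drainEncoder(MediaCodec encoder, MediaMuxer muxer, boolean endOfStream) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            int outIndex = encoder.dequeueOutputBuffer(info, 10_000 /* microseconds */);
            if (outIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                if (!endOfStream) break;   // nothing ready yet; try again on the next frame
            } else if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // Happens once, before the first encoded buffer: the actual output
                // format (including SPS/PPS) is now known, so the muxer can start.
                MediaFormat newFormat = encoder.getOutputFormat();
                trackIndex = muxer.addTrack(newFormat);
                muxer.start();
                muxerStarted = true;
            } else if (outIndex >= 0) {
                ByteBuffer encodedData = encoder.getOutputBuffer(outIndex);
                if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    info.size = 0;         // codec config was already handed over via addTrack()
                }
                if (info.size != 0 && muxerStarted) {
                    encodedData.position(info.offset);
                    encodedData.limit(info.offset + info.size);
                    muxer.writeSampleData(trackIndex, encodedData, info);
                }
                encoder.releaseOutputBuffer(outIndex, false);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    break;                 // end of stream (only signaled when endOfStream == true)
                }
            }
        }
    }
}
```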
I am really trying to work through the bigflake project, but I am lost, not least because even the onCreate method is missing, and the hardest part starts where he begins rendering the video.
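For instance, the part where camera frames arrive on the SurfaceTexture and the encoding thread has to wait for them: if I understand the idea correctly (the names below are mine, not from the bigflake source), it boils down to a frame-available listener plus wait/notify, something like this, before updateTexImage() and the GLES draw onto the encoder's input Surface happen:

```java
import android.graphics.SurfaceTexture;

public class FrameWaiterSketch implements SurfaceTexture.OnFrameAvailableListener {
    private final Object lock = new Object();
    private boolean frameAvailable = false;

    // Called by the camera on its own thread whenever a new preview frame arrives.
    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        synchronized (lock) {
            frameAvailable = true;
            lock.notifyAll();
        }
    }

    // Called from the thread that owns the EGL context, right before
    // surfaceTexture.updateTexImage() and the draw onto the encoder's Surface.
    public void awaitNewFrame(long timeoutMs) throws InterruptedException {
        synchronized (lock) {
            while (!frameAvailable) {
                lock.wait(timeoutMs);
                if (!frameAvailable) {
                    throw new RuntimeException("camera frame wait timed out");
                }
            }
            frameAvailable = false;
        }
    }
}
```

Is that the right mental model for that piece, or am I already off track there?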