
I'm trying to capture H.264 frames from the Android camera (encoded by MediaCodec) and pass them to an FFmpeg process running on the same device.

I currently do this by writing the encoded byte arrays I receive from the MediaCodec to a file called out.h264.

Like so:

    FileOutputStream fosVideo = new ...

    ...

    // encoder callback
    @Override
    public void onVideoData(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
        // FileOutputStream has no write(ByteBuffer) overload, so write
        // through the underlying FileChannel instead
        fosVideo.getChannel().write(h264Buffer);
    }
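
For reference, onVideoData is fed from MediaCodec's asynchronous callback, roughly like this (a simplified sketch, not my exact code; `codec` is the configured video encoder):

    codec.setCallback(new MediaCodec.Callback() {
        @Override
        public void onOutputBufferAvailable(MediaCodec mc, int index,
                                            MediaCodec.BufferInfo info) {
            ByteBuffer encoded = mc.getOutputBuffer(index);
            onVideoData(encoded, info);           // hand the encoded chunk to the writer
            mc.releaseOutputBuffer(index, false);
        }

        @Override
        public void onInputBufferAvailable(MediaCodec mc, int index) { /* feed raw frames */ }

        @Override
        public void onOutputFormatChanged(MediaCodec mc, MediaFormat format) { }

        @Override
        public void onError(MediaCodec mc, MediaCodec.CodecException e) { }
    });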

While the h264 file is still being written to, I start the FFmpeg process and provide the h264 file as input:

    ffmpeg -re -i out.h264 -c:v copy -r 30 -loglevel 48 a.mp4

I also tried:

    ffmpeg -re -framerate 25 -i out.h264 -c:v copy -r 30 -loglevel 48 a.mp4
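
For completeness, I launch the process from the app roughly like this (a simplified sketch; `ffmpegPath` is a hypothetical path to the packaged ffmpeg binary, not my exact code):

    Process ffmpeg = new ProcessBuilder(
            ffmpegPath, "-re",
            "-i", "/storage/emulated/0/MOVIES/out.h264",
            "-c:v", "copy", "-r", "30",
            "-loglevel", "48",
            "/storage/emulated/0/hls/a.mp4")
        .redirectErrorStream(true)  // merge stderr into stdout so the log reads as one stream
        .start();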

The FFmpeg process runs anywhere from 10 seconds to a few minutes and then stops abruptly with:

    frame=  330 fps= 29 q=31.0 size=     512kB time=00:00:10.98 bitrate= 381.8kbits/s dup=55 drop=0 speed=0.972x    
    [h264 @ 0xf1863800] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
    [h264 @ 0xf1863b80] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
    [h264 @ 0xf1863f00] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
    *** 1 dup!
    [h264 @ 0xf1864280] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
    Clipping frame in rate conversion by 0.199989
    [h264 @ 0xf1864600] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
    [h264 @ 0xf1862a00] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
    [h264 @ 0xf1862d80] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
    [h264 @ 0xf1863100] nal_unit_type: 1(Coded slice of a non-IDR picture), nal_ref_idc: 1
    *** 1 dup!
    Clipping frame in rate conversion by 0.199989
    *** 1 dup!
    frame=  347 fps= 29 q=31.0 size=     768kB time=00:00:11.53 bitrate= 545.5kbits/s dup=58 drop=0 speed=0.974x    
    Clipping frame in rate conversion by 0.199989
    [out_0_0 @ 0xf182e1e0] EOF on sink link out_0_0:default.
    No more output streams to write to, finishing.
    frame=  349 fps= 29 q=24.8 Lsize=     920kB time=00:00:17.68 bitrate= 426.1kbits/s dup=58 drop=0 speed=1.48x    
    video:631kB audio:282kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.732886%
    Input file #0 (/storage/emulated/0/MOVIES/out.h264):
    Input stream #0:0 (video): 291 packets read (6065016 bytes); 291 frames decoded; 
    Total: 291 packets (6065016 bytes) demuxed
    Input file #1 (/storage/emulated/0/MOVIES/out.aac):
    Input stream #1:0 (audio): 830 packets read (289119 bytes); 
    Total: 830 packets (289119 bytes) demuxed
    Output file #0 (/storage/emulated/0/hls/a.mp4):
    Output stream #0:0 (video): 349 frames encoded; 349 packets muxed (645952 bytes); 
    Output stream #0:1 (audio): 830 packets muxed (289119 bytes); 
    Total: 1179 packets (935071 bytes) muxed
    291 frames successfully decoded, 0 decoding errors

This happens even though the out.h264 file is still being written to. It is as if the FFmpeg process thinks that the file has ended.

Any idea what it could be?

  • I'm experiencing the same problem – Dubon Ya'ar Jan 24 '19 at 14:17
  • "While the h264 file is being written..." Sounds like ffmpeg is catching up to the (apparent) end of your file, and getting an EOF. Is there any documentation that indicates it supports using shared files in this way? Have you considered using pipes? – greeble31 Jan 24 '19 at 15:23
  • @greeble31 I use the `-re` option to keep the speed x1. I tried using pipes but I couldn't get it to work. – Bob Ross Jan 24 '19 at 16:16
  • Note that you don't need ffmpeg to mux encoded h264 frames; you have the MediaMuxer API that does it better. If you do need ffmpeg, use a pipe and not an intermediate file. – Alex Cohn Jan 29 '19 at 11:09
  • @AlexCohn I need ffmpeg for live streaming. I've piped the video, but how can I have another pipe for audio? – Bob Ross Jan 29 '19 at 13:20
  • So, you actually want to pipe output, too? – Alex Cohn Jan 30 '19 at 08:15
  • @AlexCohn No, the output is fine. I need to pipe 2 separate streams, 1 for audio and 1 for video. Currently we're trying to solve it using UDP input for the audio stream and a standard pipe for video (see the sketch after these comments). – Bob Ross Jan 31 '19 at 09:21
  • Probably I am missing something basic here, but MediaMuxer is capable of [doing all this](https://android.googlesource.com/platform/cts/+/kitkat-release/tests/tests/media/src/android/media/cts/MediaMuxerTest.java). – Alex Cohn Jan 31 '19 at 15:38
  • MediaMuxer packages the 2 streams into a non-seekable container (MP4), and that can't be digested by FFmpeg in real time. – Bob Ross Jan 31 '19 at 15:57
  • You could find some 3rd-party libraries, e.g. https://github.com/octiplex/Android-RTMP-Muxer or https://github.com/pedroSG94/rtmp-rtsp-stream-client-java – Alex Cohn Jan 31 '19 at 17:25
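
Since pipes come up repeatedly in the comments, here is a minimal sketch of that approach, under stated assumptions: `ffmpegPath` is a hypothetical path to the ffmpeg binary, `-f h264 -i -` makes ffmpeg read the raw Annex-B stream from standard input, and the audio arrives over UDP on an assumed local port, as in the setup Bob Ross describes:

    // Sketch only: feed the encoder output to ffmpeg's stdin instead of a
    // shared file, so ffmpeg never sees a premature EOF.
    Process ffmpeg = new ProcessBuilder(
            ffmpegPath, "-re",
            "-f", "h264", "-i", "-",                    // video from stdin
            "-f", "aac", "-i", "udp://127.0.0.1:5000",  // audio over UDP (assumed port)
            "-c", "copy",
            "/storage/emulated/0/hls/a.mp4")
        .start();
    OutputStream ffmpegStdin = ffmpeg.getOutputStream(); // connected to ffmpeg's stdin

    // encoder callback, now writing to the pipe
    @Override
    public void onVideoData(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
        byte[] chunk = new byte[info.size];
        h264Buffer.get(chunk);
        try {
            ffmpegStdin.write(chunk); // blocks if ffmpeg falls behind
            ffmpegStdin.flush();
        } catch (IOException e) {
            // ffmpeg exited or the pipe broke
        }
    }

Unlike a shared file, a pipe only reports EOF when the writer closes it, so ffmpeg waits for more data instead of finishing early.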

0 Answers