
I have seen the example below for encode/decode using the MediaCodec API: https://android.googlesource.com/platform/cts/+/jb-mr2-release/tests/tests/media/src/android/media/cts/EncodeDecodeTest.java

In it there is a comparison of the expected presentation time and the presentation time received from the decoded BufferInfo:

assertEquals("Wrong time stamp", computePresentationTime(checkIndex),
    info.presentationTimeUs);

Because the decoder just decodes the data in the encoded buffer, I assumed there must be some timestamp info that can be parsed from the encoder's output H.264 stream.

I am writing an Android application which muxes an H.264 stream (.h264) encoded by MediaCodec into an mp4 container using ffmpeg (libavformat). I don't want to use MediaMuxer because it requires Android 4.3 (API 18), which is too high.

However, ffmpeg does not seem to recognize the presentation timestamp in a packet encoded by MediaCodec, so I always get AV_NOPTS_VALUE when trying to read a frame from the stream.

Does anyone know how to get the correct presentation timestamp in this situation?

asked by Phan Lac Phuc (edited by Ganesh)
  • The basic problem is that the H.264 stream doesn't include presentation time stamps. They have to be delivered out of band, or written to a wrapper (e.g. the .mp4 file wrapper that MediaMuxer creates). – fadden May 20 '14 at 14:56
  • But why the decoded info from decoder's queue show the correct timestamp? – Phan Lac Phuc May 21 '14 at 18:18
  • In the EncodeDecodeTest code, the PTS is generated by `computePresentationTime()`, passed to the encoder through `queueInputBuffer()` or `eglPresentationTime()`, received with the output buffer in BufferInfo, passed to the decoder with the input buffer with `queueInputBuffer()`, and then received with the output in the BufferInfo. If you look at VideoChunks in the DecodeEditEncodeTest, you can see it saving three pieces for each frame (encoded data, flags, and PTS). The timestamp is associated with a buffer while inside the codec, but it's simply passed through. – fadden May 21 '14 at 20:46
  • Thank you fadden. Now I know how the `MediaCodec` encoder/decoder work. – Phan Lac Phuc May 22 '14 at 01:18

1 Answer


To send timestamps from the MediaCodec encoder to ffmpeg, you need to convert them like this:

jint Java_com_classclass_WriteVideoFrame(JNIEnv *env, jobject thiz,
        jbyteArray data, jint datasize, jlong timestamp) {

    ....

    AVPacket pkt;
    av_init_packet(&pkt);

    AVCodecContext *c = m_pVideoStream->codec;

    /* incoming timestamp is in milliseconds; rescale it to the
     * stream's time base (use int64_t, not long, which is 32-bit
     * on 32-bit Android) */
    pkt.pts          = (int64_t)((double)timestamp * (double)c->time_base.den / 1000.0);
    pkt.stream_index = m_pVideoStream->index;
    pkt.data         = rawjBytes;   /* bytes obtained from the jbyteArray */
    pkt.size         = datasize;

    ....
}

where `time_base` depends on the frame rate.

Update, re: timestamp flow in the pipeline: neither the decoder nor the encoder knows timestamps on its own. Timestamps are set on these components via

decoder.queueInputBuffer(inputBufIndex, 0, info.size, info.presentationTimeUs, info.flags);

or

encoder.queueInputBuffer(inputBufIndex, 0, 0, ptsUsec, info.flags);

These timestamps could be taken from the extractor, taken from the camera, or generated by the app, but the decoder/encoder just passes them through without changing them. As a result, timestamps go unchanged from source to sink (the muxer).

For sure there are some exceptions: if the frame frequency is changed (frame-rate conversion, for example), or if the encoder produces B-frames and reordering happens. An encoder can also add timestamps to the encoded frame header, but that is optional, not mandated by the standard. I think none of this applies to current Android versions, codecs, or your usage scenario.

Marlon
  • Thanks for your suggestion. I know how to send timestamps from the `MediaCodec` encoder to `ffmpeg`, but I don't know why only the `MediaCodec` decoder is able to get the timestamp info from the encoded data. – Phan Lac Phuc May 21 '14 at 18:47