
I am trying to use the new MediaSync API to play video and audio in sync: https://developer.android.com/reference/android/media/MediaSync.html

I have the audio-only player working using MediaSync, but for the video-only player, I get this in the logcat right after 5-6 frames are displayed:

I/MediaSync﹕ still waiting to release a buffer before acquire

My video-only player is shown below:

public class VideoDecoderTask implements Runnable {
......
@Override
public void run() {

    mediaSync = new MediaSync();
    mediaSync.setSurface(surface); // output surface (from the TextureView)
    Surface inputSurface = mediaSync.createInputSurface(); // the decoder renders into MediaSync's input surface

    mediaExtractor = new MediaExtractor();

    try {
        mediaExtractor.setDataSource(this.clipPath);
    } catch (IOException e1) {
        e1.printStackTrace();
    }

    for (int i = 0; i < mediaExtractor.getTrackCount(); i++) {
        MediaFormat format = mediaExtractor.getTrackFormat(i);
        String mime = format.getString(MediaFormat.KEY_MIME);
        if (mime.startsWith("video/")) {
            Log.d(LOG_TAG, format.toString());
            mediaExtractor.selectTrack(i);
            try {
                videoDecoder = MediaCodec.createDecoderByType(mime);
            } catch (IOException e) {
                e.printStackTrace();
            }
            videoDecoder.configure(format, inputSurface, null, 0);
            Log.d(LOG_TAG, "Found a video track.");
            break;
        }
    }

    SyncParams syncParams = new SyncParams();
    syncParams.setSyncSource(SyncParams.SYNC_SOURCE_VSYNC);
    mediaSync.setPlaybackParams(new PlaybackParams().setSpeed(1.f)); // speed 1.f = normal playback rate
    mediaSync.setSyncParams(syncParams);

    videoDecoder.setCallback(decoderCallback, null);
    videoDecoder.start();
}

MediaCodec.Callback decoderCallback = new MediaCodec.Callback() {
    @Override
    public void onInputBufferAvailable(MediaCodec codec, int index) {
        if (index >= 0) {
            ByteBuffer byteBuffer = codec.getInputBuffer(index);
            int sampleSize = mediaExtractor.readSampleData(byteBuffer, 0);
            Log.d(LOG_TAG, "SampleSize: " + sampleSize);
            if (sampleSize < 0) {
                //we're at end of file so submit EOS to the decoder input
                Log.d(LOG_TAG, "Video Decoder input EOS reached");
                codec.queueInputBuffer(index, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
            } else {
                long sampleTime = mediaExtractor.getSampleTime();
                codec.queueInputBuffer(index, 0, sampleSize, sampleTime, 0);
                mediaExtractor.advance();
            }
        }
    }

    @Override
    public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
        Log.d(LOG_TAG, "Rendering with preso time: " + info.presentationTimeUs);
        codec.releaseOutputBuffer(index, info.presentationTimeUs);
    }
};

}

The above task is kicked off from the main thread using:

VideoDecoderTask decoderTask = new VideoDecoderTask(clipPath, new Surface(surface));
Thread thread = new Thread(decoderTask);
thread.start();

where surface is the SurfaceTexture obtained from a TextureView.
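
For completeness, the SurfaceTexture comes from the TextureView roughly like this (a sketch of the wiring; the textureView field and listener registration are paraphrased, not exact code from my project):

textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
        // Wrap the TextureView's SurfaceTexture in a Surface for the decoder task.
        VideoDecoderTask decoderTask = new VideoDecoderTask(clipPath, new Surface(surfaceTexture));
        new Thread(decoderTask).start();
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int width, int height) { }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) { return true; }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) { }
});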

My educated guess, based on the message MediaSync generates, is that for some reason the buffers returned in onOutputBufferAvailable are not being released back. What do I need to do to resolve this?

BTW, I am running this on a Nexus 9 running the M developer preview from https://developer.android.com/preview/download.html#images

Harkish

2 Answers


I found one problem: releaseOutputBuffer() expects the render timestamp in nanoseconds, but info.presentationTimeUs is in microseconds. So

codec.releaseOutputBuffer(index, info.presentationTimeUs);

should be

codec.releaseOutputBuffer(index, 1000 * info.presentationTimeUs);
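
In the context of the question's callback, the fix looks like this (a sketch reusing the question's variable names):

@Override
public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
    // releaseOutputBuffer() takes the render timestamp in nanoseconds;
    // info.presentationTimeUs is in microseconds, so multiply by 1000.
    codec.releaseOutputBuffer(index, 1000 * info.presentationTimeUs);
}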

Please refer to the MediaSync example below:

https://github.com/skysign/MediaSyncExample

skysign

If I replace the output TextureView with a SurfaceView, video playback seems to work fine.

Though this solves my problem, I do not understand why there is a difference between a Surface created from a SurfaceTexture and a Surface obtained from a SurfaceHolder. Per http://developer.android.com/reference/android/graphics/SurfaceTexture.html, "A Surface created from a SurfaceTexture can be used as an output destination for the android.hardware.camera2, MediaCodec, MediaPlayer". That is exactly what my example above does: it uses MediaSync/MediaCodec with a TextureView as the output surface.
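
A minimal sketch of the SurfaceView wiring that worked for me (the surfaceView field and callback registration are assumptions, not code from the question):

surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // The Surface from the SurfaceHolder is handed to the decoder task,
        // which passes it to MediaSync.setSurface().
        VideoDecoderTask decoderTask = new VideoDecoderTask(clipPath, holder.getSurface());
        new Thread(decoderTask).start();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) { }
});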

Harkish
  • Hi Harkish, are you able to make it work? I am getting "surface not created" on the MediaSync.setSurface line. – chikka.anddev Nov 04 '15 at 07:08
  • SurfaceView is synchronized with the display's vsync: it does not return/render the next video frame until the current vsync is done. – skysign Apr 21 '18 at 11:10
  • Unlike SurfaceView, TextureView renders a video frame as soon as it arrives, but it does not guarantee that every frame is displayed: if the next frame arrives before the current vsync is done, the previous frame is overwritten, which is observed as a dropped frame. – skysign Apr 21 '18 at 11:12