I'm doing live video processing with OpenGL, MediaCodec and MediaMuxer.

The output video appears to have dropped frames: playback runs at 1 or 2 FPS, even though the app renders at 15 FPS. I've debugged the encoder's output and no frames are being dropped there. What is going on?

I have added the core code below.

public class VideoSavingController
{
    // Static Variables
    private static final String MIME_TYPE = "video/avc";        
    private static final int FRAME_RATE = 15;                   
    private static final int IFRAME_INTERVAL = 1;               
    private static final int TIMEOUT_USEC = 10000;          

    private static final int BIT_RATE = 16 * 1000 * 1000;

    // Member Variables
    private boolean mIsRecordingStarted = false;
    private boolean mIsStartRequested   = false;
    private boolean mIsStopRequested    = false;

    private MediaCodec mEncoder;
    private MediaCodec.BufferInfo mBufferInfo;

    private MediaMuxer mMuxer;
    private int mTrackIndex;
    private boolean mMuxerStarted = false;

    private VideoSavingSurface mInputSurface;

    private String mOutputPath;

    private long mStartTime;

    // Constructor
    public VideoSavingController(){}

    // Controls
    public void requestStartRecording()
    {
        mIsStartRequested = true;
    }
    public void updateStartRecording()
    {
        if (mIsStartRequested)
        {
            startRecording();
            mIsStartRequested = false;
            mStartTime = SnapDat.camera().mCamera.timestamp();
        }
    }
    private void startRecording()
    {
        if (mIsRecordingStarted)
            return;
        mIsRecordingStarted = true;

        prepareEncoder();
    }
    public void recordFrameStep1()
    {
        if (!mIsRecordingStarted)
            return;

        mInputSurface.makeCurrent();

        drainEncoder(false);
    }
    /**
     * Draw the Image Between These Steps
     * Share texture between contexts by passing the GLSurfaceView's EGLContext as eglCreateContext()'s share_context argument
     * */
    public void recordFrameStep2()
    {
        if (!mIsRecordingStarted)
            return;

        // Set the presentation time stamp from the SurfaceTexture's time stamp.  This
        // will be used by MediaMuxer to set the PTS in the video.
        long time = SnapDat.camera().mCamera.timestamp() - mStartTime;
        mInputSurface.setPresentationTime( time );


        // Submit it to the encoder.  The eglSwapBuffers call will block if the input
        // is full, which would be bad if it stayed full until we dequeued an output
        // buffer (which we can't do, since we're stuck here).  So long as we fully drain
        // the encoder before supplying additional input, the system guarantees that we
        // can supply another frame without blocking.
        mInputSurface.swapBuffers();
    }
    public void requestStopRecording()
    {
        mIsStopRequested = true;
    }
    public void updateStopRecording()
    {
        if (mIsStopRequested)
        {
            mIsStopRequested = false;
            stopRecording();
        }
    }
    private void stopRecording()
    {
        if (!mIsRecordingStarted)
            return;
        mIsRecordingStarted = false;

        drainEncoder(true);
        releaseEncoder();

        // Notify Video File Added
        File videoFile = new File(mOutputPath);
        UtilityVideo.addVideo(videoFile, SnapDat.currentActivity());
    }
    public boolean isRecording()
    {
        return mIsRecordingStarted;
    }

    // Encoder
    private void prepareEncoder()
    {
        // Determine Size
        Size previewSize = xxxx
        int maxSize = Math.max(previewSize.width, previewSize.height);
        int width  = (640 * previewSize.width ) / maxSize;
        int height = (640 * previewSize.height) / maxSize;

        if ( !xxxx.isLandscape() )
        {
            int oldWidth = width;
            width = height;
            height = oldWidth;
        }

        // Force Factor of 16 Size
        width  = (width  / 16) * 16;
        height = (height / 16) * 16;

        mBufferInfo = new MediaCodec.BufferInfo();

        MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, BIT_RATE);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, FRAME_RATE);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, IFRAME_INTERVAL);

        mEncoder = MediaCodec.createEncoderByType(MIME_TYPE);
        mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mInputSurface = new VideoSavingSurface( mEncoder.createInputSurface() );
        mEncoder.start();

        // Output filename
        mOutputPath = VideoSaver.getVideoPath();

        // Create a MediaMuxer.  We can't add the video track and start() the muxer here,
        // because our MediaFormat doesn't have the Magic Goodies.  These can only be
        // obtained from the encoder after it has started processing data.
        //
        // We're not actually interested in multiplexing audio.  We just want to convert
        // the raw H.264 elementary stream we get from MediaCodec into a .mp4 file.
        try
        {
            mMuxer = new MediaMuxer(mOutputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        } 
        catch (IOException ioe)
        {
            throw new RuntimeException("MediaMuxer creation failed", ioe);
        }

        mTrackIndex = -1;
        mMuxerStarted = false;
    }
    private void releaseEncoder()
    {
        if (mEncoder != null)
        {
            mEncoder.stop();
            mEncoder.release();
            mEncoder = null;
        }
        if (mInputSurface != null) 
        {
            mInputSurface.release();
            mInputSurface = null;
        }
        if (mMuxer != null) 
        {
            mMuxer.stop();
            mMuxer.release();
            mMuxer = null;
        }
    }
    private void drainEncoder(boolean endOfStream)
    {
        if (endOfStream)
            mEncoder.signalEndOfInputStream();  

        ByteBuffer[] encoderOutputBuffers = mEncoder.getOutputBuffers();
        while (true) 
        {
            int encoderStatus = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) 
            {
                break;
            } 
            else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) 
            {
                // not expected for an encoder
                encoderOutputBuffers = mEncoder.getOutputBuffers();
            } 
            else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) 
            {
                // should happen before receiving buffers, and should only happen once
                if (mMuxerStarted) 
                    throw new RuntimeException("format changed twice");
                MediaFormat newFormat = mEncoder.getOutputFormat();

                // now that we have the Magic Goodies, start the muxer
                mTrackIndex = mMuxer.addTrack(newFormat);
                mMuxer.start();
                mMuxerStarted = true;
            } 
            else if (encoderStatus < 0) 
            {
                // Unexpected status, ignore it
            } 
            else 
            {
                ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
                if (encodedData == null) 
                    throw new RuntimeException("encoderOutputBuffer " + encoderStatus + " was null");

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0)
                    mBufferInfo.size = 0;

                if (mBufferInfo.size != 0)
                {
                    if (!mMuxerStarted) 
                        throw new RuntimeException("muxer hasn't started");

                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    encodedData.position(mBufferInfo.offset);
                    encodedData.limit(mBufferInfo.offset + mBufferInfo.size);

                    mMuxer.writeSampleData(mTrackIndex, encodedData, mBufferInfo);
                }

                mEncoder.releaseOutputBuffer(encoderStatus, false);

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0)
                    break;      // out of while
            }
        }
    }
}
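The dimension logic in prepareEncoder() can be exercised in isolation. Below is a minimal sketch (plain Java; the class and method names are hypothetical, standing in for the snippet above) of the scale-to-640-and-round-to-16 behaviour. Note that the integer division turns a 1280x720 preview into 640x352 rather than 640x360:

```java
// Hypothetical helper mirroring the sizing code in prepareEncoder():
// scale the longest preview edge to 640, then round both edges down
// to a multiple of 16 (a common alignment requirement for AVC encoders).
public class EncoderSize {
    public static int[] compute(int previewWidth, int previewHeight, boolean landscape) {
        int maxSize = Math.max(previewWidth, previewHeight);
        int width  = (640 * previewWidth ) / maxSize;
        int height = (640 * previewHeight) / maxSize;

        if (!landscape) {           // portrait: swap the edges
            int oldWidth = width;
            width = height;
            height = oldWidth;
        }

        // Force both dimensions to a multiple of 16
        width  = (width  / 16) * 16;
        height = (height / 16) * 16;
        return new int[] { width, height };
    }

    public static void main(String[] args) {
        int[] landscape = compute(1280, 720, true);
        System.out.println(landscape[0] + "x" + landscape[1]);   // 640x352
        int[] portrait = compute(1280, 720, false);
        System.out.println(portrait[0] + "x" + portrait[1]);     // 352x640
    }
}
```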

The code which drives this is given below:

    OpenGLState oldState = OpenGLState.createCurrent();

    mSaveVideo.updateStartRecording();
    if (mSaveVideo.isRecording())
    {
        mSaveVideo.recordFrameStep1();

        // Draws Image here

        mSaveVideo.recordFrameStep2();
    }
    mSaveVideo.updateStopRecording();

    oldState.makeCurrent();
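One way to narrow this down is to check the presentation timestamps rather than the frames themselves. SurfaceTexture timestamps are in nanoseconds, so at 15 FPS consecutive values should be roughly 66.6 ms apart. The sketch below (plain Java; the timestamp arrays are hypothetical stand-ins for camera values) shows the kind of sanity check you could log before calling setPresentationTime():

```java
// Sanity-check sketch: at 15 FPS, consecutive SurfaceTexture timestamps
// (nanoseconds) should be ~66,666,666 ns apart. Microsecond-scale spacing
// would indicate broken timestamps rather than dropped frames.
public class PtsCheck {
    static final long EXPECTED_DELTA_NS = 1_000_000_000L / 15;  // ~66.6 ms

    // Returns true if every consecutive delta is within toleranceNs
    // of the expected 15 FPS spacing.
    public static boolean looksLike15Fps(long[] timestampsNs, long toleranceNs) {
        for (int i = 1; i < timestampsNs.length; i++) {
            long delta = timestampsNs[i] - timestampsNs[i - 1];
            if (Math.abs(delta - EXPECTED_DELTA_NS) > toleranceNs)
                return false;
        }
        return true;
    }

    public static void main(String[] args) {
        long[] good = { 0L, 66_666_666L, 133_333_333L, 200_000_000L };
        long[] bad  = { 0L, 1_000L, 2_000L, 3_000L };  // microsecond spacing
        System.out.println(looksLike15Fps(good, 5_000_000L));  // true
        System.out.println(looksLike15Fps(bad,  5_000_000L));  // false
    }
}
```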
user2820531
  • Sorry, our collective crystal orbs are all broken, you'll have to post some source code of yours. Otherwise we can't even make guesses. – datenwolf May 09 '14 at 08:45
  • Are you setting the presentation time stamps through EGL? – fadden May 09 '14 at 14:38
  • From your comments, I can guess the following: the `timestamp`s coming from EGL may be arriving as 1 us, 2 us, 3 us etc. instead of 0, 66000, 132000, 199000 for 15 fps. You have confirmed that the output of the encoder is not dropping frames. `MediaMuxer` will not typically drop a frame and will instead just write the content into the file as long as the `ts` are increasing. The observation you have quoted is most probably from a `decoder`, which will drop frames based on `A/V Sync`. An easy way to test this hypothesis is to dump the `ts` for samples with a tool like `Elecard StreamEye` or any other similar tool. – Ganesh May 12 '14 at 01:20
  • Yes fadden. I added the code to my answer. – user2820531 May 13 '14 at 01:12
  • @user2820531.. Is your issue solved? If so, what was the issue? – Ganesh May 13 '14 at 03:36
  • No, the issue isn't solved. Like the playback has 1/10th the frames present. Please explain the test in more detail. I don't know how to use that program at all. – user2820531 May 13 '14 at 06:27
  • If I sent you a video clip Ganesh, would you be able to make sense of what the problem is? – user2820531 May 13 '14 at 08:08
  • Could it be this? http://stackoverflow.com/questions/20386515/glsurfaceview-framerate-issues-on-nexus-5 – fadden May 13 '14 at 16:55
  • @fadden, yes that is the sort of thing I am getting! I'm looking into the question right now. Great how he says he thinks he was crazy. – user2820531 May 13 '14 at 17:44
  • ffs, if this works, i'm going to be pissed.... well actually i'm already pissed, i'll be so happy that it is finally working! – user2820531 May 13 '14 at 17:47
  • @fadden, I love you so much, can I give you a hug. I don't think I would have found this on my own. Well now I can keep working on my app! Been stuck on this for weeks. I'm going to call this one the $5,000 bug :). – user2820531 May 13 '14 at 17:54
  • Excellent. I've added it as an answer. – fadden May 13 '14 at 18:14

1 Answer

This appears to be a bug in the driver when shared contexts are used.

This post has the details. In short, one of the contexts isn't noticing that the texture contents have changed, so it keeps rendering the previous data. You can work around the problem by binding to texture 0 and then back to the actual texture ID.
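Assuming the encoder's input surface is drawn from a GL_TEXTURE_EXTERNAL_OES texture shared between the two EGL contexts, the workaround can be sketched as below (mCameraTextureId is a hypothetical name for the shared texture; this is Android-only GLES code, so treat it as a sketch rather than a drop-in fix):

```java
// Workaround sketch for the shared-context driver bug: before drawing
// the frame into the encoder's input surface, unbind and rebind the
// external texture so this context notices the updated contents.
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mCameraTextureId);
// ... then issue the draw call as usual
```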

fadden