
I am trying to render video frames coming from the Android MediaCodec into a GL texture. The video plays and mostly seems to work, but the buffer contents come out garbled (see the screenshot below).

        while (!Thread.interrupted()) {
            if (!isEOS) {
                int inIndex = decoder.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    ByteBuffer buffer = inputBuffers[inIndex];
                    int sampleSize = extractor.readSampleData(buffer, 0);
                    if (sampleSize < 0) {
                        // We shouldn't stop the playback at this point, just pass the EOS
                        // flag to decoder, we will get it again from the
                        // dequeueOutputBuffer
                        Log.d("DecodeActivity", "InputBuffer BUFFER_FLAG_END_OF_STREAM");
                        decoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        isEOS = true;
                    } else {
                        decoder.queueInputBuffer(inIndex, 0, sampleSize, extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int outIndex = decoder.dequeueOutputBuffer(info, 10000);
            switch (outIndex) {
                case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                    Log.d("DecodeActivity", "INFO_OUTPUT_BUFFERS_CHANGED");
                    outputBuffers = decoder.getOutputBuffers();
                    break;
                case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                    Log.d("DecodeActivity", "New format " + decoder.getOutputFormat());
                    break;
                case MediaCodec.INFO_TRY_AGAIN_LATER:
                    Log.d("DecodeActivity", "dequeueOutputBuffer timed out!");
                    break;
                default:
                    ByteBuffer buffer = outputBuffers[outIndex];

                    Log.d(TAG, "Dimenstion output: " + videoHeight * videoWidth + " buffer size: " + info.size);

                    if (mImageWidth != videoWidth) {
                        mImageWidth = videoWidth;
                        mImageHeight = videoHeight;
                        adjustImageScaling();
                    }

                    buffer.position(info.offset);
                    buffer.limit(info.offset + info.size);

                    Log.d(TAG, "offset: " + info.offset + " size: " + info.size);

                    final byte[] ba = new byte[buffer.remaining()];
                    buffer.get(ba);

                    if (mGLRgbBuffer == null) {
                        mGLRgbBuffer = IntBuffer.allocate(videoHeight
                                * videoWidth);
                    }

                    if (mRunOnDraw.isEmpty()) {
                        runOnDraw(new Runnable() {
                            @Override
                            public void run() {
                                GPUImageNativeLibrary.YUVtoRBGA(ba, videoWidth,
                                        videoHeight, mGLRgbBuffer.array());
                                mGLTextureId = OpenGlUtils.loadTexture(mGLRgbBuffer,
                                        videoWidth, videoHeight, mGLTextureId);

                            }
                        });
                    }

                    Log.v("DecodeActivity", "We can't use this buffer but render it due to the API limit, " + buffer);
                    // We use a very simple clock to keep the video FPS, or the video
                    // playback will be too fast
                    while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
                        try {
                            sleep(10);
                        } catch (InterruptedException e) {
                            e.printStackTrace();
                            break;
                        }
                    }
                    decoder.releaseOutputBuffer(outIndex, true);
                    break;
            }
            // All decoded frames have been rendered, we can stop playing now
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                Log.d("DecodeActivity", "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
                break;
            }
        }

Here's a screenshot of the GLSurfaceView it gets rendered to: pic.twitter.com/pnNNiqqAsk

I've found this answer: Media Codec and Rendering using GLSurfaceview, Optimization for OnDrawFrame, but none of the solutions there seem to work.

  • I would guess your YUV to RGB conversion isn't right. The fact that you aren't passing the color format as an argument to your conversion function informs that guess. Any particular reason you can't decode to a SurfaceTexture? (Various examples in https://github.com/google/grafika .) If you're on a qualcomm device, see http://stackoverflow.com/questions/10059738/ – fadden Mar 03 '15 at 05:17
  • Thanks for your help, fadden! I will try that. I know that the color format of the video is MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar. However, I don't know exactly what the conversion function in the code is doing. Why I can't render to a SurfaceTexture: I am applying fragment shaders to the video. For that, I need a GLSurfaceTexture. Overall I am trying to make video work with GPUImage for Android and am almost there ... – Martin Schüller Mar 03 '15 at 20:31
  • A SurfaceTexture converts a graphic buffer, such as a frame from a decoded video, into a GLES texture. That appears to be exactly what you're trying to do here, but you're doing it in software, which is much slower and more difficult to do portably. The fragment shader is applied when the texture is rendered to a surface. Note that a SurfaceTexture is very different from a SurfaceView, which is used to display stuff. See e.g. https://www.youtube.com/watch?v=kH9kCP2T5Gg for an example of filtering live video from the camera with a fragment shader. – fadden Mar 03 '15 at 21:01
  • Sorry, I mixed up wording there. I tried using a SurfaceTexture before. To do so, I needed to use the GL_OES_EGL_image_external texture target. Using GL_OES_EGL_image_external caused a couple of other problems, such as glReadPixels not working anymore and other GPUImage-related problems. That's why I am trying to follow the convention used in the GPUImage onPreviewFrame function (https://github.com/CyberAgent/android-gpuimage/blob/master/library/src/jp/co/cyberagent/android/gpuimage/GPUImageRenderer.java#L132): converting it before feeding it to the texture. – Martin Schüller Mar 03 '15 at 22:48
  • The problem doesn't seem to be the conversion of the color. The ByteArray ba contains negative values, which already seems wrong ... Investigating more – Martin Schüller Mar 05 '15 at 02:38
  • Java `byte` values are signed [-128,127]. Convert to [0,255] with e.g. `int val = ba[i] & 0xff`. – fadden Mar 05 '15 at 05:44
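
Putting fadden's two points together (the conversion has to match the decoder's actual color format, and Java bytes are signed), a plain-Java conversion for COLOR_FormatYUV420SemiPlanar (NV12) could look roughly like the sketch below. This is only an illustration, not GPUImageNativeLibrary's actual YUVtoRBGA: the helper name is made up, and it assumes the decoder emits a tightly packed NV12 frame with no row/slice padding or vendor tiling, which is not guaranteed (especially on Qualcomm devices, per the linked question).

    // Illustrative only: convert one tightly packed NV12 (COLOR_FormatYUV420SemiPlanar)
    // frame to ARGB_8888 pixels. Real decoder output may have stride/slice padding.
    static void nv12ToArgb(byte[] yuv, int width, int height, int[] argb) {
        int frameSize = width * height;
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                // '& 0xff' turns the signed Java byte into an unsigned value in [0, 255].
                int luma = yuv[y * width + x] & 0xff;
                // Chroma is subsampled 2x2; NV12 stores interleaved U,V pairs after the Y plane.
                int uvIndex = frameSize + (y >> 1) * width + (x & ~1);
                int u = (yuv[uvIndex] & 0xff) - 128;
                int v = (yuv[uvIndex + 1] & 0xff) - 128;

                // Common integer approximation of the BT.601 YUV -> RGB transform.
                int y1192 = 1192 * Math.max(luma - 16, 0);
                int r = Math.min(255, Math.max(0, (y1192 + 1634 * v) >> 10));
                int g = Math.min(255, Math.max(0, (y1192 - 833 * v - 400 * u) >> 10));
                int b = Math.min(255, Math.max(0, (y1192 + 2066 * u) >> 10));

                argb[y * width + x] = 0xff000000 | (r << 16) | (g << 8) | b;
            }
        }
    }

If GPUImage's YUVtoRBGA is written for NV21 camera preview frames (V before U), feeding it NV12 decoder output would also swap the chroma channels, which would fit the kind of garbled output in the screenshot.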

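For comparison, the SurfaceTexture route fadden describes skips the CPU-side copy and conversion entirely: the decoder renders straight into a GL_TEXTURE_EXTERNAL_OES texture that a fragment shader can sample. A minimal sketch, assuming an EGL/GL context is current on the thread that creates the texture and calls updateTexImage(), and reusing the decoder/format/outIndex names from the code above:

    // Create an external texture and wrap it in a SurfaceTexture + Surface.
    int[] tex = new int[1];
    GLES20.glGenTextures(1, tex, 0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
            GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
    Surface decoderSurface = new Surface(surfaceTexture);

    // Hand the Surface to the decoder instead of reading ByteBuffers back.
    decoder.configure(format, decoderSurface, null, 0);
    decoder.start();

    // In the decode loop, "render = true" pushes the frame to the SurfaceTexture:
    decoder.releaseOutputBuffer(outIndex, true);

    // On the GL thread (e.g. in onDrawFrame), latch the newest frame and draw it
    // with a fragment shader that samples a "samplerExternalOES" uniform:
    surfaceTexture.updateTexImage();

glReadPixels and the existing GPUImage filters can still be combined with this approach by first rendering the external texture into an ordinary GL_TEXTURE_2D via an FBO, which is roughly what the grafika samples do.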