
This is a follow-up to this question.

This is my TextureView code:

public class VideoTextureView extends TextureView implements SurfaceTextureListener {

    private static final String LOG_TAG = VideoTextureView.class.getSimpleName();
    private MediaCodecDecoder mMediaDecoder;
    private MediaCodecAsyncDecoder mMediaAsyncDecoder;

    public VideoTextureView(Context context, AttributeSet attrs) {
        super(context, attrs);
        setSurfaceTextureListener(this);
        Log.d(LOG_TAG, "Video texture created.");
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        Log.d(LOG_TAG, "Surface Available: " + width + " x " + height);
        mMediaDecoder = new MediaCodecDecoder();
        mMediaDecoder.Start();
        mMediaDecoder.SetSurface(new Surface(getSurfaceTexture()));
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
        // TODO Auto-generated method stub

    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        mMediaDecoder.Stop();
        // Returning false means this class is responsible for releasing the
        // SurfaceTexture itself; returning true lets the system release it.
        return false;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) {
        // TODO Auto-generated method stub

    }

}

My question: is my TextureView implementation okay for rendering H.264 streams decoded by MediaCodec, or do I need to do EGL setup or anything else?
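For context, the decoder-side wiring that consumes this Surface typically looks like the sketch below. This is only an illustration of what a `MediaCodecDecoder.Start()`/`SetSurface()` pair might do internally; the MIME type and resolution are placeholders, and the input-feeding logic is omitted.

```java
// Sketch: handing the Surface from onSurfaceTextureAvailable to MediaCodec.
// "video/avc" and 1280x720 are placeholder values.
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
// Passing the Surface here makes MediaCodec render decoded frames directly
// into the TextureView's SurfaceTexture -- no EGL setup in app code.
codec.configure(format, surface, null, 0);
codec.start();
// ... queue input buffers with H.264 access units, then render each
// decoded frame with:
// codec.releaseOutputBuffer(outputIndex, true /* render to surface */);
```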

Thanks in advance!

    EGL setup is only required if you're rendering with GLES. TextureView combines a SurfaceTexture with a custom View, and does the GLES rendering for you. Which is why the View *must* be hardware-accelerated for TextureView to work. – fadden Sep 29 '15 at 17:54
  • @fadden Thank you! So, I don't need to set up an EGL context :) In the morning, I tried with `SurfaceView` and set the `Surface` using `SurfaceHolder.getSurface()` and found the same result. So I think the `TextureView`-related code is not the problem, maybe. "Which is why the View must be hardware-accelerated for TextureView to work" - the Android developer website states this too. Does this mean I must set hardware acceleration to `true` at the view level in the manifest file? – Kaidul Sep 29 '15 at 18:00
  • 1
    I think it's enabled by default on recent versions of Android, but you may need to enable it explicitly for older devices. See http://developer.android.com/guide/topics/graphics/hardware-accel.html – fadden Sep 29 '15 at 20:10
  • @fadden Okay! So I think I am using H/W acceleration. Do you have any suggestions for the other thread http://stackoverflow.com/questions/32723393 ? I will appreciate your thoughts on this as I am really struggling with that issue. My encoded stream (ffmpeg) might be incompatible with MediaCodec decoder and streams encoded by MediaCodec might work properly with my decoder. Is there any probable bug in my implementation? – Kaidul Sep 30 '15 at 03:24
  • 1
    @KrzysztofKansy found that streams encoded from the Camera + MediaCodec played back without problems, so there's something different about his streams. You may be facing the same situation. I don't know enough about the details of H.264 to speculate on possibilities. – fadden Sep 30 '15 at 05:15
  • @fadden Thank you! So I can assume my implementation seems apparently okay? – Kaidul Oct 06 '15 at 04:42
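Following up on the hardware-acceleration question in the comments, it is enabled by default when `targetSdkVersion` is 14 or higher, but it can also be requested explicitly in the manifest. A hedged sketch (the activity name below is a placeholder):

```xml
<!-- Hardware acceleration can be set app-wide or per-activity.
     ".VideoActivity" is a placeholder name. -->
<application android:hardwareAccelerated="true">
    <activity android:name=".VideoActivity"
              android:hardwareAccelerated="true" />
</application>
```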

2 Answers


My TextureView implementation is okay, as I tried with SurfaceView too and found the same result. And as @fadden said:

EGL setup is only required if you're rendering with GLES. TextureView combines a SurfaceTexture with a custom View, and does the GLES rendering for you. Which is why the View must be hardware-accelerated for TextureView to work.

Thanks to @fadden.


I am currently using a TextureView for rendering multiple streams in one activity, using collection view cells on Android (sorry for the iOS terminology there).

It works fine, but the issue is that when you, for example, rotate the device, there will be a surface_destroyed followed by a surface_available. As I see, you are correctly stopping and starting your decoder.

One thing I do in my decoder is:

List<NaluSegment> segments = NaluParser.parseNaluSegments(buffer);
for (NaluSegment segment : segments) {
    // Ignore unspecified NAL units.
    if (segment.getType() != NaluType.UNSPECIFIED) {

        // Hold the parameter sets for stop/start initialization speed.
        if (segment.getType() == NaluType.PPS) {
            lastParameterSet[0] = segment;
        } else if (segment.getType() == NaluType.SPS) {
            lastParameterSet[1] = segment;
        } else if (segment.getType() == NaluType.CODED_SLICE_IDR) {
            lastParameterSet[2] = segment;
        }

        // Add to the input queue.
        naluSegmentQueue.add(segment);
    }
}

I hold onto the last parameter sets and the last keyframe, and on start I fill the naluSegmentQueue with these first to reduce the delay before video rendering resumes.
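A minimal, self-contained sketch of that prefill idea. `NaluSegment` and `NaluType` here are simplified stand-ins for my own classes used above, not Android framework types:

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Sketch of the "replay the last parameter sets on restart" idea:
// cache the most recent SPS/PPS/IDR while decoding, and seed the
// input queue with them after the surface is re-created.
class ParameterSetCache {
    enum NaluType { SPS, PPS, CODED_SLICE_IDR, OTHER }

    static class NaluSegment {
        final NaluType type;
        final byte[] payload;
        NaluSegment(NaluType type, byte[] payload) {
            this.type = type;
            this.payload = payload;
        }
    }

    private NaluSegment lastSps, lastPps, lastIdr;
    private final Queue<NaluSegment> naluSegmentQueue = new ArrayDeque<>();

    // Called for every parsed segment while the decoder runs.
    void onSegment(NaluSegment segment) {
        switch (segment.type) {
            case SPS: lastSps = segment; break;
            case PPS: lastPps = segment; break;
            case CODED_SLICE_IDR: lastIdr = segment; break;
            default: break;
        }
        naluSegmentQueue.add(segment);
    }

    // Called after surface re-creation: seed the queue so the decoder can
    // reconfigure and show a frame without waiting for the next keyframe.
    void primeQueueForRestart() {
        naluSegmentQueue.clear();
        if (lastSps != null) naluSegmentQueue.add(lastSps);
        if (lastPps != null) naluSegmentQueue.add(lastPps);
        if (lastIdr != null) naluSegmentQueue.add(lastIdr);
    }

    Queue<NaluSegment> queue() { return naluSegmentQueue; }
}
```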
