
I am investigating how to make Android's ExoPlayer play a single content stream to multiple surfaces. For example, if the content stream is online, it would be downloaded only once, yet still be played on both surfaces.

I have investigated this topic, and can share what I have learned so far. Typically, it is not necessary to actually use multiple surfaces, since an OpenGL shader can be used to make a "split screen effect" where a single surface appears to play multiple videos. I actually am using OpenGL shaders for both of the surfaces already (they are in GLSurfaceViews), but using a single surface does not seem to be an option, since one of the surfaces is forced to a low resolution using .setFixedSize() and the other is not.

In a more low-level approach, I investigated whether it is possible to set multiple surfaces for the MediaCodec wrapped in DemoPlayer's MediaCodecVideoTrackRenderer class. However, it seems that a MediaCodec has been designed to only be configured with one surface, so this approach does not seem effective.

Another approach I have considered is to somehow share the output buffer of one player's MediaCodec with the other players, though I am not sure whether this is practical or feasible.

Any ideas or guidance on how to play a single stream on multiple surfaces would be greatly appreciated. Thank you.

M.S.
    Can you send it to a SurfaceTexture and render the texture multiple times? Sounds like you're already sort of doing that, so I'm not sure what's missing. – fadden Mar 29 '16 at 18:25
  • Never thought of that approach, but it seems effective. Thanks Fadden! – M.S. Mar 30 '16 at 02:39
  • it will be cool if you can make a sample project with this ;) – Hugo Gresse Mar 31 '16 at 08:44
  • Could you explain how to do fadden's suggestion with GLSurfaceView? I'm trying to do this myself. but I can't quite figure it out. – Cameron Aug 24 '16 at 20:17
  • @Cameron The key idea is that instead of having two video streams, you can have one. Then, in your rendering loop, you switch between output surfaces for your SurfaceTexture. I used Google's Grafika library (https://github.com/google/grafika/) as a starting point. There are a few projects there that demonstrate how to render one SurfaceTexture to multiple surfaces. Feel free to lift the entire grafika/gles folder (it contains classes simplifying low-level OpenGL calls if you are not familiar with them). – M.S. Aug 26 '16 at 02:30
  • Thanks @Michael, that's what I ended up doing :) – Cameron Aug 26 '16 at 19:35

2 Answers


Answers here have helped me in my research on this topic, so here's a complete example with a single PlayerView and a SurfaceView to which the video is also rendered. It is based on the ContinuousCaptureActivity example from Grafika.

Here the two WindowSurfaces (mainDisplaySurface and secondaryDisplaySurface) share the same EGL context (eglCore). When a frame is available, it is drawn to both surfaces by making each one current in turn.

The example uses classes from Grafika's gles folder.

activity_main.xml

<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <com.google.android.exoplayer2.ui.PlayerView
        android:id="@+id/playerView"
        android:layout_width="match_parent"
        android:layout_height="0dp"
        app:layout_constraintDimensionRatio="16:9"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintTop_toTopOf="parent" />

    <SurfaceView
        android:id="@+id/surfaceView"
        android:layout_width="160dp"
        android:layout_height="90dp"
        app:layout_constraintTop_toBottomOf="@id/playerView"
        app:layout_constraintBottom_toBottomOf="parent"
        app:layout_constraintStart_toStartOf="parent"
        app:layout_constraintEnd_toEndOf="parent" />

</androidx.constraintlayout.widget.ConstraintLayout>

MainActivity.kt

class MainActivity : AppCompatActivity(), SurfaceTexture.OnFrameAvailableListener {

    private lateinit var playerView: PlayerView
    private lateinit var surfaceView: SurfaceView

    private lateinit var player: SimpleExoPlayer

    private var eglCore: EglCore? = null
    private var fullFrameBlit: FullFrameRect? = null
    private var textureId: Int = 0
    private var videoSurfaceTexture: SurfaceTexture? = null
    private val transformMatrix = FloatArray(16)

    private var mainDisplaySurface: WindowSurface? = null
    private var secondaryDisplaySurface: WindowSurface? = null

    private var surface: Surface? = null

    private val surfaceViewHolderCallback = object : SurfaceHolder.Callback {
        override fun surfaceCreated(holder: SurfaceHolder) {
            secondaryDisplaySurface = WindowSurface(eglCore, holder.surface, false)
        }

        override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {}
        override fun surfaceDestroyed(holder: SurfaceHolder) {}
    }

    private val playerViewHolderCallback = object : SurfaceHolder.Callback {
        override fun surfaceCreated(holder: SurfaceHolder) {
            eglCore = EglCore()

            mainDisplaySurface = WindowSurface(eglCore, holder.surface, false).apply {
                makeCurrent()
            }
            fullFrameBlit = FullFrameRect(Texture2dProgram(Texture2dProgram.ProgramType.TEXTURE_EXT))
            textureId = fullFrameBlit!!.createTextureObject()
            videoSurfaceTexture = SurfaceTexture(textureId).also {
                it.setOnFrameAvailableListener(this@MainActivity)
            }

            surface = Surface(videoSurfaceTexture)

            player.setVideoSurface(surface)
        }

        override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {}
        override fun surfaceDestroyed(holder: SurfaceHolder) {}
    }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        playerView = findViewById(R.id.playerView)
        surfaceView = findViewById(R.id.surfaceView)

        player = SimpleExoPlayer.Builder(this).build().apply {
            setMediaItem(MediaItem.fromUri("file:///android_asset/video.mp4"))
            playWhenReady = true
        }

        playerView.player = player

        (playerView.videoSurfaceView as SurfaceView).holder.addCallback(playerViewHolderCallback)
        surfaceView.holder.addCallback(surfaceViewHolderCallback)

        player.prepare()
    }

    override fun onFrameAvailable(surfaceTexture: SurfaceTexture?) {
        if (eglCore == null) return

        // Latch the new frame exactly once per callback (updateTexImage()
        // needs a current EGL context), then blit the same texture to both
        // window surfaces so they always show the same frame.
        mainDisplaySurface?.makeCurrent()
        videoSurfaceTexture?.apply {
            updateTexImage()
            getTransformMatrix(transformMatrix)
        }

        // PlayerView
        mainDisplaySurface?.let {
            drawFrame(it, playerView.width, playerView.height)
        }

        // SurfaceView
        secondaryDisplaySurface?.let {
            drawFrame(it, surfaceView.width, surfaceView.height)
        }
    }

    private fun drawFrame(windowSurface: WindowSurface, viewWidth: Int, viewHeight: Int) {
        windowSurface.makeCurrent()

        GLES20.glViewport(0, 0, viewWidth, viewHeight)

        fullFrameBlit!!.drawFrame(textureId, transformMatrix)

        windowSurface.swapBuffers()
    }

    override fun onPause() {
        surface?.release()
        surface = null

        videoSurfaceTexture?.release()
        videoSurfaceTexture = null

        mainDisplaySurface?.release()
        mainDisplaySurface = null

        secondaryDisplaySurface?.release()
        secondaryDisplaySurface = null

        fullFrameBlit?.release(false)
        fullFrameBlit = null

        eglCore?.release()
        eglCore = null

        super.onPause()
    }
}
miredirex

This is something I have also wanted to do, and I have located several problem areas.

1. A Surface passed to a MediaCodecVideoTrackRenderer is simply passed on to a MediaCodec for "configuration".
The stock ExoPlayer code for what happens when a Surface is passed to the video renderer is the following:

protected void configureCodec(MediaCodec codec, boolean codecIsAdaptive, android.media.MediaFormat format, MediaCrypto crypto) 
{
    maybeSetMaxInputSize(format, codecIsAdaptive);
    codec.configure(format, surface, crypto, 0); //<-- Surface use
    codec.setVideoScalingMode(videoScalingMode);
}

The surface is used for a couple of other checks (for example, whether the codec should be initialised), but nothing more in the actual TrackRenderer, so not much can be done there.

2. How the MediaCodec handles an Output Surface.
Reading through the MediaCodec documentation, I have found several issues with "sharing" the output buffer:

Using an Output Surface
The data processing is nearly identical to the ByteBuffer mode when using an output Surface; however, the output buffers will not be accessible, and are represented as null values. E.g. getOutputBuffer/Image(int) will return null and getOutputBuffers() will return an array containing only null-s.

This means that if a Surface is used as output, no other code can access the output buffers.


Possible Solution
The 'simplest' solution could be to have a single "decoding" video renderer and a single ExoPlayer controlling playback on the different Surfaces.
(Don't worry, it's not as crazy as it sounds.)

Step 1: Create a TrackRenderer that simply outputs a (video) buffer.
Step 2: Have that TrackRenderer do all the work of retrieving and decoding the video input, writing the result to ByteBuffers that can then be retrieved from the TrackRenderer (either continuously or after the input has been fully retrieved and decoded).
Step 3: Make as many copies of the ByteBuffers as necessary (2 for two Surfaces, 3 for three Surfaces, etc.).
Step 4: Feed each individual, separate ByteBuffer into the desired Surface (as Surfaces use native video buffers).

I am still unsure how ExoPlayer could be used to control playback on all the different Surfaces, but one idea would be to "control" the bytes passed to the Surfaces from the decoded ByteBuffers.
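The copying in Step 3 can be sketched in plain Kotlin with java.nio.ByteBuffer. This is only an illustration of the fan-out idea, not ExoPlayer API; the helper name fanOut is hypothetical:

```kotlin
import java.nio.ByteBuffer

// Hypothetical helper: copy one decoded frame into an independent
// ByteBuffer per target Surface, leaving the decoder's buffer untouched.
fun fanOut(decoded: ByteBuffer, surfaceCount: Int): List<ByteBuffer> {
    val src = decoded.asReadOnlyBuffer() // snapshot of position/limit
    return List(surfaceCount) {
        val copy = ByteBuffer.allocate(src.remaining())
        copy.put(src.duplicate()) // duplicate() so each copy reads from the start
        copy.flip()               // make the copy readable from position 0
        copy
    }
}
```

Each consumer then owns its copy and can advance its position freely without disturbing the others, which is the property the multi-Surface playback idea relies on.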

Bam
    I solved this using fadden's suggested method in the comments above. One ExoPlayer rendering to a SurfaceTexture is used, after which we can render to multiple surfaces sharing the same EGL context. – M.S. May 14 '16 at 14:16
  • Can you give me more detail about how you solved it, @Michael. Give me an example code is better. LOL – Slim_user71169 Nov 01 '19 at 08:21