
I wish to display two video views side by side. I do not want to implement a custom MediaController, because the default one is very good, but no matter what I tried I could not control the two videos simultaneously.

    val mediaController = MediaController(requireContext())
    mediaController.setAnchorView(videoViewF)
    videoViewF.setMediaController(mediaController)
    videoViewR.setMediaController(mediaController)

How can I achieve this? Can I get a callback from the MediaController, or from the first VideoView, when progress changes or the video is paused/played? Or is there some other way?
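One way to avoid a fully custom controller: MediaController drives whatever MediaController.MediaPlayerControl you hand it via setMediaPlayer(), and VideoView already implements that interface, so you can wrap both VideoViews in a composite control that forwards every command to each. The sketch below models the forwarding pattern with a trimmed-down stand-in interface (the real MediaPlayerControl also declares getDuration(), getCurrentPosition(), getBufferPercentage(), canPause(), and so on); PlayerControl, DualPlayerControl, and FakePlayer are illustrative names, not Android classes:

```java
// Simplified stand-in for MediaController.MediaPlayerControl.
interface PlayerControl {
    void start();
    void pause();
    void seekTo(int pos);
    boolean isPlaying();
}

// Forwards every command to both underlying players, so a single
// controller UI drives the two video views in lock-step.
class DualPlayerControl implements PlayerControl {
    private final PlayerControl front, rear;

    DualPlayerControl(PlayerControl front, PlayerControl rear) {
        this.front = front;
        this.rear = rear;
    }

    @Override public void start() { front.start(); rear.start(); }
    @Override public void pause() { front.pause(); rear.pause(); }
    @Override public void seekTo(int pos) { front.seekTo(pos); rear.seekTo(pos); }
    // Report state from one player; both are assumed to stay in sync.
    @Override public boolean isPlaying() { return front.isPlaying(); }
}

// Minimal fake player used to verify the forwarding behaviour.
class FakePlayer implements PlayerControl {
    boolean playing = false;
    int position = 0;
    @Override public void start() { playing = true; }
    @Override public void pause() { playing = false; }
    @Override public void seekTo(int pos) { position = pos; }
    @Override public boolean isPlaying() { return playing; }
}
```

With real views you would set the composite once (`mediaController.setMediaPlayer(dualControl)`) instead of calling `setMediaController` on each VideoView.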

Paulo Boaventura
Dim
  • Perhaps one option is to subclass the existing MediaController so that it internally has 2 MediaController objects (each associated with a different video view), then route each command for the master controller to both of the internal MediaControllers. It would only work if MediaController has no static dependencies, so check that first. – StarShine Dec 27 '20 at 13:16
  • This is a good idea, but it is more complicated than building a custom media controller, I presume. – Dim Dec 28 '20 at 15:15
  • The master controller object would anyway pretty quickly run into a scenario where it depends on the state of the child controllers, hence you'd effectively be writing a custom controller implementation. The only 'win' would be the default behavior still working 'as expected'. – StarShine Dec 28 '20 at 15:36
  • Hi @Dim do any of the answers fit your project? help you? I await feedback – Paulo Boaventura Jan 01 '21 at 00:58

2 Answers


Dual VideoView Playing 3gp Videos In Android

This example will explain how to include two VideoViews in a layout to play two different 3gp files at the same time.

Algorithm:

1.) Create a new project via File -> New -> Android Project and name it DualVideoViewExample.

2.) Write the following into main.xml:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:orientation="vertical" >

    <TextView
        android:layout_width="fill_parent"
        android:layout_height="wrap_content"
        android:text="Dual VideoView" />

    <LinearLayout
        android:orientation="vertical"
        android:layout_width="fill_parent"
        android:layout_height="match_parent">

        <VideoView
            android:id="@+id/myvideoview"
            android:layout_width="fill_parent"
            android:layout_height="wrap_content" />

        <VideoView
            android:id="@+id/myvideoview2"
            android:layout_width="fill_parent"
            android:layout_height="wrap_content" />
    </LinearLayout>
</LinearLayout>

3.) Put two 3gp video files in the res/raw folder.

4.) Run the project to see the output.

Steps:

1.) Create a project named DualVideoViewExample with the following settings:

Build Target: Android 4.4
Application Name: DualVideoViewExample
Package Name: com.example.DualVideoViewExample
Activity Name: DualVideoViewExampleActivity

2.) Open the DualVideoViewExampleActivity.java file and write the following code there:

package com.example.dualvideoviewexample;

import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class DualVideoViewExampleActivity extends Activity {

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        // First VideoView: plays a clip from res/raw
        VideoView myVideoView = (VideoView) findViewById(R.id.myvideoview);
        myVideoView.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.junglebook));
        myVideoView.setMediaController(new MediaController(this));
        myVideoView.requestFocus();
        myVideoView.start();

        // Second VideoView: plays a different clip from res/raw
        VideoView myVideoView2 = (VideoView) findViewById(R.id.myvideoview2);
        myVideoView2.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.ringaroses));
        myVideoView2.setMediaController(new MediaController(this));
        myVideoView2.requestFocus();
        myVideoView2.start();
    }
}
 

3.) Compile and build the project.

NOTE: You can also stream and play your 3gp file directly from the internet, instead of reading it from the res/raw folder as shown in this example.
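The android.resource:// URI used in the activity is just a string built from the package name and the raw resource id; for streaming, you would pass an http(s) URL string to Uri.parse() and setVideoURI() instead. A small sketch of the local form (the package name, resource id value, and URL below are placeholders, not values from this project):

```java
public class VideoUriDemo {
    // Builds the "android.resource://<package>/<resource id>" form used above;
    // R.raw.* resource ids are plain ints, so they end up as a decimal segment.
    static String rawResourceUri(String packageName, int resId) {
        return "android.resource://" + packageName + "/" + resId;
    }

    public static void main(String[] args) {
        // Local raw resource (the id value here is an arbitrary placeholder
        // standing in for an R.raw.* constant).
        System.out.println(rawResourceUri("com.example.dualvideoviewexample", 2130968576));
        // For streaming, skip the resource URI entirely and pass the URL
        // (placeholder address) straight to Uri.parse()/setVideoURI().
        System.out.println("https://example.com/sample.3gp");
    }
}
```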

Output

(screenshot: dualvideoview2)

Paulo Boaventura

You are not giving an awful lot of specifics on what exactly you have tried and what the problematic areas are, so I just made a small test to see if I could reproduce any of what you're describing.

I do not have any conclusive findings, but can at least confirm that my Galaxy Nexus (Android 4.0.2) is able to play three videos simultaneously without any problems. On the other hand, an old Samsung Galaxy Spica (Android 2.1-update1) I had lying around only plays a single file at a time - it appears to always be the first SurfaceView.

I further investigated different API levels by setting up emulators for Android 3.0, 2.3.3, and 2.2. All these platforms appear to be able to handle playback of multiple video files onto different surface views just fine. I did one final test with an emulator running 2.1-update1 too, which interestingly also played the test case without problems, unlike the actual phone. I did notice some slight differences in how the layout was rendered though.

This behaviour leads me to suspect that there isn't really any software limitation on what you're after; rather, it seems to depend on the hardware whether simultaneous playback of multiple video files is supported. Hence support for this scenario will differ per device. From an empirical point of view, I definitely think it would be interesting to test this hypothesis on some more physical devices.

Just for reference some details with regards to the implementation:

I set up two slightly different implementations: one based on three MediaPlayer instances in a single Activity, and one in which these were factored out into three separate fragments, each with its own MediaPlayer object. (By the way, I did not find any playback differences between these two implementations.) A single 3gp file (thanks for that, Apple), located in the assets folder, was used for playback with all players. The code for both implementations is attached below and is largely based on Google's MediaPlayerDemo_Video sample implementation; I did strip away some code not required for the actual testing. The result is by no means complete or suitable for use in live apps.
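The key detail in both variants is the readiness gate: a player is only started once both its onPrepared and onVideoSizeChanged callbacks have fired, in either order, tracked with two boolean flags per player. Stripped of the Android types, the gate reduces to something like this (a sketch; ReadyGate and the started guard are additions of this illustration, not names from the listings):

```java
// Models the readiness gate: start must wait until both the prepared
// and size-known signals have arrived, in either order. The started
// flag is an extra guard against double-starting.
class ReadyGate {
    private boolean prepared;
    private boolean sizeKnown;
    private boolean started;

    void onPrepared()         { prepared = true;  maybeStart(); }
    void onVideoSizeChanged() { sizeKnown = true; maybeStart(); }

    private void maybeStart() {
        if (prepared && sizeKnown && !started) {
            started = true; // here the real code would call MediaPlayer.start()
        }
    }

    boolean isStarted() { return started; }
}
```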

Activity-based implementation:

public class MultipleVideoPlayActivity extends Activity implements
OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener, OnVideoSizeChangedListener, SurfaceHolder.Callback {

private static final String TAG = "MediaPlayer";
private static final int[] SURFACE_RES_IDS = { R.id.video_1_surfaceview, R.id.video_2_surfaceview, R.id.video_3_surfaceview };

private MediaPlayer[] mMediaPlayers = new MediaPlayer[SURFACE_RES_IDS.length];
private SurfaceView[] mSurfaceViews = new SurfaceView[SURFACE_RES_IDS.length];
private SurfaceHolder[] mSurfaceHolders = new SurfaceHolder[SURFACE_RES_IDS.length];
private boolean[] mSizeKnown = new boolean[SURFACE_RES_IDS.length];
private boolean[] mVideoReady = new boolean[SURFACE_RES_IDS.length];

@Override public void onCreate(Bundle icicle) {
    super.onCreate(icicle);
    setContentView(R.layout.multi_videos_layout);

    // create surface holders
    for (int i=0; i<mSurfaceViews.length; i++) {
        mSurfaceViews[i] = (SurfaceView) findViewById(SURFACE_RES_IDS[i]);
        mSurfaceHolders[i] = mSurfaceViews[i].getHolder();
        mSurfaceHolders[i].addCallback(this);
        mSurfaceHolders[i].setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }
}

public void onBufferingUpdate(MediaPlayer player, int percent) {
    Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onBufferingUpdate percent: " + percent);
}

public void onCompletion(MediaPlayer player) {
    Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onCompletion called");
}

public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
    Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): onVideoSizeChanged called");
    if (width == 0 || height == 0) {
        Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
        return;
    }

    int index = indexOf(player);
    if (index == -1) return; // sanity check; should never happen
    mSizeKnown[index] = true;
    if (mVideoReady[index] && mSizeKnown[index]) {
        startVideoPlayback(player);
    }
}

public void onPrepared(MediaPlayer player) {
    Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onPrepared called");

    int index = indexOf(player);
    if (index == -1) return; // sanity check; should never happen
    mVideoReady[index] = true;
    if (mVideoReady[index] && mSizeKnown[index]) {
        startVideoPlayback(player);
    }
}

public void surfaceChanged(SurfaceHolder holder, int i, int j, int k) {
    Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceChanged called");
}

public void surfaceDestroyed(SurfaceHolder holder) {
    Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceDestroyed called");
}


public void surfaceCreated(SurfaceHolder holder) {
    Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceCreated called");

    int index = indexOf(holder);
    if (index == -1) return; // sanity check; should never happen
    try { 
        mMediaPlayers[index] = new MediaPlayer();
        AssetFileDescriptor afd = getAssets().openFd("sample.3gp");
        mMediaPlayers[index].setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength()); 
        mMediaPlayers[index].setDisplay(mSurfaceHolders[index]);
        mMediaPlayers[index].prepare();
        mMediaPlayers[index].setOnBufferingUpdateListener(this);
        mMediaPlayers[index].setOnCompletionListener(this);
        mMediaPlayers[index].setOnPreparedListener(this);
        mMediaPlayers[index].setOnVideoSizeChangedListener(this);
        mMediaPlayers[index].setAudioStreamType(AudioManager.STREAM_MUSIC);
    }
    catch (Exception e) { e.printStackTrace(); }
}

@Override protected void onPause() {
    super.onPause();
    releaseMediaPlayers();
}

@Override protected void onDestroy() {
    super.onDestroy();
    releaseMediaPlayers();
}

private void releaseMediaPlayers() {
    for (int i=0; i<mMediaPlayers.length; i++) {
        if (mMediaPlayers[i] != null) {
            mMediaPlayers[i].release();
            mMediaPlayers[i] = null;
        }
    }
}


private void startVideoPlayback(MediaPlayer player) {
    Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): startVideoPlayback");
    player.start();
}

private int indexOf(MediaPlayer player) {
    for (int i=0; i<mMediaPlayers.length; i++) if (mMediaPlayers[i] == player) return i;
    return -1;  
}

private int indexOf(SurfaceHolder holder) {
    for (int i=0; i<mSurfaceHolders.length; i++) if (mSurfaceHolders[i] == holder) return i;
    return -1;  
}
}

R.layout.multi_videos_layout:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent" android:layout_height="match_parent"
android:orientation="vertical">

<SurfaceView android:id="@+id/video_1_surfaceview"
    android:layout_width="fill_parent" android:layout_height="0dp"
    android:layout_weight="1" />

<SurfaceView android:id="@+id/video_2_surfaceview"
    android:layout_width="fill_parent" android:layout_height="0dp"
    android:layout_weight="1" />

<SurfaceView android:id="@+id/video_3_surfaceview"
    android:layout_width="fill_parent" android:layout_height="0dp"
    android:layout_weight="1" />

</LinearLayout>

Fragment-based implementation:

public class MultipleVideoPlayFragmentActivity extends FragmentActivity {

private static final String TAG = "MediaPlayer";

@Override public void onCreate(Bundle icicle) {
    super.onCreate(icicle);
    setContentView(R.layout.multi_videos_activity_layout);
}

public static class VideoFragment extends Fragment implements
    OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener, OnVideoSizeChangedListener, SurfaceHolder.Callback {

    private MediaPlayer mMediaPlayer;
    private SurfaceView mSurfaceView;
    private SurfaceHolder mSurfaceHolder;
    private boolean mSizeKnown;
    private boolean mVideoReady;

    @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
        return inflater.inflate(R.layout.multi_videos_fragment_layout, container, false);
    }

    @Override public void onActivityCreated(Bundle savedInstanceState) {
        super.onActivityCreated(savedInstanceState);
        mSurfaceView = (SurfaceView) getView().findViewById(R.id.video_surfaceview);
        mSurfaceHolder = mSurfaceView.getHolder();
        mSurfaceHolder.addCallback(this);
        mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    public void onBufferingUpdate(MediaPlayer player, int percent) {
        Log.d(TAG, "onBufferingUpdate percent: " + percent);
    }

    public void onCompletion(MediaPlayer player) {
        Log.d(TAG, "onCompletion called");
    }

    public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
        Log.v(TAG, "onVideoSizeChanged called");
        if (width == 0 || height == 0) {
            Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
            return;
        }

        mSizeKnown = true;
        if (mVideoReady && mSizeKnown) {
            startVideoPlayback();
        }
    }

    public void onPrepared(MediaPlayer player) {
        Log.d(TAG, "onPrepared called");

        mVideoReady = true;
        if (mVideoReady && mSizeKnown) {
            startVideoPlayback();
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int i, int j, int k) {
        Log.d(TAG, "surfaceChanged called");
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d(TAG, "surfaceDestroyed called");
    }

    public void surfaceCreated(SurfaceHolder holder) {
        Log.d(TAG, "surfaceCreated called");

        try { 
            mMediaPlayer = new MediaPlayer();
            AssetFileDescriptor afd = getActivity().getAssets().openFd("sample.3gp");
            mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength()); 
            mMediaPlayer.setDisplay(mSurfaceHolder);
            mMediaPlayer.prepare();
            mMediaPlayer.setOnBufferingUpdateListener(this);
            mMediaPlayer.setOnCompletionListener(this);
            mMediaPlayer.setOnPreparedListener(this);
            mMediaPlayer.setOnVideoSizeChangedListener(this);
            mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
        }
        catch (Exception e) { e.printStackTrace(); }
    }

    @Override public void onPause() {
        super.onPause();
        releaseMediaPlayer();
    }

    @Override public void onDestroy() {
        super.onDestroy();
        releaseMediaPlayer();
    }

    private void releaseMediaPlayer() {
        if (mMediaPlayer != null) {
            mMediaPlayer.release();
            mMediaPlayer = null;
        }
    }

    private void startVideoPlayback() {
        Log.v(TAG, "startVideoPlayback");
        mMediaPlayer.start();
    }
}
}

R.layout.multi_videos_activity_layout:

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent" android:layout_height="match_parent"
android:orientation="vertical">

<fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
    android:id="@+id/video_1_fragment" android:layout_width="fill_parent"
    android:layout_height="0dp" android:layout_weight="1" />

<fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
    android:id="@+id/video_2_fragment" android:layout_width="fill_parent"
    android:layout_height="0dp" android:layout_weight="1" />

<fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
    android:id="@+id/video_3_fragment" android:layout_width="fill_parent"
    android:layout_height="0dp" android:layout_weight="1" />

</LinearLayout>

R.layout.multi_videos_fragment_layout:

<?xml version="1.0" encoding="utf-8"?>
<SurfaceView xmlns:android="http://schemas.android.com/apk/res/android"
android:id="@+id/video_surfaceview" android:layout_width="fill_parent"
android:layout_height="fill_parent" />

Update: Although it's been around for a while now, I thought it worth pointing out that Google's Grafika project showcases a 'double decode' feature, which "decodes two video streams simultaneously to two TextureViews". I'm not sure how well it scales to more than two video files, but it is nevertheless relevant to the original question.

Paulo Boaventura