
We managed to create a working Miracast sink using UWP, and we wanted to use this functionality inside a .NET Core application, so we followed this guide to use UWP APIs in a .NET Core project:

Using UWP apis with WPF and .NET Core

The project runs and we get a connection from a smartphone to the application, but then we don't receive any video frames from the MediaPlayer object (unlike in the original UWP project, where it works correctly).

We observed that the MediaSource object contains an mcrecv URL (example: mcrecv://192.168.137.247:7236/h-0000000c/192.168.137.1),

but the MediaPlayer consuming it never fires the VideoFrameAvailable event.

How can we solve this? Below is the basic implementation we used:

using System;
using System.Diagnostics;
using System.Windows;
using Windows.Graphics.Imaging;
using Windows.Media.Miracast;
using Windows.Media.Playback;

namespace Miracast_GUI
{
    /// <summary>
    /// Interaction logic for MainWindow.xaml
    /// </summary>
    public partial class MainWindow : Window
    {
        public MiracastReceiver receiver;
        public MiracastReceiverSession session;
        public MediaPlayer mp;
        public SoftwareBitmap frameServerDest;
        public MiracastReceiverConnection connection;

        public MainWindow()
        {
            InitializeComponent();
            // Starts service
            StartMiracastService();
        }

        public void StartMiracastService()
        {
            receiver = new MiracastReceiver();
            receiver.StatusChanged += Receiver_StatusChanged;
            MiracastReceiverSettings settings = receiver.GetDefaultSettings();

            settings.FriendlyName = "Miracast-Service-Test";
            settings.AuthorizationMethod = MiracastReceiverAuthorizationMethod.None;
            settings.ModelName = receiver.GetDefaultSettings().ModelName;
            settings.ModelNumber = receiver.GetDefaultSettings().ModelNumber;
            settings.RequireAuthorizationFromKnownTransmitters = receiver.GetDefaultSettings().RequireAuthorizationFromKnownTransmitters;

            receiver.DisconnectAllAndApplySettings(settings);

            session = receiver.CreateSession(/*CoreApplication.MainView*/ null);
            session.AllowConnectionTakeover = true;

            session.ConnectionCreated += Session_ConnectionCreated;
            session.MediaSourceCreated += Session_MediaSourceCreated;
            session.Disconnected += Session_Disconnected;

            MiracastReceiverSessionStartResult result = session.Start();
            Debug.WriteLine("Status: " + result.Status);
        }

        private void Session_Disconnected(MiracastReceiverSession sender, MiracastReceiverDisconnectedEventArgs args)
        {
            session.Dispose();
        }

        private void Receiver_StatusChanged(MiracastReceiver sender, object args)
        {
            Debug.WriteLine(receiver.GetStatus().ListeningStatus);
        }

        private void Session_ConnectionCreated(MiracastReceiverSession sender, MiracastReceiverConnectionCreatedEventArgs args)
        {
            connection = args.Connection;
            connection.InputDevices.Keyboard.TransmitInput = true;
            connection.InputDevices.GameController.Mode =
                MiracastReceiverGameControllerDeviceUsageMode.AsMouseAndKeyboard;

            Debug.WriteLine("CONNECTION CREATED");
        }

        private void Session_MediaSourceCreated(MiracastReceiverSession sender, MiracastReceiverMediaSourceCreatedEventArgs args)
        {
            mp = new MediaPlayer
            {
                IsVideoFrameServerEnabled = true,
                AutoPlay = true,
                Source = args.MediaSource,
                RealTimePlayback = true
            };

            mp.VideoFrameAvailable += Mp_VideoFrameAvailable;
            Debug.WriteLine(mp.PlaybackSession.PlaybackState);
            mp.Play();

            Debug.WriteLine("MEDIA SOURCE CREATED");
        }

        private void Mp_VideoFrameAvailable(MediaPlayer sender, object args)
        {
            // Debug.WriteLine rather than Console.WriteLine: a WPF app has no console window.
            Debug.WriteLine("Received frame...");
        }
    }
}
  • Hi @HansPassant, thanks for your detailed reply. The MiracastReceiver object is already correctly firing the StatusChanged event; our problem is that the VideoFrameAvailable event attached to the MediaPlayer object is not firing when the connection is created. We apologize if we misunderstood your proposed solution; in that case, could you please explain it to us a bit further? Thank you – Cristian Carli Jan 24 '20 at 09:37

1 Answer


A UWP app's MediaPlayer does not fire the VideoFrameAvailable event if the "Internet (Client & Server)" capability is not enabled. A plain WPF project may not declare any capabilities, so the WPF app cannot use services that require this one. I do not know how to declare UWP capabilities in WPF.
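If the WPF app is packaged with a Windows Application Packaging Project, capabilities can be declared in that project's Package.appxmanifest. A minimal sketch of the relevant section (the privateNetworkClientServer entry is an assumption; it may also be needed since Miracast traffic arrives over the local network):

```xml
<!-- Package.appxmanifest in the Windows Application Packaging Project -->
<Capabilities>
  <!-- "Internet (Client & Server)" in the manifest designer -->
  <Capability Name="internetClientServer" />
  <!-- Assumed: local-network traffic such as the mcrecv:// stream may need this too -->
  <Capability Name="privateNetworkClientServer" />
</Capabilities>
```

The packaged app would then have to be run from its installed package (not the bare .exe) for the capabilities to take effect.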

gbor
  • Welcome to Stack Overflow. Your answer looks more like a question. If this is the case, please delete your answer. – Roar S. Sep 02 '20 at 13:50