
I just started experimenting with MediaStreamSource in UWP. I took the MediaStreamSource streaming example from Microsoft and tried to rewrite it to support mp4 instead of mp3. I changed nothing but the InitializeMediaStreamSource part, which now looks like this:

{
    var clip = await MediaClip.CreateFromFileAsync(inputMP4File);
    var audioTrack = clip.EmbeddedAudioTracks.First();
    var property = clip.GetVideoEncodingProperties();

    // initialize parsing variables
    byteOffset = 0;
    timeOffset = new TimeSpan(0);

    var videoDescriptor = new VideoStreamDescriptor(property);
    var audioDescriptor = new AudioStreamDescriptor(audioTrack.GetAudioEncodingProperties());

    // only the video descriptor is passed in; see my question about this below
    MSS = new MediaStreamSource(videoDescriptor)
    {
        Duration = clip.OriginalDuration
    };

    // hooking up the MediaStreamSource event handlers
    MSS.Starting += MSS_Starting;
    MSS.SampleRequested += MSS_SampleRequested;
    MSS.Closed += MSS_Closed;

    media.SetMediaStreamSource(MSS);
}

My problem is that I cannot find a single example where video streams are used instead of audio, so I can't figure out what's wrong with my code. If I set the MediaElement's Source property to the given mp4 file, it works like a charm. If I pick an mp3 and leave the videoDescriptor out, it works as well. But if I try to do the same with a video, nothing happens (I'm still not sure whether I should add the audioDescriptor as a second argument to the MediaStreamSource or not, but because I've got one mixed stream, I guess it's not needed). The SampleRequested event is triggered. No error is thrown. It's really hard to debug, a real pain in the ass. :S

Zsolt Bangha
  • This feels like a really obvious answer and I think I'm missing something here. The MediaStreamSource needs a descriptor for each stream. The overloaded constructor takes two separate descriptors for exactly this purpose. In other words, you should pass both the audio and video descriptors into the constructor if your intent is to stream both audio and video to the downstream decoders. – James Dailey - MSFT Jan 12 '18 at 23:53
  • The problem is that I only have ONE data stream, but if I define two separate descriptors then the SampleRequested event is called twice, once for the video and once for the audio stream. Btw, I have now solved the issue by simply not using a MediaStreamSource: I use the SetSource method of the MediaElement instance, which accepts streams as well, not just URIs (see the sketch below these comments). – Zsolt Bangha Jan 16 '18 at 08:50
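
For reference, a minimal sketch of the two options discussed in the comments above (the names inputMP4File, media, clip and the descriptors are taken from the question's code; the exact fix was never confirmed):

// Option 1 (per the first comment): the MediaStreamSource overload that takes
// one descriptor per stream. SampleRequested then fires once per stream, and
// args.Request.StreamDescriptor identifies which stream is being asked for.
MSS = new MediaStreamSource(videoDescriptor, audioDescriptor)
{
    Duration = clip.OriginalDuration
};

// Option 2 (per the second comment): skip MediaStreamSource entirely and hand
// the raw stream to the MediaElement together with its MIME type.
var stream = await inputMP4File.OpenAsync(Windows.Storage.FileAccessMode.Read);
media.SetSource(stream, inputMP4File.ContentType);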

1 Answer


I have a solution for building a working video MediaStreamSource from bitmap files, but unfortunately I haven't found one for an RGBA buffer yet. First of all, read the MediaStreamSource class documentation: https://learn.microsoft.com/en-us/uwp/api/windows.media.core.mediastreamsource. I'm creating an MJPEG MediaStreamSource:

var mediaStreamSource = new MediaStreamSource(
    new VideoStreamDescriptor(
        VideoEncodingProperties.CreateUncompressed(
            CodecSubtypes.VideoFormatMjpg, size.Width, size.Height
        )
    )
);

Then initialize some buffer time:

mediaStreamSource.BufferTime = TimeSpan.FromSeconds(1);

Then subscribe to the SampleRequested event to supply each requested frame:

mediaStreamSource.SampleRequested += async (MediaStreamSource sender, MediaStreamSourceSampleRequestedEventArgs args) =>
{
    // take a deferral because the sample is produced asynchronously
    var deferral = args.Request.GetDeferral();
    try
    {
        // startedAt is a DateTime captured when streaming began (defined elsewhere)
        var timestamp = DateTime.Now - startedAt;

        var file = await Windows.ApplicationModel.Package.Current.InstalledLocation.GetFileAsync(@"Assets\grpPC1.jpg");
        using (var stream = await file.OpenReadAsync())
        {
            args.Request.Sample = await MediaStreamSample.CreateFromStreamAsync(
                stream.GetInputStreamAt(0), (uint)stream.Size, timestamp);
        }
        args.Request.Sample.Duration = TimeSpan.FromSeconds(5);
    }
    finally
    {
        deferral.Complete();
    }
};
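
For completeness, a minimal sketch of how this source gets wired up to a player, assuming a MediaElement named media in the XAML and that mediaStreamSource is stored somewhere the method can reach it (startedAt is the field the handler above relies on):

// assumed field; the handler above uses it to compute frame timestamps
private DateTime startedAt;

private void StartStreaming()
{
    startedAt = DateTime.Now;
    media.SetMediaStreamSource(mediaStreamSource);
    media.Play();
}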

As you can see, in my sample I use CodecSubtypes.VideoFormatMjpg and a hardcoded path to a jpeg file that I reuse for every MediaStreamSample. What still needs research is which CodecSubtypes value to set in order to use an RGBA (4 bytes per pixel) bitmap, like this:

var buffer = new Windows.Storage.Streams.Buffer(size.Width * size.Height * 4);
// latestBitmap is a SoftwareBitmap holding the current frame
latestBitmap.CopyToBuffer(buffer);
args.Request.Sample = MediaStreamSample.CreateFromBuffer(buffer, timestamp);
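
One possible direction (untested, so treat it as an assumption rather than a verified answer): VideoEncodingProperties.CreateUncompressed also accepts the uncompressed subtype strings from MediaEncodingSubtypes, and Bgra8 matches the 4-bytes-per-pixel format that SoftwareBitmap uses by default:

// Assumption: Bgra8 describes the uncompressed 4-byte-per-pixel BGRA layout
// that CopyToBuffer produces for a Bgra8 SoftwareBitmap. Use
// SoftwareBitmap.Convert first if latestBitmap is in another pixel format.
var rgbSource = new MediaStreamSource(
    new VideoStreamDescriptor(
        VideoEncodingProperties.CreateUncompressed(
            Windows.Media.MediaProperties.MediaEncodingSubtypes.Bgra8,
            size.Width, size.Height
        )
    )
);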