
I have been trying for a couple of days now to find a way to play audio that I am receiving from another application (a Unity application) through WebRTC. First I will explain my situation.

I am creating a Windows desktop application using .NET Core 3.1, and this application has a video chat part as well. The desktop application is supposed to receive audio and video, and send only audio data.

I have to communicate with a Unity app. For this I am using MixedReality-WebRTC. Their API documentation is here: Microsoft.MixedReality.WebRTC. They have some examples here: User Manual.

Using these I managed to receive the video data and play it using an image source and a WriteableBitmap.

As for the audio, I am not sure how to handle it. This is the code for receiving audio:

    pc.AudioTrackAdded += (RemoteAudioTrack track) => {
        track.AudioFrameReady += (AudioFrame frame) => {
            //Console.WriteLine($" frames data: {frame.audioData}");
        };
    };

The frame data looks like this:

    // Summary:
    //     Single raw uncompressed audio frame.
    //
    // Remarks:
    //     The use of ref struct is an optimization to avoid heap allocation on each frame
    //     while having a nicer-to-use container to pass a frame across methods.
    public ref struct AudioFrame
    {
        //
        // Summary:
        //     Buffer of audio samples for all channels.
        public IntPtr audioData;
        //
        // Summary:
        //     Number of bits per sample, generally 8 or 16.
        public uint bitsPerSample;
        //
        // Summary:
        //     Sample rate, in Hz. Generally in the range 8-48 kHz.
        public uint sampleRate;
        //
        // Summary:
        //     Number of audio channels.
        public uint channelCount;
        //
        // Summary:
        //     Number of consecutive samples in the audio data buffer. WebRTC generally delivers
        //     frames in 10ms chunks, so for e.g. a 16 kHz sample rate the sample count would
        //     be 160.
        public uint sampleCount;
    }

I am receiving audio frames; I just have no clue how to handle them and play the audio.

I can use Marshal.Copy to get the byte array, but I have to define a size for the output byte array.
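As far as I can tell, the array size can be derived from the frame's own header fields. This is only a sketch, and it assumes that sampleCount is the per-channel sample count (the doc comment does not spell this out), so the total buffer size in bytes would be sampleCount × channelCount × (bitsPerSample / 8):

```csharp
using System;
using System.Runtime.InteropServices;

static class AudioFrameUtils
{
    // Assumption: sampleCount is the per-channel sample count, so the
    // total size in bytes is sampleCount * channelCount * bytes-per-sample.
    public static int GetByteCount(uint bitsPerSample, uint channelCount, uint sampleCount)
        => (int)(sampleCount * channelCount * (bitsPerSample / 8));

    // Copy the unmanaged sample buffer into a managed byte[] for further use.
    public static byte[] CopyFrameData(IntPtr audioData, uint bitsPerSample, uint channelCount, uint sampleCount)
    {
        int byteCount = GetByteCount(bitsPerSample, channelCount, sampleCount);
        var buffer = new byte[byteCount];
        Marshal.Copy(audioData, buffer, 0, byteCount);
        return buffer;
    }
}
```

For example, a 10 ms mono frame at 48 kHz with 16-bit samples would be 480 × 1 × 2 = 960 bytes.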

The problems I am having now are these:

  • How do I know the size of the array?
  • How do I get the byte array, or a stream, and play it like in a real-time voice chat app?
  • Or do I really need to get a stream or a byte array from the IntPtr to play the audio?
  • If so, how do I do it?
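One way I can think of to play the frames in near real time is to push the copied bytes into a buffer and let an audio output device drain it. The sketch below uses NAudio's BufferedWaveProvider with WaveOutEvent; note that NAudio is a third-party NuGet package, this is not part of MixedReality-WebRTC, and the sketch assumes the frame format stays constant after the first frame:

```csharp
using System;
using System.Runtime.InteropServices;
using NAudio.Wave; // NuGet package "NAudio" (assumption: NAudio is acceptable here)

// Sketch: plays remote WebRTC audio by feeding each frame into a
// BufferedWaveProvider that a WaveOutEvent drains in the background.
class RemoteAudioPlayer : IDisposable
{
    BufferedWaveProvider _provider;
    WaveOutEvent _output;

    public void OnFrame(IntPtr audioData, uint bitsPerSample, uint sampleRate,
                        uint channelCount, uint sampleCount)
    {
        if (_provider == null)
        {
            // Lazily initialize the output from the first frame's format.
            var format = new WaveFormat((int)sampleRate, (int)bitsPerSample, (int)channelCount);
            _provider = new BufferedWaveProvider(format)
            {
                // Drop samples instead of throwing if playback falls behind.
                DiscardOnBufferOverflow = true
            };
            _output = new WaveOutEvent();
            _output.Init(_provider);
            _output.Play();
        }

        // Assumption: sampleCount is per channel, so total bytes are
        // sampleCount * channelCount * bytes-per-sample.
        int byteCount = (int)(sampleCount * channelCount * (bitsPerSample / 8));
        var buffer = new byte[byteCount];
        Marshal.Copy(audioData, buffer, 0, byteCount);
        _provider.AddSamples(buffer, 0, byteCount);
    }

    public void Dispose() => _output?.Dispose();
}
```

It would be hooked up from the existing handler, e.g. `track.AudioFrameReady += frame => player.OnFrame(frame.audioData, frame.bitsPerSample, frame.sampleRate, frame.channelCount, frame.sampleCount);`. Because BufferedWaveProvider.AddSamples takes a managed byte[], the copy out of the IntPtr does seem necessary with this approach.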

I am kind of new to desktop application development using WPF. I would be grateful to anyone who can give me sample code or point me in the right direction.
Thank you.
