
I have a .NET Core WebSocket server that receives live streamed audio from client A, and I need to stream this live audio to client B (a browser). I've received the byte array from client A and sent it on to client B (the browser). The byte array itself is correct, as I can convert it into a .wav file and play it without a problem.

In client B (the browser), I try to decode the array buffer into an audio buffer so it can be connected to the output and played.

mediastreamhandler.SendArraySegToAllAsync is where I start sending the byte array out from the server to client B. I'm using the send-to-all method for now; later it will be modified to send data by matching the WebSocket connection ID.

private async Task Echo(HttpContext context, WebSocket webSocket)
{
    Debug.WriteLine("Start Echo between Websocket server & client");
    var buffer = new byte[1024 * 4];

    WebSocketReceiveResult result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);

    while (!result.CloseStatus.HasValue)
    {
        await webSocket.SendAsync(new ArraySegment<byte>(buffer, 0, result.Count), result.MessageType, result.EndOfMessage, CancellationToken.None);

        result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);

        await mediastreamhandler.SendArraySegToAllAsync(new ArraySegment<byte>(buffer, 0, result.Count));
    }

    Debug.WriteLine("Close Echo");
    await webSocket.CloseAsync(result.CloseStatus.Value, result.CloseStatusDescription, CancellationToken.None);
}

I then receive the audio byte array through websocket.onmessage in JavaScript and pass it on to be decoded and played. But here it says "unable to decode data", and Firefox says the content format was unknown (do I need to reformat the byte array that I receive?). The byte array itself is fine, because I've used the same bytes to create a .wav file locally and play it without any problem.

var ctx = new AudioContext();

    function playSound(arrBuff) {

        var myAudioBuffer;

        var src = ctx.createBufferSource();

        ctx.decodeAudioData(arrBuff, function (buffer) {
            myAudioBuffer = buffer;
        });

        src.buffer = myAudioBuffer;
        src.connect(ctx.destination);
        src.start();

    }

I then tried another method to decode and play the audio; this time it played some white noise instead of streaming the audio from client A.

var ctx = new AudioContext();

    function playSound(arrBuff) {

        var myAudioBuffer;

        var src = ctx.createBufferSource();

        myAudioBuffer = ctx.createBuffer(1, arrBuff.byteLength, 8000);
        var nowBuffering = myAudioBuffer.getChannelData(0);
        for (var i = 0; i < arrBuff.byteLength; i++) {
            nowBuffering[i] = arrBuff[i];
        }

        src.buffer = myAudioBuffer;
        src.connect(ctx.destination);
        src.start();

    }

I really need some help here, guys. I've been trying to play out this array buffer for weeks and still haven't had any breakthrough. I'm stuck and not sure what I've done wrong. Could you kindly guide me or suggest any other approach to this? Thanks very much in advance, I really mean it.

    You can't really stream audio this way reliably. You're going to end up with gaps and skips in your audio. Somewhat relevant: https://stackoverflow.com/a/64040169/362536 Skip the web socket route and just stream over HTTP. Or, if latency matters, implement WebRTC. – Brad Nov 01 '20 at 17:13
  • Hi Brad, thanks for the guidance. However, my client is using the Twilio service, and Twilio needs a WebSocket to send over the live audio, so I can't really skip the WebSocket route. – J.J Nov 02 '20 at 13:22
  • I'll need to either decode the chunked data from the WebSocket and play it in the browser, or process the data into a "complete" byte array first before sending it to the browser, so that decodeAudioData() can actually "understand" it. – J.J Nov 02 '20 at 13:47
  • I am facing this issue at the moment working with Twilio Media Streams. How did you get it solved? I want to listen to the audio stream of a call right from the browser. – Tayormi Jan 13 '21 at 14:09
  • Hi Tayormi, I solved it by sending the string payload from the server to the client, then adding a WAV header to the payload before playing the audio. – J.J Jan 18 '21 at 03:13
  • function playSound(payloadBase64) { var Base64wavheader = "UklGRgAAAABXQVZFZm10IBIAAAAHAAEAQB8AAEAfAAABAAgAAABmYWN0BAAAAAAAAABkYXRh"; var audio = new Audio('data:audio/wav;base64,' + Base64wavheader + payloadBase64); audio.play(); }; – J.J Jan 18 '21 at 03:13
  • You can generate your wavheader here, just remember to tune channels, sample rate, and bits accordingly. https://codepen.io/mxfh/pen/mWLMrJ – J.J Jan 18 '21 at 03:14

2 Answers


decodeAudioData() requires complete files, so it can't be used to decode partial chunks of data as they are received from a WebSocket. If you can stream Opus audio files over your WebSocket, you can play them back with an available WebAssembly decoder; see the fetch-stream-audio example linked in the comments below. (A rough sketch of the scheduling side of this kind of streaming playback follows after the comments.)

– anthumchris
  • Thanks for your source code, Anthum, I'll take a look at it! :D Just a quick question: does your example decode the partial chunks of data from the WebSocket into audio, or does it process the byte array into "complete" data before sending it to decodeAudioData() for decoding? – J.J Nov 02 '20 at 13:48
  • The example does not use websockets, but you can derive the websocket implementation from seeing how that code works. Please see the [README](https://github.com/AnthumChris/fetch-stream-audio/blob/master/README.md) regarding `decodeAudioData()` – anthumchris Nov 02 '20 at 15:36
  • Thank you Anthum, your example did guide me in the right way. – J.J Nov 05 '20 at 03:22
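
For illustration only (this is not code from the fetch-stream-audio repository): a rough sketch of the scheduling half of this kind of streaming playback. It assumes the browser is already receiving mono, 16-bit signed little-endian PCM at 8000 Hz as binary WebSocket frames; the endpoint URL and that format are assumptions, not something taken from the question above.

var ctx = new AudioContext();
var playhead = 0; // AudioContext time at which the next chunk should start

// Hypothetical endpoint; binary frames are assumed to carry raw 16-bit PCM.
var ws = new WebSocket('wss://example.com/audio');
ws.binaryType = 'arraybuffer';

ws.onmessage = function (event) {
    var pcm = new Int16Array(event.data);
    if (pcm.length === 0) return;

    // Copy the chunk into an AudioBuffer, converting 16-bit integers to
    // the -1..1 floats the Web Audio API expects.
    var audioBuffer = ctx.createBuffer(1, pcm.length, 8000);
    var channel = audioBuffer.getChannelData(0);
    for (var i = 0; i < pcm.length; i++) {
        channel[i] = pcm[i] / 32768;
    }

    var src = ctx.createBufferSource();
    src.buffer = audioBuffer;
    src.connect(ctx.destination);

    // Start this chunk exactly where the previous one ends, so consecutive
    // chunks neither overlap nor leave gaps.
    playhead = Math.max(playhead, ctx.currentTime);
    src.start(playhead);
    playhead += audioBuffer.duration;
};

Note that most browsers keep the AudioContext suspended until a user gesture, so playback usually has to be started (or the context resumed) from something like a click handler.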

I solved the issue months ago; I'm just here to post my solution.

Steps:

  1. The server receives the payload string from Twilio.
  2. Send the payload string from the server to the client (browser).
 public async Task SendMessageAsync(WebSocket socket, string message)
 {
     if (socket.State != WebSocketState.Open)
         return;

     await socket.SendAsync(buffer: new ArraySegment<byte>(array: Encoding.ASCII.GetBytes(message),
                                                                  offset: 0,
                                                                  count: message.Length),
                                   messageType: WebSocketMessageType.Text,
                                   endOfMessage: true,
                                   cancellationToken: CancellationToken.None);
 }
  3. Add a WAV header to the payload string on the client side before playing the audio.
function playSound(payloadBase64) {

     /* You can generate the wav header here --> https://codepen.io/mxfh/pen/mWLMrJ */
     var Base64wavheader = "UklGRgAAAABXQVZFZm10IBIAAAAHAAEAQB8AAEAfAAABAAgAAABmYWN0BAAAAAAAAABkYXRh";

     var audio = new Audio('data:audio/wav;base64,' + Base64wavheader + payloadBase64);
     audio.play();
};
– J.J
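
For completeness, here is a minimal sketch of how the browser side might feed incoming messages into the playSound() function above. It assumes the server forwards each Twilio media payload as its own base64 text frame; the WebSocket URL is hypothetical.

var ws = new WebSocket('wss://example.com/listen'); // hypothetical endpoint

ws.onmessage = function (event) {
    // event.data is the base64 audio payload string forwarded by the server;
    // playSound() prepends the WAV header and plays it via an Audio element.
    playSound(event.data);
};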