
In a Blazor project I generate sound/noise signals as in @DanW's pink noise example, which produces a double[] with values between -1.0 and 1.0. Is it possible to play this array directly as audio in the browser? Everything I have found about sound/audio in browsers so far covers playing audio from an existing file.

EDIT: I am doing some filtering using native DLLs in C# and am more comfortable in C# than in JavaScript, hence trying to do most of the work in C# rather than JavaScript.

Erik Thysell
  • Yeah, you convert raw data to a Blob in JavaScript, and then use it as the source of an Audio object. – Bennyboy1973 Oct 14 '21 at 08:54
  • It is interesting reading [the example](https://stackoverflow.com/questions/616897/how-can-i-make-a-pink-noise-generator) you linked to in the question. The bar for SO posts, specifically answers, was set pretty low back in the day. – fdcpp Oct 14 '21 at 09:49
  • I will be upfront and say I have no experience in using [Blazor](https://learn.microsoft.com/en-us/aspnet/core/blazor/?view=aspnetcore-5.0). As far as I can tell [**Blazor WebAssembly**](https://devblogs.microsoft.com/aspnet/blazor-webassembly-3-2-0-now-available/#what-is-blazor-webassembly) apps are downloaded by the browser then rendered. There is also the ability to interact with JavaScript via the [JSInterop functionality](https://learn.microsoft.com/en-us/aspnet/core/blazor/javascript-interoperability/call-javascript-from-dotnet?view=aspnetcore-5.0). – fdcpp Oct 14 '21 at 09:52
  • Given that Blazor appears to be primarily UI based, this suggests you'll have to wrap up the [WebAudio API](https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API) (which allows for real-time audio rendering) in a Blazor app. If you were looking for a pathway to get this to work I would first try and get a simple WebAudio page to work that renders and plays white noise. At the same time, I would try and get a simple Blazor app that calls something like `console.log()` via the JSInterop. The last stage should be just stitching these two parts together – fdcpp Oct 14 '21 at 09:55
  • All-in-all what this means is that the question in its current state isn't best suited for SO. I'd have a look at the [`[blazor][javascript]`](https://stackoverflow.com/questions/tagged/blazor%2bjavascript?tab=Votes) tags to get some more insight, write a basic application that attempts to play pink noise (and no more) and should you still be stuck, you'll be in a better position to ask a better question. – fdcpp Oct 14 '21 at 09:57
  • One last piece of advice. I feel the SO tags are under-utilised sometimes (they're really great!). Have a look at combining tags that relate to what it is you're trying to do. For instance [`[blazor] [audio] [javascript]`](https://stackoverflow.com/questions/tagged/blazor+audio+javascript) (Red Flag! no results!). [`[blazor] [audio]`](https://stackoverflow.com/questions/tagged/blazor+audio) (only this question!). [`[web-audio-api] [blazor]`](https://stackoverflow.com/questions/tagged/web-audio-api+blazor) (only post by [BennyBoy1973](https://stackoverflow.com/users/3433178/bennyboy1973)) – fdcpp Oct 14 '21 at 13:14
  • This is highly suggestive that you are trailblazing new ground OR you may be wielding Blazor in the wrong fashion. In either case, you're probably in for a little bit of work. – fdcpp Oct 14 '21 at 13:15
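fdcpp's suggested first step (a page that just renders and plays white noise) could look roughly like this. This is only a sketch; `fillWhiteNoise` and the two-second buffer length are illustrative choices, not from the original thread:

```javascript
// Hypothetical minimal white-noise example. fillWhiteNoise is pure JS;
// the WebAudio wiring below it is browser-only and shown as comments.
function fillWhiteNoise(samples) {
    for (let i = 0; i < samples.length; i++) {
        samples[i] = Math.random() * 2 - 1; // uniform in [-1, 1)
    }
    return samples;
}

// In a browser (after a user gesture):
// const ctx = new AudioContext();
// const buf = ctx.createBuffer(1, ctx.sampleRate * 2, ctx.sampleRate); // 2 s mono
// fillWhiteNoise(buf.getChannelData(0));
// const src = ctx.createBufferSource();
// src.buffer = buf;
// src.connect(ctx.destination);
// src.start();
```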

3 Answers


So I managed to figure out how to do this with the WebAudio API:

JavaScript:

// for cross browser compatibility
const AudioContext = window.AudioContext || window.webkitAudioContext;
audioCtx = {};

function initAudio() {
    // the audio context cannot be created when the file loads; it must be created afterwards (e.g. after a user gesture)
    audioCtx = new AudioContext();
    return audioCtx.sampleRate;
}

// floats is a 1-D array with channel data after each other:
// ch1 = floats.slice(0,nSamples)
// ch2 = floats.slice(nSamples,nSamples*2)
function playNoise(floats, nChannels) {
    const bufferSize = floats.length / nChannels;
    let arrayBuffer = audioCtx.createBuffer(nChannels, bufferSize, audioCtx.sampleRate);

    for (let i = 0; i < nChannels; i++) {
        let f32arr = new Float32Array(floats.slice(i * bufferSize, (i + 1) * bufferSize));
        arrayBuffer.copyToChannel(f32arr, i, 0);
    }

    // Creating AudioBufferSourceNode
    let noise = audioCtx.createBufferSource();
    noise.buffer = arrayBuffer;
    noise.connect(audioCtx.destination);
    noise.start();
    return true;
}
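As a sanity check on the layout described in the comments above `playNoise()` (channel data concatenated one after the other, not interleaved), a small helper sketch; `splitChannels` is a hypothetical name, not part of the answer's code:

```javascript
// Splits the concatenated layout playNoise() expects: all of channel 1's
// samples first, then all of channel 2's, and so on (not interleaved).
function splitChannels(floats, nChannels) {
    const nSamples = floats.length / nChannels; // samples per channel
    const channels = [];
    for (let i = 0; i < nChannels; i++) {
        channels.push(floats.slice(i * nSamples, (i + 1) * nSamples));
    }
    return channels;
}

// splitChannels([1, 2, 3, 4, 5, 6], 2) → [[1, 2, 3], [4, 5, 6]]
```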

Blazor page (I use Blazorise (Button)):

@page "/Tests"
@inject IJSRuntime js
<h3>Test</h3>
<Button Clicked="@(async () => await TestJS())" Color="Color.Primary">Test JS</Button>

@code {
    double fs;
    protected override async Task OnInitializedAsync()
    {
        fs = await js.InvokeAsync<double>("initAudio");
        await base.OnInitializedAsync();
    }

    private async Task TestJS()
    {
        var nChannels = 2;
        var nSecs = 5;
        var nSamples = (int)fs * nSecs * nChannels;
        var floats = new float[nSamples];
        var freq = 440;
        for (int i = 0; i < nSamples / nChannels; i++)
        {
            floats[i] = (float)Math.Sin(i * freq * 2 * Math.PI / fs);
            floats[i + nSamples / 2] = (float)Math.Sin(i * freq * 2 * 2 * Math.PI / fs);
        }
        var ok = await js.InvokeAsync<bool>("playNoise", floats, nChannels);
    }
}

The button plays a 440 Hz tone in the left channel (channel 1) and an 880 Hz tone in the right channel (channel 2).
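The same tone generation can be sketched in plain JavaScript as well; `makeStereoTones` is a hypothetical name, but the channel-after-channel layout matches what the C# loop builds:

```javascript
// Builds a tone at f1 Hz for channel 1 and f2 Hz for channel 2,
// laid out channel after channel as playNoise() expects.
function makeStereoTones(fs, nSecs, f1, f2) {
    const nPerChannel = Math.floor(fs * nSecs);
    const floats = new Float32Array(nPerChannel * 2);
    for (let i = 0; i < nPerChannel; i++) {
        floats[i] = Math.sin(2 * Math.PI * f1 * i / fs);               // left
        floats[i + nPerChannel] = Math.sin(2 * Math.PI * f2 * i / fs); // right
    }
    return floats;
}
```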

EDIT: The sample rate of the buffer does not have to be the same as the AudioContext's. Check here for specs.

Erik Thysell

Blazor doesn't come pre-packaged with sound editing capabilities, but there is now JS isolation, which means that components can load required JS modules.

My advice is to do all your audio stuff directly in JavaScript using the WebAudio API etc. in a collection of custom-made components.

You could use C# to generate waveforms and so on, but I'm not sure there's any advantage in doing so, unless you want to use an existing C# library.

Bennyboy1973
  • I don't know why yours was downvoted @Bennyboy1973, but you are right, it seems to be the way with the WebAudio API. I have some native DLLs that are used in the filtering of the sound AND I just like (and am more comfortable with) the C# tooling in VS :) – Erik Thysell Oct 15 '21 at 06:47
  • Not my downvote, but I would imagine it is because this is a little bit of a non-answer, [especially in light of the work OP has done](https://stackoverflow.com/a/69581603/8876321) – fdcpp Oct 15 '21 at 08:07
  • @fdcpp I could be mistaken, but I believe that work came after this post. I noticed someone had come in and downvoted every answer AND the OP question. Erik did very well, actually, and I wish there was some way to bookmark his answer because I suspect a lot of people will be asking about sound this year. – Bennyboy1973 Oct 15 '21 at 10:03

I've had no problem generating files using Java (Spring Framework), downloading the raw PCM into JavaScript variables in the HTML via Thymeleaf, and playing them via the Web Audio API, for example using an AudioBuffer object. Is there some aspect of Blazor that prevents or inhibits the use of Web Audio objects? IDK how HTML works when there are script tags designating different languages, or whether there is a way to communicate between them.
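For the raw-PCM route described here, the decode step might look like the sketch below. It assumes 16-bit signed little-endian PCM, and `pcm16ToFloat32` is an illustrative name, not an API:

```javascript
// Converts raw 16-bit signed little-endian PCM bytes into the
// Float32Array of [-1.0, 1.0] samples an AudioBuffer channel expects.
function pcm16ToFloat32(bytes) {
    const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);
    const out = new Float32Array(bytes.byteLength / 2);
    for (let i = 0; i < out.length; i++) {
        out[i] = view.getInt16(i * 2, true) / 32768; // true = little-endian
    }
    return out;
}

// In the browser, the result would then be copied into an AudioBuffer:
// buffer.copyToChannel(pcm16ToFloat32(rawBytes), 0);
```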

Phil Freihofner
  • Thanks @PhilFreihofner - I am looking into the web audio API, it looks like the way. I don't think it should be an issue with Blazor but a try will have to prove it. – Erik Thysell Oct 15 '21 at 06:44