
I have used the Web Audio API to connect a microphone to a convolver, then to an analyser, and finally to a Flot plot of the spectrum. For testing I set the convolver's buffer to a unit impulse, but I don't get any output. If I bypass the convolver and connect the microphone directly to the analyser, it works. Can you please help?

In the code below, use_convolver determines whether to bypass the convolver.

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">

    <head>
        <meta charset="utf-8">

        <script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.4/jquery.min.js"></script>
        <script src="http://www.flotcharts.org/flot/jquery.flot.js" type="text/javascript"></script>
    </head>

    <body>
        <h1>Audio Spectrum</h1>
        <div id="placeholder" style="width:400px; height:200px; display: inline-block;">

        </div>
        <script>
            var microphone;
            var analyser;
            var convolver;
            //user media
            navigator.getUserMedia = (navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia || navigator.msGetUserMedia);
            if (navigator.getUserMedia) {
                console.log('getUserMedia supported.');
                navigator.getUserMedia(
                // constraints - only audio needed for this app
                {
                    audio : true,
                    echoCancellation : true
                },
                // Success callback
                user_media_setup,
                // Error callback
                function(err) {
                    console.log('The following gUM error occurred: ' + err);
                });
            } else {
                console.log('getUserMedia not supported on your browser!');
            };

            function user_media_setup(stream) {
                console.log('user media setup');
                // set up forked web audio context, for multiple browsers
                // window. is needed otherwise Safari explodes
                audioCtx = new (window.AudioContext || window.webkitAudioContext)();
                //microphone
                microphone = audioCtx.createMediaStreamSource(stream);

                //analyser
                analyser = audioCtx.createAnalyser();
                analyser.fftSize = 1024;
                analyser.smoothingTimeConstant = 0.85;

                //convolver
                convolver = audioCtx.createConvolver();
                convolver.normalize = true;
                convolverBuffer = audioCtx.createBuffer(1, 1, audioCtx.sampleRate);

                //    convolverBuffer[0] = 1; //wrong

                convolverChannel = convolverBuffer.getChannelData(0);
                convolverChannel[0] = 1;
                convolver.buffer = convolverBuffer;

                //connectivity
                var use_convolver = false;
                if (use_convolver) {
                    //through convolver:
                    microphone.connect(convolver);
                    convolver.connect(analyser);
                } else {
                    //direct:
                    microphone.connect(analyser);

                }
                visualize();
            }

            function visualize() {
                console.log('visualize');
                dataArray = new Float32Array(analyser.frequencyBinCount);
                draw = function() {
                    analyser.getFloatFrequencyData(dataArray);
                    var data = [];
                    for (var i = 0; i < dataArray.length; i++) {
                        freq = audioCtx.sampleRate * i / dataArray.length / 2;
                        data.push([freq, dataArray[i]]);
                    }
                    var options = {
                        yaxis : {
                            min : -200,
                            max : 0
                        }
                    };
                    $.plot("#placeholder", [data], options);
                    window.requestAnimationFrame(draw);
                };
                window.requestAnimationFrame(draw);
            }

        </script>
    </body>
</html>
Hanan Shteingart

2 Answers


convolverBuffer[0] is the wrong way to get at the sample data in the buffer. You need to call convolverBuffer.getChannelData(0) to get the sample array to modify.
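For example, the one-sample unit impulse can be written through the Float32Array returned by getChannelData. This is a minimal sketch; makeUnityBuffer is a hypothetical helper name, and ctx stands for your AudioContext:

```javascript
// Sketch: build a unit-impulse buffer the correct way, by writing into
// the Float32Array returned by getChannelData() rather than indexing
// the AudioBuffer object itself.
function makeUnityBuffer(ctx) {
    var buffer = ctx.createBuffer(1, 1, ctx.sampleRate);
    var samples = buffer.getChannelData(0); // Float32Array of length 1
    samples[0] = 1; // unit impulse: convolution passes the input through
    return buffer;
}

// Usage (in your success callback):
// convolver.buffer = makeUnityBuffer(audioCtx);
```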

aldel
  • Thanks @aldel. I've corrected this, but it still doesn't work in Chrome. In Firefox it does work. My problem with Firefox is that it seems as if there is a built-in filter with a cutoff of about 8 kHz (before I apply any filtering or convolution) – Hanan Shteingart Jul 24 '15 at 12:23
  • @HananShteingart, hmm, it seems like Chrome doesn't work when you use a 1-channel convolver buffer. If you use a 2-channel buffer, it works. This seems to me like a bug in Chrome; you should probably report it. – aldel Jul 24 '15 at 19:49
  • You might also be better off with normalize = false. Normalization is weirdly defined, and you will get a much lower level on the output than the input if you normalize. – aldel Jul 24 '15 at 19:51
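Putting the comments above together, here is a sketch that sidesteps the reported one-channel Chrome issue by using a two-channel impulse and leaving normalization off. stereoImpulse is an illustrative name, not an API call:

```javascript
// Sketch: two-channel unit impulse for the convolver, per the comments
// above (a 1-channel buffer reportedly produces no output in Chrome).
function stereoImpulse(ctx) {
    var buffer = ctx.createBuffer(2, 1, ctx.sampleRate);
    for (var ch = 0; ch < 2; ch++) {
        buffer.getChannelData(ch)[0] = 1; // impulse in both channels
    }
    return buffer;
}

// Usage:
// convolver.normalize = false; // avoid the level drop from normalization
// convolver.buffer = stereoImpulse(audioCtx);
```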

@aldel This problem was bugging me for a few frustrating days, so many thanks for this tip. I can confirm that this is an issue in Firefox as well. It seems that if you use a mono WAV file as the buffer for the convolver, you will not get any output from the convolver.

After I switched to a stereo WAV impulse response as the buffer, the convolver worked.
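If you only have a mono impulse response, one workaround is to copy its samples into both channels of a new two-channel buffer. This is a sketch under the assumption that monoBuffer is an AudioBuffer you already decoded; monoToStereoIR is a made-up helper name:

```javascript
// Sketch: duplicate a mono impulse response into a two-channel buffer,
// so the convolver gets the stereo buffer that works across browsers.
function monoToStereoIR(ctx, monoBuffer) {
    var stereo = ctx.createBuffer(2, monoBuffer.length, monoBuffer.sampleRate);
    var mono = monoBuffer.getChannelData(0);
    stereo.getChannelData(0).set(mono); // left  = mono samples
    stereo.getChannelData(1).set(mono); // right = mono samples
    return stereo;
}

// Usage:
// convolver.buffer = monoToStereoIR(audioCtx, monoBuffer);
```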

Also, a tip I learned today: Firefox's web audio tools (enabled by clicking the gear in the top-right section of the Firefox dev tools and checking 'Web Audio' on the left) are really useful for visualizing the order of your nodes. You can also easily switch a node on/off (bypass it) to see whether it's causing problems in your audio context.

tyler-g
  • It also appears that if you have a null buffer on a ConvolverNode, it will not output any audio. That is not exactly clear in the Web Audio docs, and it behaves a little differently from other nodes (which can be created and will pass audio through by default). – tyler-g Apr 04 '16 at 20:39
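Given that a ConvolverNode with a null buffer outputs silence, one defensive pattern is to bypass the convolver until its buffer has actually been assigned. A minimal sketch; connectThroughConvolver is a hypothetical helper, not part of the Web Audio API:

```javascript
// Sketch: route through the convolver only once it has a buffer,
// since a null-buffer ConvolverNode produces no output at all.
function connectThroughConvolver(source, convolver, destination) {
    if (convolver.buffer) {
        source.connect(convolver);
        convolver.connect(destination);
    } else {
        source.connect(destination); // bypass: no impulse response loaded yet
    }
}
```

In the question's code this would replace the use_convolver branch, with the bypass chosen automatically instead of by a flag.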