
I am trying to record and save sound clips from the user's microphone using the getUserMedia() and AudioContext APIs.

I have been able to do this with the MediaRecorder API, but unfortunately that's not supported by Safari/iOS, so I would like to do it with just the AudioContext API and the buffers it provides.

I got things partially working with this tutorial from Google Web Fundamentals, but I can't figure out how to carry out the steps it suggests.

var handleSuccess = function(stream) {
    var context = new AudioContext();
    var source = context.createMediaStreamSource(stream);
    var processor = context.createScriptProcessor(1024, 1, 1);

    source.connect(processor);
    processor.connect(context.destination);

    processor.onaudioprocess = function(e) {
        // ******
        // TUTORIAL SUGGESTS: Do something with the data, i.e Convert this to WAV 
        // ******
        // I ASK: How can I get this data in a buffer and then convert it to WAV etc.??
        // *****
        console.log(e.inputBuffer);
    };
};

navigator.mediaDevices.getUserMedia({ audio: true, video: false })
    .then(handleSuccess);

As the tutorial says:

The data that is held in the buffers is the raw data from the microphone and you have a number of options with what you can do with the data:

  • Upload it straight to the server
  • Store it locally
  • Convert to a dedicated file format, such as WAV, and then save it to your servers or locally

I could do all this, but I can't figure out how to get the audio buffer once I stop the context.

With MediaRecorder you can do something like this:

mediaRecorder.ondataavailable = function(e) {
    chunks.push(e.data);
}

And then when you're done recording, you have a buffer in chunks. There must be a way to do this here as well, as the tutorial suggests, but I can't find the data to push into a buffer in the first code example.

Once I have the audio buffer, I could convert it to WAV, turn it into a Blob, and so on.
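To make it concrete, this is roughly what I imagine the equivalent would look like inside handleSuccess, though I'm not sure copying the channel data like this is the right approach (untested sketch):

var chunks = [];

processor.onaudioprocess = function(e) {
    // I assume I need to copy the first channel's Float32Array here,
    // since the underlying buffer may be reused between audioprocess events.
    chunks.push(new Float32Array(e.inputBuffer.getChannelData(0)));
};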

Can anyone help me with this? (I don't want to use the MediaRecorder API)

Adam D
  • Hi Adam, I met the same problem too, did you solve the problem above? Would you mind sharing your code for the AudioContext API? Thanks. – New Hand Feb 14 '20 at 08:34
  • @NewHand I'm sorry, I never found a solution and had to use the [audio-recorder-polyfill](https://github.com/ai/audio-recorder-polyfill). – Adam D Feb 16 '20 at 18:37
  • Oh, well noted! Many thanks!! – New Hand Feb 17 '20 at 03:54
  • BaseAudioContext.createScriptProcessor() is deprecated. This feature was replaced by AudioWorklets and the AudioWorkletNode interface. – audiomason Nov 13 '21 at 21:08

1 Answer

e.inputBuffer.getChannelData(0)

Here 0 is the first channel. This returns a Float32Array containing the raw PCM samples; you can get at the underlying bytes as an ArrayBuffer via e.inputBuffer.getChannelData(0).buffer and send them to a worker that converts them to the format you need.
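A rough sketch of what that full pipeline could look like for a mono stream, assuming you collected copies of getChannelData(0) into a chunks array in onaudioprocess as the question describes (flattenChunks and encodeWAV are illustrative helper names, not existing APIs):

function flattenChunks(chunks) {
    // chunks: array of Float32Array copies collected in onaudioprocess,
    // e.g. chunks.push(new Float32Array(e.inputBuffer.getChannelData(0)));
    var length = chunks.reduce(function(sum, c) { return sum + c.length; }, 0);
    var samples = new Float32Array(length);
    var offset = 0;
    chunks.forEach(function(c) {
        samples.set(c, offset);
        offset += c.length;
    });
    return samples;
}

function encodeWAV(samples, sampleRate) {
    // 44-byte RIFF/WAVE header followed by 16-bit mono PCM data.
    var view = new DataView(new ArrayBuffer(44 + samples.length * 2));
    var writeString = function(offset, str) {
        for (var i = 0; i < str.length; i++) {
            view.setUint8(offset + i, str.charCodeAt(i));
        }
    };
    writeString(0, 'RIFF');
    view.setUint32(4, 36 + samples.length * 2, true);
    writeString(8, 'WAVE');
    writeString(12, 'fmt ');
    view.setUint32(16, 16, true);              // fmt chunk size
    view.setUint16(20, 1, true);               // audio format: PCM
    view.setUint16(22, 1, true);               // channels: mono
    view.setUint32(24, sampleRate, true);      // sample rate
    view.setUint32(28, sampleRate * 2, true);  // byte rate
    view.setUint16(32, 2, true);               // block align
    view.setUint16(34, 16, true);              // bits per sample
    writeString(36, 'data');
    view.setUint32(40, samples.length * 2, true);
    // Convert float samples in [-1, 1] to 16-bit signed integers.
    for (var i = 0; i < samples.length; i++) {
        var s = Math.max(-1, Math.min(1, samples[i]));
        view.setInt16(44 + i * 2, s < 0 ? s * 0x8000 : s * 0x7FFF, true);
    }
    return new Blob([view], { type: 'audio/wav' });
}

// When you stop recording:
// var wavBlob = encodeWAV(flattenChunks(chunks), context.sampleRate);

If the encoding gets heavy for long recordings, the encodeWAV step is the part you would move into a worker.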

.getChannelData() Docs: https://developer.mozilla.org/en-US/docs/Web/API/AudioBuffer/getChannelData.

About typed arrays: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Typed_arrays, https://javascript.info/arraybuffer-binary-arrays.

  • While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes. – Vimal Patel May 06 '21 at 05:42