
I have an AudioBuffer stored as a variable, and I would like to have it be played by an Audio element. Here is my current non-functioning code:

const blob = new Blob(audioBuffer.getChannelData(1), { type: "audio/wav" });
const url = window.URL.createObjectURL(blob);
audioElement.src = url;

When I try to play audioElement, I get the following error:

Uncaught (in promise) DOMException: The element has no supported sources.

Does anyone have any ideas on how to solve this? Thanks in advance!

matthias_h

1 Answer


An AudioBuffer holds raw PCM data; it is not encoded as WAV yet. If you need WAV, use a library to do the encoding for you, such as https://www.npmjs.com/package/audiobuffer-to-wav

After including the library above (you can just copy the audioBufferToWav function and the helper functions it calls out of its index.js), the following should work:

const blob = new Blob([audioBufferToWav(audioBuffer)], { type: "audio/wav" });
const url = window.URL.createObjectURL(blob);
audioElement.src = url;
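
Once the src is set, the element plays like any other audio source. A minimal usage sketch (the ended handler and the URL.revokeObjectURL cleanup are my additions, not part of the original answer):

audioElement.play().catch(err => console.error("Playback failed:", err));
// Release the object URL once playback has finished
audioElement.addEventListener("ended", () => URL.revokeObjectURL(url), { once: true });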

Alternatively, you can use the Web Audio API to play back the PCM AudioBuffer directly:

// Play the AudioBuffer through a buffer source node
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
var source = audioCtx.createBufferSource();
source.buffer = audioBuffer; // the decoded PCM AudioBuffer
source.connect(audioCtx.destination);
source.start();
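
If this plays nothing, one common cause is the browser's autoplay policy leaving the AudioContext suspended. A rough sketch, moving source.start() into a user-gesture handler (the #playButton selector is hypothetical):

// Resume a suspended context from a user gesture before starting playback
document.querySelector("#playButton").addEventListener("click", () => {
  audioCtx.resume().then(() => source.start());
}, { once: true });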
user120242
  • I'm not really attached to WAV at all, would encoding to WAV be the simplest way to get this audiobuffer into the src of an audio element? – xphateslater Apr 16 '20 at 16:09
  • Yes. There may be a better way depending on what it is you are doing, such as using createMediaElementSource. But given an AudioBuffer that needs to be attached to an audio element, yes, encoding to a WAV is the simplest way to go about it. – user120242 Apr 16 '20 at 20:13
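
For reference, the createMediaElementSource route mentioned in the comment above feeds an existing audio element into the Web Audio graph instead of encoding a WAV. A minimal sketch, assuming audioElement already has a playable src:

// Route the <audio> element's output through the Web Audio graph
const ctx = new (window.AudioContext || window.webkitAudioContext)();
const elementSource = ctx.createMediaElementSource(audioElement);
elementSource.connect(ctx.destination);
audioElement.play();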