
I'm trying to capture the browser's audio output and save it to a file in JavaScript (without using 3rd-party apps or browser extensions).

After reviewing the examples at WebRTC samples, this task seems relatively straightforward when capturing audio from a user's microphone using the MediaStream returned by getUserMedia().
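For reference, the microphone case looks roughly like this (a minimal sketch based on the WebRTC samples; the recordMicrophone helper, the 5-second duration, and the .webm filename are just placeholders):

```js
// Sketch: getUserMedia() -> MediaRecorder -> downloadable Blob.
async function recordMicrophone(durationMs) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream);
  const chunks = [];

  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.start();

  // Stop after the requested duration and resolve with the recorded Blob.
  return new Promise((resolve) => {
    recorder.onstop = () => resolve(new Blob(chunks, { type: recorder.mimeType }));
    setTimeout(() => recorder.stop(), durationMs);
  });
}

// Usage: save the recording via a temporary download link.
recordMicrophone(5000).then((blob) => {
  const a = document.createElement('a');
  a.href = URL.createObjectURL(blob);
  a.download = 'recording.webm';
  a.click();
});
```

What I can't figure out is how to get an equivalent MediaStream for what the page itself is playing, rather than for a microphone.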

Is there a way to capture a MediaStream that is just the browser's audio output? Or is there some better way to access the browser's audio output in a way that can be recorded to a file?

For context, the audio output in my page may originate from any of several audio libraries (Tone.js, for example), so I'd rather not rely on generating the audio file from whichever JS library happens to be producing the sound. I've looked into writing a file from the AudioContext, but I'm trying to find a solution that is audio-source-agnostic.
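For what it's worth, the AudioContext route I looked at would be something like the sketch below. It records a MediaStreamAudioDestinationNode with MediaRecorder, so it only captures sources that are explicitly connected to that node, which is exactly the source-specific coupling I'd like to avoid. The recordContextOutput helper is made up, and the Tone.js usage in the comment is an assumption about how the library exposes its output:

```js
// Sketch: capture whatever is routed into a given AudioContext by recording
// a MediaStreamAudioDestinationNode. This only covers Web Audio sources that
// are connected to the node, not arbitrary browser audio output.
function recordContextOutput(audioContext, sourceNode, durationMs) {
  const destination = audioContext.createMediaStreamDestination();
  sourceNode.connect(destination); // connect to audioContext.destination as well if you still want to hear it
  const recorder = new MediaRecorder(destination.stream);
  const chunks = [];

  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.start();

  return new Promise((resolve) => {
    recorder.onstop = () => resolve(new Blob(chunks, { type: recorder.mimeType }));
    setTimeout(() => recorder.stop(), durationMs);
  });
}

// Hypothetical Tone.js usage (assumes the library's master output can be
// connected to a native AudioNode; the exact API depends on the Tone.js version):
// recordContextOutput(Tone.context, Tone.Master, 5000).then((blob) => { /* save blob */ });
```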

  • Can you define "browser's audio output"? – zero298 Jun 08 '18 at 16:53
  • Sure, I guess the simplest definition would be "anything that gives Chrome the audio-playing indicator (https://d0od-wpengine.netdna-ssl.com/wp-content/uploads/2013/11/Screen-Shot-2013-11-05-at-13.40.21.png)", but a more specific answer would be: JavaScript libraries that instruct the browser to play sound. This is the closest analog I could find: https://stackoverflow.com/questions/46468805/how-to-save-sound-with-web-audio-api-and-tone-js-in-a-browser – vaultdweller Jun 08 '18 at 17:06

0 Answers