
I am using the getUserMedia and MediaRecorder APIs to record a video from the webcam.

I am using Chrome version 80.

How can I use getUserMedia and record the video while mixing in an MP3 with JavaScript? The MP3 needs to be able to play, pause, and stop.

I don't know how to mix the MP3 into the video stream live.

When I call removeTrack and addTrack, MediaRecorder fails.

It shows this error: Failed to execute 'stop' on 'MediaRecorder': The MediaRecorder's state is 'inactive'.

My code is on CodePen: https://codepen.io/zhishaofei3/pen/eYNrYGj

The main code:

// `context` (an AudioContext), `stream` (the getUserMedia stream) and `dest`
// are defined elsewhere in the pen.
function getFileBuffer(filepath) {
  return fetch(filepath, {method: 'GET'}).then(response => response.arrayBuffer())
}

function mp3play() {
  getFileBuffer('song.mp3')
  .then(buffer => context.decodeAudioData(buffer))
  .then(buffer => {
    console.log(buffer)
    const source = context.createBufferSource()
    source.buffer = buffer
    let volume = context.createGain()
    volume.gain.value = 1
    source.connect(volume)
    dest = context.createMediaStreamDestination()
    volume.connect(dest)
    // volume.connect(context.destination)
    source.start(0)

    const _audioTrack = stream.getAudioTracks();
    if (_audioTrack.length > 0) {
      _audioTrack[0].stop();
      stream.removeTrack(_audioTrack[0]);
    }

    console.log(dest.stream)
    console.log(dest.stream.getAudioTracks()[0])
    stream.addTrack(dest.stream.getAudioTracks()[0])
  })
}

Thank you!

– zhishaofei3
1 Answer


Many containers don't support adding/removing tracks like that, and it's doubtful the Media Recorder API does at all. It's an unusual thing to do.

You need to create the stream you're going to record before instantiating Media Recorder, with all of the tracks you want. Therefore, you need to do things in this order:

  1. Set up your AudioContext.
  2. Call getUserMedia(). (And while you're at it, set audio: false in your constraints. No need to open a microphone if you're not using one.)
  3. Use videoStream.getVideoTracks() and dest.stream.getAudioTracks() to get all of the tracks.
  4. Create a new MediaStream with those tracks: new MediaStream([audioTrack, videoTrack]).

Now, run your MediaRecorder on this new MediaStream and you'll have what you want.
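A minimal sketch of those steps might look like the following; the 'song.mp3' path is carried over from the question, and the lack of play/pause/stop controls is just to keep the example short:

// 1. Set up the AudioContext and a destination node that exposes the mixed
//    audio as a MediaStream (run this from a user gesture so the context can start).
const context = new AudioContext();
const dest = context.createMediaStreamDestination();

// Decode the MP3 and play it into the destination node.
fetch('song.mp3')
  .then(response => response.arrayBuffer())
  .then(data => context.decodeAudioData(data))
  .then(audioBuffer => {
    const source = context.createBufferSource();
    source.buffer = audioBuffer;
    source.connect(dest);
    source.start(0);
  });

// 2. Open the camera only; no microphone is needed in this case.
navigator.mediaDevices.getUserMedia({ video: true, audio: false })
  .then(videoStream => {
    // 3. Pull the tracks you want from each stream.
    const videoTrack = videoStream.getVideoTracks()[0];
    const audioTrack = dest.stream.getAudioTracks()[0];

    // 4. Combine them into one new stream, then record that.
    const mixedStream = new MediaStream([audioTrack, videoTrack]);
    const recorder = new MediaRecorder(mixedStream);
    recorder.start();
  });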

– Brad
  • My scenario is an online classroom. The teacher talks throughout the lesson, but can also play MP3 audio during it, and that MP3 audio needs to be mixed with the teacher's audio and video. What do you think is feasible? Since the teacher speaks throughout the lesson, I need to turn on the microphone from the beginning. – zhishaofei3 Mar 16 '20 at 02:28
  • @zhishaofei3 Oh, in that case you simply need to actually mix that microphone in your Web Audio API graph. Use `context.createMediaStreamSource()` *or* a `srcObject` on a media element set to the media stream, `createMediaElementSource()`. I use the latter when I want to have some sort of webcam preview for the user. – Brad Mar 16 '20 at 04:20
  • But my requirement is to generate a video file that the user can download and play back after the class. – zhishaofei3 Mar 16 '20 at 05:39
  • I understand your requirement. If you want to mix audio, you need to do it through the Web Audio API. You can take your video track, add it to the output Media Stream, take the microphone audio track, create a new Media Stream for it, add it to the Web Audio graph, then take the output of the Web Audio graph as a Media Stream, then extract the audio track from the Web Audio graph and the video track from the camera Media Stream, and combine them together as a new Media Stream to send to the Media Recorder. – Brad Mar 16 '20 at 05:54
  • I'm pretty much confused about somewhat the same issue. I need to mix two streams: one from `canvas.captureStream()`, the other audio from an MP3. I was trying to create a buffer, but there are so many classes that I'm a bit confused! I need to use the Web Audio API, that's for sure, to get the audio stream. – Ashitaka Apr 28 '20 at 06:20
  • @NilanjanRoy You don't need to mix two streams in your case... there's only one audio stream. You need to mux two tracks into one stream. So, use the Web Audio API and a MediaStreamAudioDestinationNode, and then create a new MediaStream using steps #3 and #4 from my answer. – Brad Apr 28 '20 at 14:32
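For the classroom scenario discussed in the comments (microphone plus MP3 mixed into one recording), a rough sketch of the pipeline described above might look like this; the 'song.mp3' path and the single getUserMedia call for both camera and microphone are illustrative assumptions, not the only way to wire it up:

const context = new AudioContext();
const dest = context.createMediaStreamDestination();

navigator.mediaDevices.getUserMedia({ video: true, audio: true })
  .then(camStream => {
    // Route the microphone track through the Web Audio graph.
    const micSource = context.createMediaStreamSource(camStream);
    micSource.connect(dest);

    // Route the MP3 through the same graph so both sources end up in one audio track.
    return fetch('song.mp3')
      .then(response => response.arrayBuffer())
      .then(data => context.decodeAudioData(data))
      .then(audioBuffer => {
        const mp3Source = context.createBufferSource();
        mp3Source.buffer = audioBuffer;
        mp3Source.connect(dest);
        mp3Source.start(0);

        // Mux the mixed audio track with the camera's video track and record that.
        const mixed = new MediaStream([
          dest.stream.getAudioTracks()[0],
          camStream.getVideoTracks()[0],
        ]);
        const recorder = new MediaRecorder(mixed);
        recorder.start();
      });
  });

The same pattern should cover the canvas case mentioned above: replace the camera's video track with one from canvas.captureStream() and keep the audio side unchanged.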