
EDIT 3:

It was only a Firefox issue; it works in Chrome, so problem solved. See the answer below. Thank you, Chris, for your help!

EDIT 2:

Following Chris's advice I changed one line in the getUserMedia call, but it doesn't work for now. Maybe I'm using the wrong syntax, but this feature is undocumented:

if(navigator.getUserMedia){
  navigator.getUserMedia(
    // Chrome's non-standard 'optional' constraint syntax, as suggested
    // by Chris, to disable the browser's built-in echo cancellation:
    { audio: { optional: [{ echoCancellation: false }] } }
    ,function(stream){ init_stream(stream); }
    ,function(err){ console.log('The following gUM error occurred: ' + err); }
  );
}
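
For reference, in browsers that implement the standard promise-based navigator.mediaDevices.getUserMedia, the equivalent call would presumably drop the 'optional' wrapper entirely. A minimal sketch:

// Minimal sketch, assuming navigator.mediaDevices is available:
// pass echoCancellation as a plain (standard) constraint.
navigator.mediaDevices.getUserMedia({ audio: { echoCancellation: false } })
  .then(function(stream){ init_stream(stream); })
  .catch(function(err){ console.log('gUM error: ' + err); });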

Also, you can follow progress here:

http://jsfiddle.net/stratboy/aapafrbu/1/

EDIT 1:

I'm currently playing through this chain: keyboard > mixer > Behringer UCA222 > Mac (USB). My current code for inspecting the data is below. I see data changing for the left channel but not for the right, regardless of what I do on the mixer. What could be the reason?

window.AudioContext = window.AudioContext || window.webkitAudioContext;

// Normalize the prefixed getUserMedia variants across browsers.
navigator.getUserMedia = (navigator.getUserMedia ||
                          navigator.webkitGetUserMedia ||
                          navigator.mozGetUserMedia ||
                          navigator.msGetUserMedia);

var audiocontext = new window.AudioContext();

// One analyser per channel; the splitter separates the stereo input.
var analyser_left = audiocontext.createAnalyser();
var analyser_right = audiocontext.createAnalyser();
var splitter = audiocontext.createChannelSplitter(2);

function init_stream(stream){
  window.audiosource = audiocontext.createMediaStreamSource(stream);

  // Splitter output 0 = left channel, output 1 = right channel.
  audiosource.connect(splitter);
  splitter.connect(analyser_left,0);
  splitter.connect(analyser_right,1);

  listen();
}

function listen(){
  requestAnimationFrame(listen);

  // Dump the raw time-domain bytes of each channel into the page.
  // 128 is the zero line, so a silent channel shows all 128s.
  analyser_left.fftSize = 256;
  var leftBufferLength = analyser_left.frequencyBinCount;
  var leftDataArray = new Uint8Array(leftBufferLength);

  analyser_left.getByteTimeDomainData(leftDataArray);
  $('.monitor_left').html(JSON.stringify(leftDataArray));

  analyser_right.fftSize = 256;
  var rightBufferLength = analyser_right.frequencyBinCount;
  var rightDataArray = new Uint8Array(rightBufferLength);

  analyser_right.getByteTimeDomainData(rightDataArray);
  $('.monitor_right').html(JSON.stringify(rightDataArray));
}

if(navigator.getUserMedia){
  navigator.getUserMedia(
    { audio: true }
    ,function(stream){ init_stream(stream); }
    ,function(err){ console.log('The following gUM error occurred: ' + err); }
  );
}
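
By the way, a quick way to check whether the captured stream is actually opened as stereo, at least in browsers that implement MediaStreamTrack.getSettings() (channelCount is optional in the spec, so not every browser reports it), would be a sketch like this:

// Debug sketch: inspect the first audio track of the captured stream.
function log_track_info(stream){
  var track = stream.getAudioTracks()[0];
  console.log('input label: ' + track.label);
  if(track.getSettings){
    // channelCount should be 2 for a true stereo capture.
    console.log('channelCount: ' + track.getSettings().channelCount);
  }
}

Calling log_track_info(stream) from init_stream() would show whether the device was ever opened with 2 channels.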

I'd like to play my guitar into the computer and analyze the sounds via the Web Audio API. I know it's possible to use the microphone, but what about a real instrument plugged in?

Luca Reghellin
  • You'd have to get a preamp, plug your guitar into the preamp and then take the output from the preamp and plug it into your mic input. – idbehold Sep 13 '15 at 15:50

2 Answers


Yes. I do this all the time (use non-mic sources). Just get a USB audio interface that supports guitar/instrument input.

cwilso
  • Wow! I was almost sure I couldn't. And 2 more questions: which of the JS classes/interfaces do you use to get the sound from USB? From MDN I understood how to get the microphone, but it's not yet clear to me what the correct code is to get USB sources. And q#2: is it possible to separately analyze 2 simultaneous sources (e.g. USB + mic)? I'd like to do a live show with different visuals for the [keyboards: source 1] and the other instruments [guitar+bass+sax: source 2]. – Luca Reghellin Sep 14 '15 at 08:20
  • You can select the input device through getMediaDevices - I use it in the input effects demo (https://webaudiodemos.appspot.com/input/index.html) - relevant code is kicked off from https://github.com/cwilso/Audio-Input-Effects/blob/master/js/effects.js#L216-L220. (Search for gotSources to see what it does; a modern device-selection sketch follows this thread.) – cwilso Sep 15 '15 at 16:03
  • On analyzing separate sources - Chrome, at least, currently only supports mono and stereo sources; you could, of course, feed your keyboard and other instruments through separate channels of a stereo input and do it that way. (You'll need to use a ChannelSplitterNode to split into two outputs and run each through a separate AnalyserNode.) Multichannel (more than stereo) inputs are a bug we're working on. Multi _device_, on the other hand - using your headphone mic AND a USB audio input device at the same time - is a lot more complex, and I don't expect to work in the near future. – cwilso Sep 15 '15 at 16:06
  • Thank you very much, I'll go with the stereo approach. And your demo is pretty impressive!! :) – Luca Reghellin Sep 16 '15 at 08:53
  • cwilso: to your attention, if you can/want/have time/etc.. ;) http://stackoverflow.com/questions/32915635/web-audio-api-stream-why-isnt-dataarray-changing – Luca Reghellin Oct 03 '15 at 13:47
  • Hi! I added some code above. I still can't get the 2 channels, and I can't figure out why. Could you please see if I'm doing something wrong? :) – Luca Reghellin Oct 04 '15 at 16:12
  • You mean you're sending a stereo signal, and you're getting zeros in the right analyser? – cwilso Oct 05 '15 at 16:18
  • Hi Chris, thank you. Not exactly: I'm sending a stereo signal (I think, since I go out of the mixer with an RCA cable into the UCA222, then out to the Mac via USB), but I don't get zeros on the right channel. I get all 128s, and they stand still when I play, while on the left channel I see changes. In fact, I get the very same with the computer's microphone (which I guess is mono), so it seems it doesn't split the signal. – Luca Reghellin Oct 06 '15 at 07:12
  • One thing, to clarify: if I play and record, for example with Audacity, I can clearly see the 2 channels, and differences in, say, gain or EQ. So I'm pretty sure I've got a stereo input signal. So it can only be an error in my code (easily) or maybe some bug in the API (I doubt it). So how about the fiddle above? :) – Luca Reghellin Oct 07 '15 at 18:56
  • Hmm, I think this may be the built-in audio processing (http://crbug.com/503493). Try disabling echoCancellation in your getUserMedia call: pass "{ audio: { optional: [{ echoCancellation: false }] } };" as the first parameter to gUM. – cwilso Oct 08 '15 at 19:30
  • Mmm, please see the code above or this [fiddle](http://jsfiddle.net/stratboy/aapafrbu/1/): I don't know if the syntax is wrong, but it seems it still doesn't work. And by the way, that feature seems undocumented: where can I find more (good/updated) docs other than MDN? – Luca Reghellin Oct 09 '15 at 07:13
  • Ok, in fact, it works even with the original script (http://jsfiddle.net/stratboy/aapafrbu/)! It only works in Chrome, so it's only a Firefox issue. Not a problem for me since it's not a public project. So, problem solved! Chris, thank you very much for your help! – Luca Reghellin Oct 10 '15 at 11:08
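
For completeness, here is a minimal device-selection sketch in today's terms (the demo linked in the thread used the older MediaStreamTrack.getSources / gotSources approach; enumerateDevices is its modern replacement):

navigator.mediaDevices.enumerateDevices().then(function(devices){
  // Keep only audio inputs (built-in mic, USB interfaces, ...).
  var inputs = devices.filter(function(d){ return d.kind === 'audioinput'; });
  // Assumes at least one audio input exists; picking inputs[0] is
  // illustrative - in practice you'd let the user choose.
  return navigator.mediaDevices.getUserMedia({
    audio: { deviceId: { exact: inputs[0].deviceId } }
  });
}).then(function(stream){
  init_stream(stream);
});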

So, the original question was about connecting multiple instruments/real-time sources (for example an instrument and a microphone) to the Web Audio API and analyzing the stream.

The answer is almost yes :P As Chris wrote, it's not currently possible to capture several input devices at once, but it is possible to split a stereo signal! So what I did was go through a mixer with 2 sources (say, a keyboard and a microphone), panning one to the left channel and one to the right, then connect to a USB audio card of some kind (I'm currently using a cheap Behringer UCA222).

It turns out that Firefox still can't split the signal, but Chrome can, and that's enough for me. Some working code is below, and it's quite self-explanatory:

window.AudioContext = window.AudioContext || window.webkitAudioContext;

// Normalize the prefixed getUserMedia variants across browsers.
navigator.getUserMedia = (navigator.getUserMedia ||
                          navigator.webkitGetUserMedia ||
                          navigator.mozGetUserMedia ||
                          navigator.msGetUserMedia);

var audiocontext = new window.AudioContext();

// One analyser per channel; the splitter separates the stereo input.
var analyser_left = audiocontext.createAnalyser();
var analyser_right = audiocontext.createAnalyser();
var splitter = audiocontext.createChannelSplitter(2);

function init_stream(stream){
  window.audiosource = audiocontext.createMediaStreamSource(stream);

  // Splitter output 0 = left channel, output 1 = right channel,
  // each feeding its own analyser.
  audiosource.connect(splitter);
  splitter.connect(analyser_left,0);
  splitter.connect(analyser_right,1);

  listen();
}

function listen(){
  requestAnimationFrame(listen);

  // Dump the raw time-domain bytes of each channel into the page.
  // 128 is the zero line, so a silent channel shows all 128s.
  analyser_left.fftSize = 256;
  var leftBufferLength = analyser_left.frequencyBinCount;
  var leftDataArray = new Uint8Array(leftBufferLength);

  analyser_left.getByteTimeDomainData(leftDataArray);
  $('.monitor_left').html(JSON.stringify(leftDataArray));

  analyser_right.fftSize = 256;
  var rightBufferLength = analyser_right.frequencyBinCount;
  var rightDataArray = new Uint8Array(rightBufferLength);

  analyser_right.getByteTimeDomainData(rightDataArray);
  $('.monitor_right').html(JSON.stringify(rightDataArray));
}

if(navigator.getUserMedia){
  navigator.getUserMedia(
    { audio: true }
    ,function(stream){ init_stream(stream); }
    ,function(err){ console.log('The following gUM error occurred: ' + err); }
  );
}

The final fiddle to test is here: jsfiddle.net/stratboy/aapafrbu

You can see the values changing while you play.

As a web programmer, one thing I didn't expect is that there are no 'onAudioSomething' events to catch, say, when a single note is played on a keyboard. But maybe that's logical, since such events would often be meaningless in a piece of music, as there are usually no 'zero points' in the audio gain. So the way to analyse the source is by polling via requestAnimationFrame().
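
That said, if you really wanted a rough note-on style event, you could derive one by hand inside the polling loop. A sketch (the threshold, names, and logic are arbitrary choices of mine, not part of any API):

// Rough 'onset' sketch: poll the analyser and fire only on the
// quiet -> loud transition. 128 is the zero line for byte data,
// so we measure the largest deviation from it per frame.
var ONSET_THRESHOLD = 10; // arbitrary deviation from silence
var was_quiet = true;

function poll_onsets(){
  requestAnimationFrame(poll_onsets);

  var data = new Uint8Array(analyser_left.frequencyBinCount);
  analyser_left.getByteTimeDomainData(data);

  // Find the largest deviation from the zero line in this frame.
  var peak = 0;
  for(var i = 0; i < data.length; i++){
    peak = Math.max(peak, Math.abs(data[i] - 128));
  }

  if(peak > ONSET_THRESHOLD && was_quiet){
    console.log('note-on-ish event, peak deviation: ' + peak);
  }
  was_quiet = (peak <= ONSET_THRESHOLD);
}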

Hope it helps some other explorer out there :)

Luca Reghellin