I'm currently working on a web-based collaborative recording platform for musicians, something like a basic DAW ported to the web (with extra social/sharing features). Anyway, my goal is to make it 100% Flash-free, so I've been reading a lot about HTML5 and, in particular, the Web Audio API (this book helped a lot, btw).
To record audio from the user's microphone via getUserMedia(), I made a custom version of RecorderJS. In a nutshell, I route the output of getUserMedia() to a ScriptProcessorNode which, every 4096 samples, writes the contents of its inputBuffer to an array that is later exported to a PCM WAV file. So far, this works fine.
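Simplified, the recording path looks roughly like this (a sketch, not my exact code: `recordedChunks` and the promise-based `getUserMedia` call are just for illustration, and I'm assuming a stereo input):

```javascript
var audioContext = new AudioContext();
var recordedChunks = [];
var recording = false;

navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
  var micSource = audioContext.createMediaStreamSource(stream);
  var processor = audioContext.createScriptProcessor(4096, 2, 2);

  processor.onaudioprocess = function (e) {
    if (!recording) return;
    // The input buffer is reused between callbacks, so each block
    // has to be copied out before the next one arrives.
    recordedChunks.push([
      new Float32Array(e.inputBuffer.getChannelData(0)),
      new Float32Array(e.inputBuffer.getChannelData(1))
    ]);
  };

  micSource.connect(processor);
  // The node must be connected for onaudioprocess to fire;
  // its output buffer is left silent, so nothing is monitored through it.
  processor.connect(audioContext.destination);
});
```

When the recording stops, the chunks are concatenated, interleaved, and written out with a WAV header, much like RecorderJS does.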
The problem is that starting the recording procedure involves two things: playing all the previously recorded tracks, so the musician has a reference to play on top of, and starting the actual recording (that is, beginning to write the buffer to the array).
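In code, the start procedure is essentially this (`startOverdub` and `trackBuffers` are hypothetical names; `trackBuffers` would hold the previous takes decoded into AudioBuffers):

```javascript
function startOverdub(trackBuffers) {
  // Play every previously recorded track as a reference...
  trackBuffers.forEach(function (buffer) {
    var trackSource = audioContext.createBufferSource();
    trackSource.buffer = buffer;
    trackSource.connect(audioContext.destination);
    trackSource.start(0);
  });
  // ...and, at the "same" time, start capturing the microphone.
  recording = true;
}
```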
Although there is no audible latency in the microphone monitoring while the user records, when the recording ends and all the tracks are played back together, the newly recorded track is slightly delayed relative to the others.
What could be causing this, and what are the possible solutions?
I thought I could measure the time difference between the two events, and compensate for it, by also routing the playback into the same processor node and detecting when each signal actually begins. For this, I would need the ScriptProcessorNode to receive, for example, the getUserMedia stream on channels 1 and 2 and the playback on channels 3 and 4, but I can't make this work. I tried connecting both sources directly to the processor node, and I also tried a ChannelMerger/ChannelSplitter combination, but nothing seems to work: everything reaches the processor node on channels 1 and 2, while channels 3 and 4 come up empty.
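Here is roughly what I tried with the merger/splitter approach (`micSource` and `playbackSource` stand in for my MediaStreamSource and mixed playback nodes; channel indices are 0-based in the API):

```javascript
var merger = audioContext.createChannelMerger(4);
var micSplit = audioContext.createChannelSplitter(2);
var playSplit = audioContext.createChannelSplitter(2);
var tap = audioContext.createScriptProcessor(4096, 4, 4);

micSource.connect(micSplit);
playbackSource.connect(playSplit);

// Mic on merger inputs 0-1, playback on inputs 2-3.
micSplit.connect(merger, 0, 0);
micSplit.connect(merger, 1, 1);
playSplit.connect(merger, 0, 2);
playSplit.connect(merger, 1, 3);

merger.connect(tap);
tap.connect(audioContext.destination);

tap.onaudioprocess = function (e) {
  var mic = e.inputBuffer.getChannelData(0);      // has audio, as expected
  var playback = e.inputBuffer.getChannelData(2); // always all zeros
};
```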
Sorry if this is off-topic or the code above is too simplified (I'm more than happy to provide the actual code if necessary), but there isn't much written about this area, so any ideas would be very welcome.
Thanks in advance!