I found some libraries that synthesize instruments with the Web Audio API.
One of them (Band.js) uses createOscillator() in combination with the oscillator type (sine, square, ...), see source.
But it sounds too synthetic (example to listen). I want something that sounds more realistic, but I don't want to use any precompiled soundfonts, so it should be purely synthesized. It should also work on a mobile device.
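For reference, the plain-oscillator approach boils down to something like this (my own minimal sketch, not code from Band.js); a bare waveform with a simple envelope is exactly what sounds so synthetic to me:

```js
// Minimal sketch of the plain-oscillator approach (my own example, not Band.js code).
const ctx = new (window.AudioContext || window.webkitAudioContext)();

function playNote(frequency, duration) {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();

  osc.type = 'square';              // 'sine', 'square', 'sawtooth', 'triangle'
  osc.frequency.value = frequency;

  // crude attack/release so the note doesn't click
  const now = ctx.currentTime;
  gain.gain.setValueAtTime(0, now);
  gain.gain.linearRampToValueAtTime(0.5, now + 0.02);
  gain.gain.linearRampToValueAtTime(0, now + duration);

  osc.connect(gain);
  gain.connect(ctx.destination);
  osc.start(now);
  osc.stop(now + duration);
}

playNote(440, 1.0); // A4 for one second
```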
So I found another library (musical.js) which uses the first 32 harmonics as a matrix in combination with createPeriodicWave, see source. The timbre is awesome; you can listen to it.
As written in a comment in the source code, the harmonics are taken from this piano sample file. There are many more sample files of other instruments. I tried replacing the harmonics, even all 2000 of them, but it always sounds like a piano.
There are also some values that adjust and interpolate the harmonics, plus ADSR values. Maybe they are only optimized for a piano sound?
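As far as I understand it, the periodic-wave approach works roughly like this (simplified sketch with made-up harmonic amplitudes, not the actual musical.js code). Note that a single static wave cannot change its spectrum over time, which may be where those interpolation and ADSR values come in:

```js
// Simplified sketch of the periodic-wave approach (placeholder values, not musical.js itself).
const ctx = new (window.AudioContext || window.webkitAudioContext)();

// Amplitude of each harmonic; index 0 is the DC term and is ignored by the API.
const harmonics = [0, 1.0, 0.4, 0.2, 0.1, 0.05];
const real = new Float32Array(harmonics.length);   // cosine terms (all zero here)
const imag = Float32Array.from(harmonics);         // sine terms

const wave = ctx.createPeriodicWave(real, imag);

const osc = ctx.createOscillator();
osc.setPeriodicWave(wave);
osc.frequency.value = 220;
osc.connect(ctx.destination);
osc.start();
osc.stop(ctx.currentTime + 1);
```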
Then I found another library (guitar-synth) which has a really nice guitar timbre, listen to it. But this library doesn't use the createPeriodicWave API at all. Instead it uses createScriptProcessor and getChannelData in combination with some "simple" calculations, but nothing like the harmonics in the other library, see source.
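To illustrate what that kind of sample-by-sample synthesis looks like, here is a plucked-string sketch using the classic Karplus-Strong technique. I'm not claiming this is what guitar-synth actually computes; it only shows the general createScriptProcessor / getChannelData pattern:

```js
// Sample-by-sample synthesis inside a ScriptProcessor node.
// The "pluck" is the classic Karplus-Strong algorithm, used here only to
// illustrate the technique; it is not taken from the guitar-synth source.
const ctx = new (window.AudioContext || window.webkitAudioContext)();

function pluck(frequency) {
  const period = Math.round(ctx.sampleRate / frequency);
  // delay line seeded with noise: the initial "pluck"
  const delay = Float32Array.from({ length: period }, () => Math.random() * 2 - 1);
  let pos = 0;

  const node = ctx.createScriptProcessor(4096, 1, 1);
  node.onaudioprocess = function (e) {
    const out = e.outputBuffer.getChannelData(0);
    for (let i = 0; i < out.length; i++) {
      // averaging two adjacent samples acts as a low-pass filter,
      // so the string decays and gets mellower over time
      const next = (pos + 1) % period;
      delay[pos] = 0.498 * (delay[pos] + delay[next]);
      out[i] = delay[pos];
      pos = next;
    }
  };
  node.connect(ctx.destination);
  return node; // call node.disconnect() once the note has died away
}

pluck(196); // roughly a G3 string
```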
So my main question: can the guitar synthesizer be ported to use the createPeriodicWave API?
I want to use the guitar timbre in musical.js, so that I can switch between the piano timbre and the guitar timbre.
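To make clear what I mean by "ported": I imagine taking one period of the time-domain signal and turning it into the real/imag arrays that createPeriodicWave expects, roughly like the untested sketch below. What I don't know is whether a static wavetable like this can keep the guitar character, since it loses the way the sound evolves over time; that is essentially my question.

```js
// Rough idea of what I mean by "porting": take one period of the time-domain
// signal (e.g. captured from the guitar synth) and turn it into the real/imag
// arrays that createPeriodicWave expects. Untested sketch of my own.
function periodToPeriodicWave(ctx, periodSamples, numHarmonics) {
  const N = periodSamples.length;
  const real = new Float32Array(numHarmonics + 1); // cosine coefficients
  const imag = new Float32Array(numHarmonics + 1); // sine coefficients

  for (let k = 1; k <= numHarmonics; k++) {
    let re = 0;
    let im = 0;
    for (let n = 0; n < N; n++) {
      const phase = (2 * Math.PI * k * n) / N;
      re += periodSamples[n] * Math.cos(phase);
      im += periodSamples[n] * Math.sin(phase);
    }
    real[k] = (2 / N) * re;
    imag[k] = (2 / N) * im;
  }
  return ctx.createPeriodicWave(real, imag);
}
```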
BTW: I found another library which synthesizes instrument sounds. Here is the demo and here the source. The sound is also nice, but the musical.js library has a much more beautiful timbre. It also looks like it uses something similar to getChannelData, just encoded as WAVE, and it doesn't work on my Android mobile device.