I have a single-channel waveform coming in at an 8000 Hz sampling rate. I need to analyze frequencies between 5 Hz and 300 Hz in real time, with emphasis on the 10–60 Hz band.
My initial thought is to feed the 8000 Hz stream into a buffer until I've collected about 32000 samples (roughly four seconds of audio), then run a Fourier transform with a 32000-sample window over it.
The reasoning here is that lower-frequency signals need a larger window to resolve (right?).
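To make that concrete, here's the bin-width arithmetic I'm basing the 32000 figure on. I'm assuming the window would get rounded up to the nearest power of two, 32768, if I stick with the analyser node:

```js
// Bin-width check (assumes the window is rounded up to a power of two,
// 32768, since that's what the analyser node would require).
const sampleRate = 8000;
const fftSize = 32768;                    // about 4.1 seconds of audio
const binWidth = sampleRate / fftSize;    // ~0.24 Hz per bin
console.log(binWidth);                    // 0.244140625
console.log(5 / binWidth);                // ~20.5, so 5 Hz lands around bin 20
console.log(300 / binWidth);              // ~1229, so 300 Hz lands around bin 1229
```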
However, if I'm trying to display this signal in real time, the AnalyserNode might not be a good choice. I know the Web Audio API would let me get at the raw data, but ideally the AnalyserNode would re-run the FFT over the previous 32000 samples whenever a smaller batch of new samples becomes available. As it stands, the FFT data only seems to update once every four seconds.
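Here is roughly the wiring I have at the moment. It's a sketch: `source` is a placeholder for whatever node the 8000 Hz stream actually comes from, and I'm assuming the browser honors the sampleRate option on the context:

```js
// Rough sketch of my current setup; `source` stands in for the real input
// node (MediaStreamAudioSourceNode, AudioBufferSourceNode, etc.).
const ctx = new AudioContext({ sampleRate: 8000 }); // assuming this is honored
const analyser = ctx.createAnalyser();
analyser.fftSize = 32768;                 // largest power of two the node allows
source.connect(analyser);

const freqData = new Float32Array(analyser.frequencyBinCount); // 16384 bins

function draw() {
  analyser.getFloatFrequencyData(freqData); // magnitudes in dB per bin
  // ...plot the low bins covering the 5-300 Hz range...
  requestAnimationFrame(draw);
}
requestAnimationFrame(draw);
```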
Do I have to maintain a special "running bin" of my own so that the display updates more often than once every four seconds? Or, what's the smallest window size that still gives reasonable values in this range? Is 32000 samples large enough? By "running bin" I mean something like the sketch below.
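The idea would be to keep the most recent window of raw samples around and re-run an FFT every time a small chunk arrives. Here `runFFT` is a placeholder for whatever FFT routine ends up doing the work, not a real Web Audio call:

```js
// "Running bin" sketch: a ring buffer holding the last 32768 raw samples,
// transformed again each time a small chunk (e.g. ~1000 samples) arrives.
const WINDOW = 32768;
const ring = new Float32Array(WINDOW);
let writePos = 0;

function pushChunk(chunk) {              // chunk: Float32Array of new samples
  for (let i = 0; i < chunk.length; i++) {
    ring[writePos] = chunk[i];
    writePos = (writePos + 1) % WINDOW;
  }
  // Unroll the ring into chronological order before transforming.
  const windowed = new Float32Array(WINDOW);
  windowed.set(ring.subarray(writePos), 0);
  windowed.set(ring.subarray(0, writePos), WINDOW - writePos);
  runFFT(windowed);                      // placeholder FFT routine
}
```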
I am using the Web Audio API's AnalyserNode in JavaScript, but if I have to work with the raw data, I'm willing to switch to another JavaScript library.
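For what it's worth, this is how I'd expect to pull the raw samples out without switching libraries, by polling the analyser's time-domain buffer. Again a sketch: I haven't checked whether polling on a fixed interval drops or repeats samples:

```js
// Polling the raw time-domain samples from the same analyser node.
// The 125 ms interval is a guess (about 1000 new samples at 8 kHz);
// I haven't verified it stays in sync with the incoming stream.
const timeData = new Float32Array(analyser.fftSize);
setInterval(() => {
  analyser.getFloatTimeDomainData(timeData); // last fftSize raw samples
  // ...push the newest slice into the ring buffer sketched above...
}, 125);
```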