I'm trying to use a C library (Aubio) to perform beat detection on music playing from a MediaPlayer in Android. To capture the raw audio data, I'm using a Visualizer, which sends a byte buffer to a callback function at regular intervals; the callback in turn passes the buffer to the C library through JNI.
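For reference, my capture setup looks roughly like this (the `beatDetect` native method is a placeholder name for my JNI bridge into Aubio; the Visualizer calls themselves are the standard API):

```java
import android.media.audiofx.Visualizer;

public class BeatCapture {
    // Hypothetical JNI bridge into the Aubio wrapper (placeholder name).
    private static native void beatDetect(byte[] waveform, int samplingRate);

    public static Visualizer start(int audioSessionId) {
        // Attach to the MediaPlayer's session (mediaPlayer.getAudioSessionId()).
        Visualizer visualizer = new Visualizer(audioSessionId);
        // Use the largest supported capture size.
        visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
        visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
            @Override
            public void onWaveFormDataCapture(Visualizer v, byte[] waveform,
                                              int samplingRate) {
                // Forward the raw waveform buffer to the native beat detector.
                beatDetect(waveform, samplingRate);
            }

            @Override
            public void onFftDataCapture(Visualizer v, byte[] fft,
                                         int samplingRate) {
                // Unused: only waveform capture is requested below.
            }
        }, Visualizer.getMaxCaptureRate(), true /* waveform */, false /* fft */);
        visualizer.setEnabled(true);
        return visualizer;
    }
}
```

(Note that the `samplingRate` passed to the callback is in milliHertz, per the Visualizer documentation.)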
I'm getting inconsistent results: almost no beats are detected, and the few that are don't really line up with the audio. I've checked my code multiple times and, while I can't entirely rule out a mistake on my end, I'm wondering how exactly the Android Visualizer behaves, since the documentation isn't explicit about it.
- If I set the buffer size using setCaptureSize, does that mean the captured buffer is averaged (downsampled) from the complete audio? For instance, if I halve the capture size, will the buffer still represent the same captured sound, just with half the resolution on the time axis?
- Is it the same with the capture rate? For instance, does doubling the capture size while halving the rate yield the same data?
- Are the captures consecutive? In other words, if I take too long to process one capture, is the audio played during that processing ignored when I receive the next capture?
Thanks for your insight!