Wave file: 44100 Hz, 16 bit, dual channel.
I use an FFT to calculate the magnitude at each frequency bin of the output, but I don't know how to scale it to draw a (real-time) spectrum.
Can anybody help me?
well; there are a number of ways to do this...
for instance: if you want a dB scale, for each complex output sample, compute the squared magnitude (power):
ymag = (x.real^2 + x.imag^2)
you'll only want to go through half the array because you want the positive frequencies; with real data fed to the FFT, the second half is just the complex-conjugate mirror of the first.
search through the resulting values for the minimum and maximum and store them. if your minimum value is zero, substitute some very small value instead (0.000001 or so). then set your minimum dB value as mindB = 10 * log10(minimum / maximum), so it's on the same relative-to-max scale as the per-bin values below.
now, the first value returned (sample[0]) will be your DC offset, which you will probably want to set to zero.
then, for each sample, compute: ydB = 10 * log10(ymag / maximum).
this gives you an array of dB-down-from-max values, one per bin. you can scale this to whatever range you need; if your plot area goes from y=5 to y=200 you could use something like:
yscaled = ((ydB / -mindB) * (200 - 5) + 200)
i would also clamp the scaled value to the plot bounds in case of FP roundoff error:
yscaled = min(max(yscaled, 5),200)
it's been a while since i did this so i apologize if there are any math errors. :)
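the steps above can be sketched roughly like this (a minimal sketch assuming NumPy; the frame size, plot bounds y=5..200, and the `floor` value are illustrative choices, not part of the recipe):

```python
import numpy as np

def spectrum_to_pixels(frame, y_top=5, y_bottom=200, floor=1e-6):
    """Map one audio frame to pixel y-coordinates for a dB spectrum plot."""
    spec = np.fft.rfft(frame)                  # positive-frequency half only
    ymag = spec.real ** 2 + spec.imag ** 2     # squared magnitude (power) per bin
    ymag[0] = 0.0                              # zero out the DC bin
    maximum = ymag.max()
    if maximum <= 0.0:
        return np.full(ymag.shape, float(y_top))   # silent frame: flat line
    positive = ymag[ymag > 0]
    minimum = max(positive.min(), floor * maximum)
    min_db = 10.0 * np.log10(minimum / maximum)    # most-negative dB value
    if min_db >= 0.0:
        min_db = -120.0                        # degenerate frame: arbitrary floor
    ydb = 10.0 * np.log10(np.maximum(ymag, minimum) / maximum)
    yscaled = (ydb / -min_db) * (y_bottom - y_top) + y_bottom
    return np.clip(yscaled, y_top, y_bottom)   # guard against FP roundoff

# example: one 1024-sample frame of a 1 kHz sine at 44100 Hz
t = np.arange(1024) / 44100.0
pixels = spectrum_to_pixels(np.sin(2 * np.pi * 1000.0 * t))
```

the loudest bin lands exactly on y_bottom (200) and the quietest on y_top (5); flip the two arguments if your plot's y-axis runs the other way.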
Different FFT implementations have different scale factors, perhaps differing by N, 1/N, or 1/sqrt(N), where N is the length of the FFT. For at least one kind of signed integer input FFT, max scale is around sqrt(2) * N * 2^(b - 1), where b is the number of bits to the left of the decimal point (16 in your case, maybe 17 if you sum the channels into a larger data type before the FFT).
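as a concrete illustration of one library's convention (NumPy's `np.fft.fft`, which applies no 1/N normalization on the forward transform), a sine of amplitude A landing exactly on bin k comes out with |X[k]| = A * N / 2:

```python
import numpy as np

N = 1024
A = 32767.0                              # full-scale 16-bit amplitude
k = 10                                   # an exact bin frequency
x = A * np.sin(2 * np.pi * k * np.arange(N) / N)
X = np.fft.fft(x)                        # unnormalized forward transform
peak = np.abs(X[k])
print(peak, A * N / 2)                   # these agree up to FP error
```

running the same check against your own FFT routine tells you which of the N, 1/N, or 1/sqrt(N) conventions it uses, and therefore what maximum value to normalize against.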