I'm doing pitch detection using a combination of an ACF (autocorrelation function) and an AMDF (average magnitude difference function).
First I computed the ACF in the time domain like this:
Get a buffer of 2048 samples
Window it (Hamming window)
sum = Sum(buffer[i] * buffer[i+lag]) for all i < 2048 - lag
acf = sum / 2048
Then repeat the last two steps for every lag under consideration (interpolating for non-integer lags).
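Here's roughly what that looks like as runnable code. This is just an illustrative Python/NumPy sketch (my pseudocode isn't tied to a language); it handles integer lags only, so the interpolation step is omitted:

```python
import numpy as np

def acf_time_domain(samples, lag):
    """Windowed time-domain autocorrelation at a single integer lag."""
    n = len(samples)                     # e.g. 2048
    windowed = samples * np.hamming(n)   # Hamming window
    # sum = Sum(buffer[i] * buffer[i+lag]) for all i < n - lag
    s = np.dot(windowed[:n - lag], windowed[lag:])
    return s / n                         # divide by the buffer size
```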
Now I found that the ACF can also be computed with an FFT:
Get a buffer of 2048 samples
Window it (Hamming window)
fftBuf = fft(buffer)
buffer[i] = real(fftBuf[i])^2 + imag(fftBuf[i])^2
fftBuf = fft(buffer) // ifft = fft for real signals
acfBuf = real(fftBuf) / 2048
Then acfBuf[lag] is the ACF value at that lag.
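Again as an illustrative NumPy sketch, following the steps above literally (including the second forward FFT where one might expect an inverse FFT):

```python
import numpy as np

def acf_via_fft(samples):
    """ACF for all lags at once, computed exactly as in the steps above."""
    n = len(samples)                            # e.g. 2048
    windowed = samples * np.hamming(n)          # Hamming window
    fft_buf = np.fft.fft(windowed)              # fftBuf = fft(buffer)
    power = fft_buf.real**2 + fft_buf.imag**2   # real^2 + imag^2
    fft_buf = np.fft.fft(power)                 # fft again ("ifft = fft for real signals")
    return fft_buf.real / n                     # acfBuf = real(fftBuf) / 2048
```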
I expected the results to be the same, or at least similar, but they are not. For example, for a 65.4 Hz sine wave (note C2) I get ~0.2 at the corresponding lag of 674.25 with the time-domain approach, but ~536.795 with the FFT approach.
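A small driver exercising the two sketches above should reproduce the discrepancy (assumptions: a 44.1 kHz sample rate, which matches the quoted lag of 674.25 ≈ 44100 / 65.4, a unit-amplitude sine, and the nearest integer lag 674 since the sketches skip interpolation):

```python
import numpy as np

fs = 44100.0                        # assumed sample rate
f0 = 65.4                           # note C2
n = 2048

t = np.arange(n) / fs
sine = np.sin(2 * np.pi * f0 * t)

lag = 674                           # nearest integer to 44100 / 65.4
print(acf_time_domain(sine, lag))   # roughly 0.2
print(acf_via_fft(sine)[lag])       # in the hundreds, nowhere near 0.2
```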
What did I miss? Or aren't the two approaches equivalent?