
I'm building an iOS app using the SuperpoweredFrequencies project as an example, and everything is working well. I've increased the number of bands to 55 and experimented with band widths of 1/12 and 1/24 of an octave to tighten the filtering range around the individual frequencies in question.
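
For reference, here is a minimal sketch of how such a band layout can be generated: 55 semitone-spaced centers with a 1/12-octave width each. The starting note (A 110), the band count, and the array names are illustrative assumptions, not code taken from the example project.

    #include <math.h>
    #include <stdio.h>

    // Sketch: 55 semitone-spaced band centers starting at A 110 Hz,
    // each 1/12 of an octave wide (use 1.0f / 24.0f for tighter bands).
    // These arrays would then be handed to the bandpass filterbank setup.
    static const int NUM_BANDS = 55;

    int main() {
        float frequencies[NUM_BANDS], widths[NUM_BANDS];
        for (int n = 0; n < NUM_BANDS; n++) {
            frequencies[n] = 110.0f * powf(2.0f, n / 12.0f); // equal-tempered steps
            widths[n] = 1.0f / 12.0f;
        }
        for (int n = 0; n < NUM_BANDS; n++) printf("band %2d: %8.2f Hz\n", n, frequencies[n]);
        return 0;
    }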

I've noticed something when testing with a musical instrument: when I play lower notes, starting at approximately A 110, the amplitudes of those frequencies register much lower than when I play higher notes, say A 220 or A 440. This makes detecting the fundamental frequency more difficult when lower notes are played, because it often appears as if I am playing the note an octave higher (the harmonic frequencies show up more prominently than the fundamental frequency for lower notes).

Can someone shed some light on this phenomenon? It doesn't appear to be due to the iPhone's mic, because the same thing happens when testing on both my iMac and MacBook. Is there a way of dealing with this issue using Superpowered's API so that the fundamental frequency can be detected when lower notes are being played?

Correction: I was testing a little more this morning with a guitar, and what I noticed is that for the low E (82.4069 Hz) and F (87.3071 Hz) the fundamental frequencies (82.xxx and 87.xxx Hz) register less prominently than the perfect fifth above them, B and B# (i.e., C) respectively.
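
As a quick sanity check of the note arithmetic (a hypothetical snippet, not part of the app): the third harmonic of E at 82.4069 Hz lands at about 247.2 Hz, essentially B (246.94 Hz), and the third harmonic of F lands near C, so an analysis that misses the fundamental will naturally light up that "fifth" pitch class an octave and a fifth up.

    #include <math.h>
    #include <stdio.h>

    // Which equal-tempered note does each harmonic of a low guitar string
    // land on? (A4 = 440 Hz reference; purely illustrative.)
    static const char *NAMES[12] = {"C","C#","D","D#","E","F","F#","G","G#","A","A#","B"};

    static void nearestNote(float hz, char *out, size_t len) {
        int midi = (int)roundf(69.0f + 12.0f * log2f(hz / 440.0f)); // nearest MIDI note
        snprintf(out, len, "%s%d", NAMES[midi % 12], midi / 12 - 1);
    }

    int main() {
        const float fundamentals[2] = {82.4069f, 87.3071f}; // low E and F
        for (int i = 0; i < 2; i++) {
            for (int h = 1; h <= 4; h++) {
                float hz = fundamentals[i] * h;
                char name[8];
                nearestNote(hz, name, sizeof(name));
                printf("f0 %.2f Hz, harmonic %d: %8.2f Hz ~ %s\n", fundamentals[i], h, hz, name);
            }
        }
        return 0;
    }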

Maybe it is just due to the nature of the guitar as an instrument. Unfortunately I don't have a piano to test with. How do the responses look when playing the low notes on a piano?

Sean
  • With which musical instrument did you try it? Do other instruments also give the same result? – Emre Önder Jul 21 '17 at 15:51
  • It was a guitar. That's a great question, though; I don't have access to a piano. I have a ukulele and a flute, but they don't go low enough. Have you been able to test this with a piano? – Sean Jul 21 '17 at 16:04
  • I haven't tried any of it. I have some music knowledge and know iOS, saw your question, and tried to help :) – Emre Önder Jul 22 '17 at 11:10
  • Thanks. I'm experimenting now with different ways of boosting the lower frequencies when using my guitar. I'll hopefully find a way to test with a piano soon. I'll probably use an on-screen MIDI piano with a SoundFont. This should let me know if this is just an artifact of using a guitar. – Sean Jul 22 '17 at 13:34

1 Answer


The sensitivity of the iPhone's microphone may be lower in that region: https://blog.faberacoustical.com/2009/ios/iphone/iphone-microphone-frequency-response-comparison/. That's why the harmonics may be picked up at a higher volume than the fundamental.
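
One rough way to compensate, also suggested in the comments below, is to equalise the input before the analysis runs. Here is a minimal sketch assuming a generic low-shelf biquad (RBJ audio EQ cookbook formulas) applied to the mono buffer ahead of the filterbank; the corner frequency and gain are placeholders to tune against a measured microphone response, and nothing here is a Superpowered-specific call.

    #include <math.h>

    // Low-shelf biquad that boosts the region below cornerHz by gainDb
    // before the band analysis runs.
    struct LowShelf {
        float b0, b1, b2, a1, a2;               // normalised coefficients
        float x1 = 0, x2 = 0, y1 = 0, y2 = 0;   // filter state

        LowShelf(float samplerate, float cornerHz, float gainDb) {
            float A = powf(10.0f, gainDb / 40.0f);
            float w0 = 6.2831853f * cornerHz / samplerate;
            float alpha = sinf(w0) * 0.5f * sqrtf(2.0f); // shelf slope S = 1
            float cosw0 = cosf(w0), sqrtA = sqrtf(A);
            float a0 = (A + 1) + (A - 1) * cosw0 + 2 * sqrtA * alpha;
            b0 = A * ((A + 1) - (A - 1) * cosw0 + 2 * sqrtA * alpha) / a0;
            b1 = 2 * A * ((A - 1) - (A + 1) * cosw0) / a0;
            b2 = A * ((A + 1) - (A - 1) * cosw0 - 2 * sqrtA * alpha) / a0;
            a1 = -2 * ((A - 1) + (A + 1) * cosw0) / a0;
            a2 = ((A + 1) + (A - 1) * cosw0 - 2 * sqrtA * alpha) / a0;
        }

        // Filter a mono buffer in place, then run the filterbank on the result.
        void process(float *buffer, int numSamples) {
            for (int i = 0; i < numSamples; i++) {
                float x = buffer[i];
                float y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2;
                x2 = x1; x1 = x;
                y2 = y1; y1 = y;
                buffer[i] = y;
            }
        }
    };

    // Example: +9 dB below roughly 160 Hz at a 44.1 kHz sample rate.
    // LowShelf shelf(44100.0f, 160.0f, 9.0f);

Keep in mind that any boost also raises the noise floor in that band, so a modest gain is usually enough.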

Gabor Szanto
  • This was definitely the cause. Right now my solution only detects the pitch, not the octave. I'm not sure what the solution to that is because I don't understand the other methods of doing this. Is there another solution using Superpowered that would detect the octave? I'm not sure what direction to head in because I'm only an amateur with DSP technology. – Sean Aug 29 '17 at 16:32
  • If the microphone doesn't pick up the lower frequencies, "recovering" those is not really possible. Perhaps equalise the microphone input first, with those typical frequency responses in mind. – Gabor Szanto Aug 29 '17 at 18:05
  • Thanks. I spent quite a bit of time trying out different approaches to pseudo-recovery, and you're right, I never accomplished anything that was satisfactory. Thanks for the tip. – Sean Aug 29 '17 at 18:52