I am confused about these three concepts
I am working on an audio project on iOS, even though I am a newbie to Objective-C and C. I am trying to compute a real-time FFT of the audio data.
During the process I have reached a point where I am very confused about these three concepts:

- maximumFramesPerSlice
- the inNumberFrames parameter of the render callback
- the preferred I/O hardware buffer duration, which is set on AVAudioSession
I set the buffer duration to 100 ms. While I expect inNumberFrames to be 4410 samples/frames at a 44.1 kHz sampling rate, it is always 470-471 frames on each render callback. While I am still confused about both of these, another variable comes up: maximumFramesPerSlice.
Its default value is 4096. The Core Audio glossary says: "slice: The number of frames requested and processed during one rendering cycle of an audio unit."
So why does each render deliver only 470 frames?
I would be very thankful for any help.