I've noticed audible gaps in Audio Unit rendering when something heavy happens on the main (UI) thread, for instance the first presentation of a UIAlertView or a heavyweight view controller.
Please take a look at the sample project. It contains a single RemoteIO unit with an output render callback, and the buffer duration is set to the maximum possible, about 92 ms. When the alert view loads, there is a 200-300 ms gap between render callbacks, even when the render function is empty.
The question is: how do I match the rendering performance of Apple's MPMusicPlayerController with Audio Units? Is that even possible, or is Apple's player built on a lower-level API? Looking through the Core Audio mailing lists got me nowhere.
Thanks in advance!