I've recently run into the following problem. I use a Core Audio AudioUnit (RemoteI/O) to play and record a sound stream in an iOS app.
The sound stream going into the audio unit is 2-channel LPCM, 16-bit, signed integer, interleaved (I also configure an output recording stream, which is basically the same but has only one channel and 2 bytes per packet and per frame).
I have configured my input ASBD as follows (I get no error when I set it or when I initialize the unit):
ASBD.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked;
ASBD.mBytesPerPacket = 4;
ASBD.mFramesPerPacket = 1;
ASBD.mBytesPerFrame = 4;
ASBD.mChannelsPerFrame = 2;
ASBD.mBitsPerChannel = 16;
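For reference, here is a minimal sketch of the whole structure filled in for this format. The sample rate (44.1 kHz) is an assumption on my part, since it isn't shown above, and mFormatID must of course be kAudioFormatLinearPCM:

#include <AudioToolbox/AudioToolbox.h>

static AudioStreamBasicDescription MakeStereo16ASBD(void) {
    AudioStreamBasicDescription asbd = {0};
    asbd.mSampleRate       = 44100.0;                 // assumed sample rate
    asbd.mFormatID         = kAudioFormatLinearPCM;
    asbd.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger
                           | kLinearPCMFormatFlagIsPacked;
    asbd.mChannelsPerFrame = 2;                       // stereo, interleaved
    asbd.mBitsPerChannel   = 16;
    asbd.mBytesPerFrame    = asbd.mChannelsPerFrame * (asbd.mBitsPerChannel / 8); // 4
    asbd.mFramesPerPacket  = 1;
    asbd.mBytesPerPacket   = asbd.mBytesPerFrame * asbd.mFramesPerPacket;         // 4
    return asbd;
}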
In my render callback function I get an AudioBufferList with one buffer (as I understand it, because the audio stream is interleaved).
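My callback is roughly along these lines (a simplified sketch; PullInterleavedFrames stands in for the code that reads the decoded stream and is not a real API):

#include <AudioToolbox/AudioToolbox.h>

// Hypothetical source supplying interleaved L/R SInt16 samples.
extern void PullInterleavedFrames(SInt16 *dst, UInt32 frameCount);

static OSStatus RenderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    // Interleaved layout: one buffer, mNumberChannels == 2,
    // each frame is 4 bytes (left SInt16 followed by right SInt16).
    SInt16 *out = (SInt16 *)ioData->mBuffers[0].mData;

    // The buffer needs inNumberFrames * 2 samples (an L and an R per frame);
    // for this ASBD its mDataByteSize is inNumberFrames * 4 bytes.
    PullInterleavedFrames(out, inNumberFrames);

    return noErr;
}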
I have a stereo sample file for testing, which is 100% stereo with two clearly distinct channels. I convert it into a stream that matches the ASBD and feed it to the audio unit.
When I play the sample file I hear only the left channel.
I would appreciate any ideas on why this happens. If needed, I can post more code.
Update: I've also tried setting a non-interleaved ASBD:
ASBD.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked | kLinearPCMFormatFlagIsNonInterleaved;
ASBD.mBytesPerPacket = 2;
ASBD.mFramesPerPacket = 1;
ASBD.mBytesPerFrame = 2;
ASBD.mChannelsPerFrame = 2;
ASBD.mBitsPerChannel = 16;
With this I get a buffer list with two buffers. I deinterleaved my stream into two channels (one channel per buffer) and got the same result. I tried both with a headset and with the iPad speaker (I know the speaker is mono).
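The deinterleaving step is roughly like the sketch below (the Deinterleave helper is illustrative, not my actual code):

#include <AudioToolbox/AudioToolbox.h>

// Split interleaved L/R SInt16 samples into two mono buffers.
static void Deinterleave(const SInt16 *interleaved, UInt32 frameCount,
                         SInt16 *left, SInt16 *right)
{
    for (UInt32 i = 0; i < frameCount; i++) {
        left[i]  = interleaved[2 * i];      // even-indexed samples -> channel 0
        right[i] = interleaved[2 * i + 1];  // odd-indexed samples  -> channel 1
    }
}

// In the render callback, ioData->mNumberBuffers is now 2 and each buffer
// holds one mono channel (inNumberFrames * 2 bytes each), e.g.:
//
// Deinterleave(source, inNumberFrames,
//              (SInt16 *)ioData->mBuffers[0].mData,
//              (SInt16 *)ioData->mBuffers[1].mData);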