
I've recently run into the following problem. I use a Core Audio AudioUnit (RemoteIO) to play and record a sound stream in an iOS app.

The stream that goes into the audio unit is 2-channel LPCM, 16-bit, signed integer, interleaved. (I also configure an output recording stream, which is basically the same but has only one channel and 2 bytes per packet and per frame.)

I have configured my input ASBD as follows (I get no error when I set it or when I initialize the unit):

ASBD.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked;
ASBD.mBytesPerPacket = 4;
ASBD.mFramesPerPacket = 1;
ASBD.mBytesPerFrame = 4;
ASBD.mChannelsPerFrame = 2;
ASBD.mBitsPerChannel = 16;

In my render callback function I get an AudioBufferList with one buffer (as I understand it, because the audio stream is interleaved).

I have a stereo sample file for testing which is 100% stereo with two clearly distinct channels. I convert it into a stream that matches the ASBD and feed it to the audio unit.

When I play the sample file I hear only the left channel.

I would appreciate any ideas about why this happens. If needed, I can post more code.

Update: I've tried setting

ASBD.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked | kLinearPCMFormatFlagIsNonInterleaved;
ASBD.mBytesPerPacket = 2;
ASBD.mFramesPerPacket = 1;
ASBD.mBytesPerFrame = 2;
ASBD.mChannelsPerFrame = 2;
ASBD.mBitsPerChannel = 16;

in the ASBD, and got an AudioBufferList with two buffers. I deinterleaved my stream into two channels (one channel per buffer) and got the same result. I tried with a headset and with the iPad speaker (I know the speaker is mono).

user1264176
  • Would you mind posting your source code? How did you deinterleave your stream into 2 channels? – Surz Jun 30 '17 at 16:41

1 Answer


OK, so I've checked my code and spotted that I actually use a VoiceProcessingIO audio unit (instead of the RemoteIO mentioned in the question), which is basically correct for my app, since the documentation says: "The Voice-Processing I/O unit (subtype kAudioUnitSubType_VoiceProcessingIO) has the characteristics of the Remote I/O unit and adds echo suppression for two-way duplex communication. It also adds automatic gain correction, adjustment of voice-processing quality, and muting."

When I changed the audio unit type to RemoteIO, I immediately got stereo playback. I didn't have to change the stream properties.

Basically, the VoiceProcessingIO audio unit falls back to mono and disregards the stream properties.

I've posted a question on the Apple Developer forum about stereo output with the VoiceProcessingIO audio unit but haven't gotten an answer yet.

It seems pretty logical to me to fall back to mono in order to do signal processing like echo cancellation, because iOS devices can record only mono sound without specific external accessories. However, this is not documented anywhere by Apple. I've also come across a post by someone who claimed that stereo worked with the VoiceProcessingIO AU prior to iOS 5.0.

Anyway, thanks for your attention. Any other comments on the matter would be greatly appreciated.

user1264176
  • I have the same problem. I have a stereo file which is 100% stereo with two obvious channels. When I play the sample file I hear only one channel. Can you suggest which code I need to write to be able to play one channel of the stereo? Thanks – Mr. Frank Jul 25 '13 at 11:54
  • You can take a look at this question. In the comments there is a link to the source code. http://stackoverflow.com/questions/10823322/control-mono-playback-output-with-core-audio – user1264176 Jul 26 '13 at 07:50