AudioComponentDescription defaultOutputDescription;
defaultOutputDescription.componentType = kAudioUnitType_Output;
defaultOutputDescription.componentSubType = kAudioUnitSubType_RemoteIO;
defaultOutputDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
defaultOutputDescription.componentFlags = 0;
defaultOutputDescription.componentFlagsMask = 0;

// Get the default playback output unit
AudioComponent defaultOutput = AudioComponentFindNext(NULL, &defaultOutputDescription);
NSAssert(defaultOutput, @"Can't find default output");

// Create a new unit based on this that we'll use for output
OSStatus err = AudioComponentInstanceNew(defaultOutput, &toneUnit);
NSAssert1(err == noErr, @"Error creating unit: %d", (int)err);


// Set our tone rendering function on the unit
AURenderCallbackStruct input;
input.inputProc = RenderTone;
input.inputProcRefCon = self;
err = AudioUnitSetProperty(toneUnit, 
    kAudioUnitProperty_SetRenderCallback, 
    kAudioUnitScope_Input,
    0, 
    &input, 
    sizeof(input));

NSAssert1(err == noErr, @"Error setting callback: %d", (int)err);

// Set the format to 32 bit, single channel, floating point, linear PCM
const int four_bytes_per_float = 4;
const int eight_bits_per_byte = 8;
AudioStreamBasicDescription streamFormat;
streamFormat.mSampleRate = sampleRate;
streamFormat.mFormatID = kAudioFormatLinearPCM;
streamFormat.mFormatFlags =
    kAudioFormatFlagsNativeFloatPacked | kAudioFormatFlagIsNonInterleaved;
streamFormat.mBytesPerPacket = four_bytes_per_float;
streamFormat.mFramesPerPacket = 1;  
streamFormat.mBytesPerFrame = four_bytes_per_float;     
streamFormat.mChannelsPerFrame = 1; 
streamFormat.mBitsPerChannel = four_bytes_per_float * eight_bits_per_byte;

err = AudioUnitSetProperty(toneUnit,
                           kAudioUnitProperty_StreamFormat,
                           kAudioUnitScope_Input,
                           0,
                           &streamFormat,
                           sizeof(AudioStreamBasicDescription));
NSAssert1(err == noErr, @"Error setting stream format: %d", (int)err);

I am now using the AudioUnit render callback to generate sounds at different frequencies and amplitudes. I want the sound to move between the ears, like the pan property of AVAudioPlayer. To do this I tried setting kMultiChannelMixerParam_Pan with AudioUnitSetParameter(toneUnit, kMultiChannelMixerParam_Pan, kAudioUnitScope_Output, 0, sender.value, 0), but it has no effect.

  • It's very hard to read your code, and you give little explanation of what is going wrong or what you need to accomplish (none at all, to be precise). Personally, I wouldn't use a mono channel; I would use a stereo output and silence either the left or right channel as needed. – MDB983 Jun 11 '15 at 13:30

3 Answers


Doesn't setting the pan value (kMultiChannelMixerParam_Pan) on a mixer unit do it? There is also a kAudioUnitType_Panner unit you can add to the graph.

Gene De Lisa
  • I have already tried this change, but it has no effect. Here is exactly what I am doing: I have a slider with a range from -1 to 1 for pan, and when its value changes I call - (IBAction)panChanged:(UISlider *)sender { OSStatus result = AudioUnitSetParameter(toneUnit, kAudioUnitType_Panner, kAudioUnitScope_Output, 0, sender.value, 0); } – Rahul Narang Jun 12 '15 at 05:37
  • What is `toneUnit`? If it's a mixer, then the parameter you're setting needs to be `kMultiChannelMixerParam_Pan` – Gene De Lisa Jun 12 '15 at 13:36

I used the following code.

 AudioUnitSetParameter(<mixerUnit>,
                       kMultiChannelMixerParam_Pan,
                       kAudioUnitScope_Input,
                       0,
                       <panValue>,
                       0);

Audio unit type: kAudioUnitType_Mixer

Subtype: kAudioUnitSubType_MultiChannelMixer

Clement Prem

I've found that setting the pan on bus 0 does not work until after the mixer audio unit has been initialized. Setting the volume on bus 0 works before initialization, and setting the pan on bus 1 also works before it. I don't know why. I'm on iOS, if that matters.

robotcoder