
I am using the audio mixer sample created by Apple, and alongside it I am starting an Audio Unit recorder to record.

Playing audio files through the mixer and recording with the Audio Unit happen simultaneously, so I initialize the audio session using AudioSessionInitialize.

The first time I initialize everything, the mixer and recorder work fine. But when I try to play audio files and record from another view controller, the mixer no longer produces any volume.

I am stuck here; if anyone has faced this issue, please help me out.
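For context, the session setup described above looks roughly like this. This is an illustrative sketch, not my exact code; the deprecated AudioSession C API is assumed because AudioSessionInitialize is what I call:

    // Sketch of the audio session setup (error handling omitted).
    // kAudioSessionCategory_PlayAndRecord is the category needed for
    // simultaneous playback and recording.
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    UInt32 category = kAudioSessionCategory_PlayAndRecord;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);
    AudioSessionSetActive(true);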

This is the mixer render callback:

    static OSStatus renderInput(void *inRefCon,
                                AudioUnitRenderActionFlags *ioActionFlags,
                                const AudioTimeStamp *inTimeStamp,
                                UInt32 inBusNumber,
                                UInt32 inNumberFrames,
                                AudioBufferList *ioData) {
        SoundBufferPtr sndbuf = (SoundBufferPtr)inRefCon;
        UInt32 sample = sndbuf[inBusNumber].sampleNum;
        UInt32 bufSamples = sndbuf[inBusNumber].numFrames;
        Float32 *in = sndbuf[inBusNumber].data;
        Float32 *outA = (Float32 *)ioData->mBuffers[0].mData;
        for (UInt32 i = 0; i < inNumberFrames; ++i) {
            if (inBusNumber > kNumOfRecordStartBus) {
                // Non-looping bus: play until the buffer runs out,
                // then output silence instead of leaving garbage.
                outA[i] = (sample < bufSamples) ? in[sample++] : 0.0f;
            } else {
                // Looping bus: wrap back to the start when the buffer
                // is exhausted (>= avoids reading one sample past the end).
                outA[i] = in[sample++];
                if (sample >= bufSamples) {
                    sample = 0;
                }
            }
        }
        if (sndbuf[inBusNumber].numFrames == sample)
            [[NSNotificationCenter defaultCenter]
             postNotificationName:@"AUDIOFILENOTLOOPING"
             object:@{@"BUSNUMBER": [NSString stringWithFormat:@"%u",
                                     (unsigned)inBusNumber]}];
        else
            sndbuf[inBusNumber].sampleNum = sample;
        return noErr;
    }

And this is the recording callback:

    static OSStatus recordCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData) {
        // Build a buffer list backed by a local, zeroed sample buffer.
        AudioBufferList bufferList;
        UInt32 numSamples = inNumberFrames * kChannels;
        UInt16 samples[numSamples];
        memset(samples, 0, sizeof(samples));
        bufferList.mNumberBuffers = 1;
        bufferList.mBuffers[0].mData = samples;
        bufferList.mBuffers[0].mNumberChannels = kChannels;
        bufferList.mBuffers[0].mDataByteSize = numSamples * sizeof(UInt16);

        AudioUnitRecorder *recorder = (__bridge AudioUnitRecorder *)inRefCon;
        CheckError(AudioUnitRender(recorder->mAudioUnit,
                                   ioActionFlags,
                                   inTimeStamp,
                                   kInputBus,
                                   inNumberFrames,
                                   &bufferList),
                   "AudioUnitRender failed");

        // The samples just read now sit in bufferList; write them to the file.
        ExtAudioFileWriteAsync(recorder->mAudioFileRef, inNumberFrames, &bufferList);
        return noErr;
    }
    Please provide a minimal self-contained code example. – marko Sep 15 '15 at 13:59
  • Hope this is what you wanted? – Tech_Intelliswift Sep 21 '15 at 10:14
  • Thanks - that's a much better question. – marko Sep 21 '15 at 11:52
  • So do you have anything to say, because I am kind of stuck on this? – Tech_Intelliswift Sep 22 '15 at 10:46
  • The fact that you're using this from several ViewControllers suggests to me that it might be an object lifetime issue. Remember that ViewControllers are not necessarily destroyed when they are hidden. You want to be very explicit in the initialisation and clean-up of resources such as this, which are singletons. Much as I hate singletons, that's probably what you need here to manage audio. Also, audio render handlers should never block. The call through NSNotificationCenter has the potential to do so and is a bad idea. Post a block to a GCD queue instead here. – marko Sep 22 '15 at 11:00
  • Are the other ViewControllers of the same type that creates the audio units? What are you doing with the other ViewControllers? – dave234 Sep 22 '15 at 12:41
  • Thanks Dave and marko... your comments were not the exact solution, but they somehow led me to solve the issue. – Tech_Intelliswift Sep 24 '15 at 13:39
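Following marko's suggestion, the notification in the render callback could be handed off to a dispatch queue rather than posted directly from the render thread. A rough, untested sketch (`inBusNumber` is the callback's bus parameter):

    // Sketch: defer the notification to the main queue so the render
    // thread never calls into NSNotificationCenter itself. Note that
    // dispatch_async copies the block (an allocation), so strictly
    // real-time-safe code would use a lock-free flag polled elsewhere.
    dispatch_async(dispatch_get_main_queue(), ^{
        [[NSNotificationCenter defaultCenter]
         postNotificationName:@"AUDIOFILENOTLOOPING"
         object:@{@"BUSNUMBER": [NSString stringWithFormat:@"%u",
                                 (unsigned)inBusNumber]}];
    });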

0 Answers