
I'm recording video/audio using AVAssetWriter and want to be able to write silent sample buffers. I'm not that experienced in Core Audio, so I'm having trouble coming up with a working solution.

The idea is to keep recording video when an audio device is disconnected, until it's reconnected. The problem is that AVFoundation somehow pushes the audio to the front, so the resulting movie file is massively out of sync.

My current implementation tries to create an empty/silent CMSampleBuffer to place between the segments where no audio device is connected.

if (audioOutput == captureOutput && audioWriterInput.readyForMoreMediaData) {
    if (needToFillAudioGap) {

        CMTime temp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMItemCount numSamples = temp.value - lastAudioDisconnect.value;
        OSStatus status;
        CMBlockBufferRef bbuf = NULL;
        CMSampleBufferRef sbuf = NULL;
        int nchans = 2;
        size_t buflen = numSamples * nchans * sizeof(float);


        NSMutableData* data = [NSMutableData dataWithLength:buflen];
        void* samples = [data mutableBytes];
        status = CMBlockBufferCreateWithMemoryBlock(
                                                    kCFAllocatorDefault,
                                                    samples,
                                                    buflen,
                                                    kCFAllocatorNull,
                                                    NULL,
                                                    0,
                                                    buflen,
                                                    0,
                                                    &bbuf);
        if (status != noErr) {
            NSLog(@"CMBlockBufferCreateWithMemoryBlock error: %d", (int)status);
            return;
        }
        CMBlockBufferRef blockBufferContiguous;
        status = CMBlockBufferCreateContiguous(kCFAllocatorDefault,
                                               bbuf,
                                               kCFAllocatorNull,
                                               NULL,
                                               0,
                                               buflen,
                                               0,
                                               &blockBufferContiguous);

        CFRelease(bbuf);
        if(status != noErr)
        {
            printf("CMBlockBufferCreateContiguous failed with error %d\n", (int)status);
            return;
        }

        status = CMAudioSampleBufferCreateReadyWithPacketDescriptions(kCFAllocatorDefault, blockBufferContiguous, CMSampleBufferGetFormatDescription(sampleBuffer), numSamples, lastAudioDisconnect, NULL, &sbuf);
        CFRelease(blockBufferContiguous);

        if (status != noErr) {
            NSLog(@"CMSampleBufferCreate error: %d", (int)status);
            return;
        }
        BOOL r = [audioWriterInput appendSampleBuffer:sbuf];
        if (!r) {
            NSLog(@"appendSampleBuffer error: %d", (int)status);
            return;
        }
        CFRelease(sbuf);
        NSLog(@"Filling Audio Gap");
        needToFillAudioGap = false;
    } else {

        if ([audioWriterInput appendSampleBuffer:sampleBuffer])
            lastAudioDisconnect = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    }
}

The sampleBuffer at the top is the first sample buffer after the audio device is reconnected, which tells me how long the gap I have to fill should be. lastAudioDisconnect always holds the presentation timestamp of the last audio sample buffer that was written.
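
(For clarity: subtracting the two CMTime values like that only yields a sample count when the presentation timescale equals the audio sample rate. A more explicit sketch of that conversion, with an assumed rate of 44.1 kHz that would really come from the buffer's format description, would be:)

    CMTime gapEnd      = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    CMTime gapDuration = CMTimeSubtract(gapEnd, lastAudioDisconnect);

    // Convert the gap to a frame count via seconds; the rate here is an assumption
    // and would normally be read from the audio format description.
    const double assumedSampleRate = 44100.0;
    CMItemCount missingFrames =
        (CMItemCount)llround(CMTimeGetSeconds(gapDuration) * assumedSampleRate);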

With Guard Malloc enabled the program crashes with: CrashIfClientProvidedBogusAudioBufferList

EDIT: With Guard Malloc disabled I am able to reconnect the audio device multiple times while recording, and when I stop recording, the gap is there without problems.

The problem then is that I only have a couple of minutes to stop the recording after reconnecting a device, because the AVAssetWriter fails randomly with error code 11800 (AVErrorUnknown).

Legi

1 Answer


The error is because the CMSampleBuffer you create is too long.

The created CMSampleBuffer should be the same length (contain the same number of samples) as the preceding sample buffers that are filled with live audio. You can create multiple silent buffers, if necessary, and deliver them at the same rate as the live audio buffers.
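
A rough sketch of what I mean (the function and variable names below are only placeholders, and it assumes uncompressed linear PCM so that zeroed bytes count as silence):

    #import <AVFoundation/AVFoundation.h>
    #import <CoreMedia/CoreMedia.h>

    // Sketch: fill a silent gap with several short buffers instead of one long one.
    // framesPerBuffer should match CMSampleBufferGetNumSamples() of the live buffers.
    static BOOL AppendSilence(AVAssetWriterInput *input,
                              CMFormatDescriptionRef format,  // from the last live audio buffer
                              CMTime startPTS,                // where the silence should begin
                              CMItemCount totalFrames,        // frames missing in the gap
                              CMItemCount framesPerBuffer)
    {
        const AudioStreamBasicDescription *asbd =
            CMAudioFormatDescriptionGetStreamBasicDescription(format);
        if (asbd == NULL || asbd->mBytesPerFrame == 0) return NO; // PCM only

        CMTime pts = startPTS;
        while (totalFrames > 0) {
            CMItemCount frames = MIN(totalFrames, framesPerBuffer);
            size_t bytes = (size_t)frames * asbd->mBytesPerFrame;

            // Let the block buffer allocate and own its memory, then zero it.
            CMBlockBufferRef block = NULL;
            OSStatus status = CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault,
                                                                 NULL, bytes,
                                                                 kCFAllocatorDefault,
                                                                 NULL, 0, bytes,
                                                                 kCMBlockBufferAssureMemoryNowFlag,
                                                                 &block);
            if (status != kCMBlockBufferNoErr) return NO;
            CMBlockBufferFillDataBytes(0, block, 0, bytes);

            CMSampleBufferRef silent = NULL;
            status = CMAudioSampleBufferCreateReadyWithPacketDescriptions(kCFAllocatorDefault,
                                                                          block, format, frames,
                                                                          pts, NULL, &silent);
            CFRelease(block);
            if (status != noErr) return NO;

            BOOL ok = [input appendSampleBuffer:silent];
            CFRelease(silent);
            if (!ok) return NO;

            // Advance the timestamp by the duration of the buffer just written.
            pts = CMTimeAdd(pts, CMTimeMake(frames, (int32_t)asbd->mSampleRate));
            totalFrames -= frames;
        }
        return YES;
    }

Here framesPerBuffer would typically be whatever CMSampleBufferGetNumSamples() reports for your live buffers, and totalFrames the length of the gap expressed in audio frames.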

hotpaw2
  • Thanks for your answer. To test it, I created a new variable called lastAudioNumSamples to store that information from the last sample buffer that was written and set numSamples = lastAudioNumSamples in my example. With that change it's random whether or not it crashes every time I reconnect the audio device while recording. It feels like it's getting better. I edited my question to add a bit more information about my previous tries. – Legi Dec 05 '15 at 22:27
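
For reference, a sketch of where that per-buffer count would be captured, using the variable names from the comment above:

    // In the live-audio branch, after a successful append (sketch only):
    if ([audioWriterInput appendSampleBuffer:sampleBuffer]) {
        lastAudioDisconnect = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        lastAudioNumSamples = CMSampleBufferGetNumSamples(sampleBuffer);
    }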