
I have mic audio captured during an ARSession that I wish to pass to another view controller and play back after the capture has taken place, whilst the app is still running (and the audio is still in memory).

The audio is currently captured as a single CMSampleBuffer, accessed through the session(_:didOutputAudioSampleBuffer:) ARSessionDelegate method.
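
In case it's useful, this is roughly how the buffer arrives (capturedAudio is just an illustrative name for how I'm holding onto it; providesAudioData is enabled on the session configuration):

func session(_ session: ARSession, didOutputAudioSampleBuffer audioSampleBuffer: CMSampleBuffer) {
    // Keep hold of the delivered buffer for later playback (simplified)
    self.capturedAudio = audioSampleBuffer
}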

I've worked with audio files and AVAudioPlayer before, but am new to CMSampleBuffer.

Is there a way of taking the raw buffer as is and playing it? If so, which classes enable this? Or does it need to be rendered/converted into some other format or file first?

This is the format description of the data in the buffer:

mediaType:'soun' 
    mediaSubType:'lpcm' 
    mediaSpecific: {
        ASBD: {
            mSampleRate: 44100.000000 
            mFormatID: 'lpcm' 
            mFormatFlags: 0xc 
            mBytesPerPacket: 2 
            mFramesPerPacket: 1 
            mBytesPerFrame: 2 
            mChannelsPerFrame: 1 
            mBitsPerChannel: 16     } 
        cookie: {(null)} 
        ACL: {Mono}
        FormatList Array: {
            Index: 0 
            ChannelLayoutTag: 0x640001 
            ASBD: {
            mSampleRate: 44100.000000 
            mFormatID: 'lpcm' 
            mFormatFlags: 0xc 
            mBytesPerPacket: 2 
            mFramesPerPacket: 1 
            mBytesPerFrame: 2 
            mChannelsPerFrame: 1 
            mBitsPerChannel: 16     }} 
    } 
    extensions: {(null)}

Any guidance is appreciated, as Apple's docs aren't clear on this matter, and related questions on SO deal more with live streaming of audio than with capture and subsequent playback.

MightyMeta

2 Answers


It's possible to pass the mic input to an audio engine in parallel with the recording, with minimal lag:

let audioEngine = AVAudioEngine()
...
// Route the mic input straight to the output mixer for live monitoring.
self.audioEngine.connect(self.audioEngine.inputNode,
                         to: self.audioEngine.mainMixerNode,
                         format: nil)
try self.audioEngine.start()
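
(Note: for the input node to deliver mic audio while the engine also plays through the mixer, the shared AVAudioSession generally needs a category that permits simultaneous recording and playback, such as .playAndRecord.)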

If working with the sample buffer itself is important, it can be done (roughly) by converting it into an AVAudioPCMBuffer:

import AVFoundation

extension AVAudioPCMBuffer {
    static func create(from sampleBuffer: CMSampleBuffer) -> AVAudioPCMBuffer? {

        guard let description = CMSampleBufferGetFormatDescription(sampleBuffer),
              let asbd = description.audioStreamBasicDescription,
              let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer)
        else { return nil }

        let channelsPerFrame = Int(asbd.mChannelsPerFrame)
        let samplesCount = CMSampleBufferGetNumSamples(sampleBuffer)

        // Build a mono float32 buffer at the source sample rate;
        // only the first channel is copied out below.
        guard let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                              sampleRate: asbd.mSampleRate,
                                              channels: 1,
                                              interleaved: false),
              let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat,
                                            frameCapacity: AVAudioFrameCount(samplesCount))
        else { return nil }
        buffer.frameLength = buffer.frameCapacity

        // Get a pointer to the raw bytes held by the block buffer.
        var dataPointer: UnsafeMutablePointer<Int8>?
        guard CMBlockBufferGetDataPointer(blockBuffer,
                                          atOffset: 0,
                                          lengthAtOffsetOut: nil,
                                          totalLengthOut: nil,
                                          dataPointerOut: &dataPointer) == kCMBlockBufferNoErr,
              let data = dataPointer,
              var channel = buffer.floatChannelData?[0]
        else { return nil }

        // Convert the signed 16-bit samples to normalized Float32,
        // stepping over any interleaved channels so only channel 0 is kept.
        var data16 = UnsafeRawPointer(data).assumingMemoryBound(to: Int16.self)
        for _ in 0..<samplesCount {
            channel.pointee = Float32(data16.pointee) / Float32(Int16.max)
            channel += 1
            data16 += channelsPerFrame
        }

        return buffer
    }
}


class BufferPlayer {

    let audioEngine = AVAudioEngine()
    let player = AVAudioPlayerNode()

    deinit {
        self.audioEngine.stop()
    }

    init(withBuffer buffer: CMSampleBuffer) {

        self.audioEngine.attach(self.player)

        // Connect the player using the format of the converted buffer.
        // The force unwrap assumes the first buffer converts successfully.
        self.audioEngine.connect(self.player,
                                 to: self.audioEngine.mainMixerNode,
                                 format: AVAudioPCMBuffer.create(from: buffer)!.format)

        _ = try? self.audioEngine.start()
    }

    func playEnqueue(buffer: CMSampleBuffer) {
        guard let bufferPCM = AVAudioPCMBuffer.create(from: buffer) else { return }

        // Queue the converted buffer and start the player if it isn't running yet.
        self.player.scheduleBuffer(bufferPCM, completionHandler: nil)
        if !self.player.isPlaying { self.player.play() }
    }
}
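
Usage would then look roughly like this (firstBuffer and nextBuffer are illustrative names for buffers collected from the delegate):

let bufferPlayer = BufferPlayer(withBuffer: firstBuffer)
bufferPlayer.playEnqueue(buffer: firstBuffer)
bufferPlayer.playEnqueue(buffer: nextBuffer)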

Niko

It seems that the answer is no: you can't simply save the raw buffer audio and play it back later; it needs to be converted into something more persistent first.

It looks like the main way to do this is to use AVAssetWriter to save the buffer data as an audio file, for later playback via AVAudioPlayer.
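
A rough sketch of that approach, assuming each buffer is appended as it arrives from the delegate (outputURL and firstBuffer are illustrative names, and error handling is omitted):

import AVFoundation

// Write the LPCM sample buffers into a CAF file without re-encoding
// (outputSettings: nil means pass-through)
let writer = try AVAssetWriter(outputURL: outputURL, fileType: .caf)
let input = AVAssetWriterInput(mediaType: .audio,
                               outputSettings: nil,
                               sourceFormatHint: CMSampleBufferGetFormatDescription(firstBuffer))
input.expectsMediaDataInRealTime = true
writer.add(input)
writer.startWriting()
writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(firstBuffer))

// For each buffer delivered by didOutputAudioSampleBuffer:
if input.isReadyForMoreMediaData {
    input.append(sampleBuffer)
}

// Once capture has finished:
input.markAsFinished()
writer.finishWriting {
    // The file at outputURL can now be handed to AVAudioPlayer
}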

MightyMeta