I have mic audio captured during an ARSession that I wish to pass to another VC and play back after the capture has taken place, but whilst the app is still running (and audio in memory).
The audio is currently captured as `CMSampleBuffer`s delivered through the `ARSessionDelegate` method `session(_:didOutputAudioSampleBuffer:)`.
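For context, this is roughly how I'm receiving and holding onto the buffers (the `audioBuffers` array is my own storage; I'm not sure whether the buffers need to be copied before being retained like this):

```swift
import ARKit
import CoreMedia

class CaptureViewController: UIViewController, ARSessionDelegate {
    // My own storage — every audio buffer the session delivers during capture
    var audioBuffers: [CMSampleBuffer] = []

    func session(_ session: ARSession, didOutputAudioSampleBuffer audioSampleBuffer: CMSampleBuffer) {
        audioBuffers.append(audioSampleBuffer)
    }
}
```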
I've worked with audio files and AVAudioPlayer before, but am new to CMSampleBuffer.
Is there a way to take the raw buffers as-is and play them? If so, which classes enable this? Or do they need to be rendered/converted into some other format or file first?
This is the format description of the data in the buffer:
mediaType:'soun'
mediaSubType:'lpcm'
mediaSpecific: {
    ASBD: {
        mSampleRate: 44100.000000
        mFormatID: 'lpcm'
        mFormatFlags: 0xc
        mBytesPerPacket: 2
        mFramesPerPacket: 1
        mBytesPerFrame: 2
        mChannelsPerFrame: 1
        mBitsPerChannel: 16 }
    cookie: {(null)}
    ACL: {Mono}
    FormatList Array: {
        Index: 0
        ChannelLayoutTag: 0x640001
        ASBD: {
            mSampleRate: 44100.000000
            mFormatID: 'lpcm'
            mFormatFlags: 0xc
            mBytesPerPacket: 2
            mFramesPerPacket: 1
            mBytesPerFrame: 2
            mChannelsPerFrame: 1
            mBitsPerChannel: 16 }}
}
extensions: {(null)}
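Given that the data is mono 16-bit LPCM at 44.1 kHz, my current (unverified) thinking is to convert each `CMSampleBuffer` into an `AVAudioPCMBuffer` and schedule it on an `AVAudioPlayerNode` — is this the right direction? A sketch of what I have in mind (the `pcmBuffer(from:)` helper is my own; I haven't tested this end to end):

```swift
import AVFoundation

// Convert one CMSampleBuffer of LPCM audio into an AVAudioPCMBuffer
func pcmBuffer(from sampleBuffer: CMSampleBuffer) -> AVAudioPCMBuffer? {
    guard let desc = CMSampleBufferGetFormatDescription(sampleBuffer),
          let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(desc),
          let format = AVAudioFormat(streamDescription: asbd) else { return nil }

    let frameCount = AVAudioFrameCount(CMSampleBufferGetNumSamples(sampleBuffer))
    guard let pcm = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return nil }
    pcm.frameLength = frameCount

    // Copy the sample data straight into the AVAudioPCMBuffer's buffer list
    let status = CMSampleBufferCopyPCMDataIntoAudioBufferList(
        sampleBuffer, at: 0, frameCount: Int32(frameCount),
        into: pcm.mutableAudioBufferList)
    return status == noErr ? pcm : nil
}

// Playback: schedule each converted buffer on an AVAudioPlayerNode
func play(_ sampleBuffers: [CMSampleBuffer], engine: AVAudioEngine, player: AVAudioPlayerNode) throws {
    let buffers = sampleBuffers.compactMap(pcmBuffer(from:))
    guard let format = buffers.first?.format else { return }

    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: format)
    try engine.start()

    for buffer in buffers {
        player.scheduleBuffer(buffer)  // queued back-to-back in order
    }
    player.play()
}
```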
Any guidance appreciated, as Apple's docs aren't clear on this, and related questions on SO deal more with live-streaming audio than with capture and subsequent playback.