I'm making a metronome using AVAudioEngine, AVAudioPlayerNode, and AVAudioPCMBuffer. The buffer is created like so:

/// URL of the sound file
let soundURL = Bundle.main.url(forResource: <filename>, withExtension: "wav")!
/// Create audio file
let audioFile = try! AVAudioFile(forReading: soundURL)
let audioFormat = audioFile.processingFormat
/// Length of the file in frames (audioFile.length is an Int64 frame count)
let audioFrameCount = AVAudioFrameCount(audioFile.length)

/// Create the buffer - what value to put for frameCapacity?
if let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: ???) {
    buffer.frameLength = audioFrameCount
    try? audioFile.read(into: buffer)
        
    return buffer
}

What value do I put for frameCapacity in the AVAudioPCMBuffer initializer?

The documentation says frameCapacity should be "The capacity of the buffer in PCM sample frames." What does that mean? Is this a static value or do you grab it from the audio file?

Miles

1 Answer


frameCapacity is the maximum number of frames that AVAudioPCMBuffer can hold. You don't have to use all of the frames. Consumers of AVAudioPCMBuffers should only ever consult frameLength frames, and frameLength <= frameCapacity. A capacity different from the length can be useful if you're processing audio in chunks of N frames, and for whatever reason you get a short read:

let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: N)!

while readChunk(chunk) {
    let frameLength = min(buffer.frameCapacity, chunk.lengthInFrames)
    // copy frameLength frames into buffer.floatChannelData![0] or something
    buffer.frameLength = frameLength // could be less than N
}
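
If you're pulling those chunks straight out of an AVAudioFile, here's a minimal runnable sketch of the same idea (the 4096-frame chunk size and the processInChunks name are just placeholders of mine):

import AVFoundation

/// Read a file in fixed-size chunks: the buffer is allocated once with
/// frameCapacity = chunkFrames, and each read sets frameLength to however
/// many frames were actually delivered, which is short on the final pass.
func processInChunks(file: AVAudioFile, chunkFrames: AVAudioFrameCount = 4096) throws {
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: chunkFrames) else { return }
    while file.framePosition < file.length {
        try file.read(into: buffer, frameCount: chunkFrames) // sets buffer.frameLength
        // consume buffer.frameLength frames here; frameCapacity stays chunkFrames
    }
}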

But if you're only ever going to store audioFrameCount frames (the length of your file) in the buffer, then set frameCapacity to that:

let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount)
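
Putting that together for a metronome tick, a minimal sketch of the whole load (the "click" resource name is a stand-in for your own file):

import AVFoundation

/// Load an entire sound file into one buffer sized exactly to the file.
/// "click.wav" is a placeholder resource name.
func makeTickBuffer() -> AVAudioPCMBuffer? {
    guard let url = Bundle.main.url(forResource: "click", withExtension: "wav"),
          let file = try? AVAudioFile(forReading: url) else { return nil }

    // file.length is an Int64 frame count; the initializer wants a UInt32 AVAudioFrameCount
    let frameCount = AVAudioFrameCount(file.length)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: frameCount) else { return nil }

    try? file.read(into: buffer) // fills the buffer and sets frameLength for you
    return buffer
}
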
Rhythmic Fistman