I'm making a metronome using AVAudioEngine, AVAudioPlayerNode, and AVAudioPCMBuffer. The buffer is created like so:
import AVFoundation

/// URL of the sound file
let soundURL = Bundle.main.url(forResource: <filename>, withExtension: "wav")!
/// Create audio file
let audioFile = try! AVAudioFile(forReading: soundURL)
let audioFormat = audioFile.processingFormat
/// Frame count of the file (audioFrameCount was undefined before; AVAudioFile.length is in frames)
let audioFrameCount = AVAudioFrameCount(audioFile.length)

/// Create the buffer - what value to put for frameCapacity?
if let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: ???) {
    buffer.frameLength = audioFrameCount
    try? audioFile.read(into: buffer)
    return buffer
}
What value do I put for frameCapacity in the AVAudioPCMBuffer initializer? The documentation describes frameCapacity as "The capacity of the buffer in PCM sample frames," but what does that mean in practice? Is it a fixed value, or do you grab it from the audio file?
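My current guess, and I haven't confirmed this is right, is that the capacity should come from the file itself, since AVAudioFile exposes a length property measured in sample frames:

```swift
import AVFoundation

/// Guess: use the file's total length (in sample frames) as the capacity.
/// AVAudioFile.length is an AVAudioFramePosition (Int64), while
/// frameCapacity expects an AVAudioFrameCount (UInt32), hence the cast.
let frameCapacity = AVAudioFrameCount(audioFile.length)

if let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat,
                                 frameCapacity: frameCapacity) {
    // read(into:) appears to set buffer.frameLength itself,
    // so maybe the manual frameLength assignment isn't needed?
    try? audioFile.read(into: buffer)
    return buffer
}
```

Is that the intended usage, or is frameCapacity meant to be something else entirely (e.g., a fixed chunk size for streaming)?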