I've been using the following methods to convert an AVAudioPCMBuffer to [UInt8] and then from [UInt8] back to an AVAudioPCMBuffer. The problem is that every conversion produces 17640 bytes, which is a lot to stream over MultipeerConnectivity. In fact, I think my stream ends up reading data more slowly than it is coming in: if I end the stream on one device, the other continues to read data for a while before it realizes that the stream has ended.
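For reference, assuming mono Float32 PCM (4 bytes per frame), 17640 bytes works out to 17640 / 4 = 4410 frames per buffer, so each buffer really is just raw, uncompressed samples.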
Here is my conversion from AVAudioPCMBuffer to [UInt8]. Credit for this code goes to Rhythmic Fistman's answer.
import AVFoundation

func audioBufferToBytes(audioBuffer: AVAudioPCMBuffer) -> [UInt8] {
    let srcLeft = audioBuffer.floatChannelData![0]
    let bytesPerFrame = audioBuffer.format.streamDescription.pointee.mBytesPerFrame
    let numBytes = Int(bytesPerFrame * audioBuffer.frameLength)

    // Copy the Float32 samples of the (single) channel into a zero-initialized byte array.
    var audioByteArray = [UInt8](repeating: 0, count: numBytes)
    srcLeft.withMemoryRebound(to: UInt8.self, capacity: numBytes) { srcByteData in
        audioByteArray.withUnsafeMutableBufferPointer {
            $0.baseAddress!.initialize(from: srcByteData, count: numBytes)
        }
    }
    return audioByteArray
}
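For context, this is roughly how the bytes go onto the stream (simplified; outputStream is the OutputStream returned by MCSession's startStream(withName:toPeer:), already opened and scheduled on a run loop, and buffer is the AVAudioPCMBuffer from the input node's tap):

let bytes = audioBufferToBytes(audioBuffer: buffer)
let sent = bytes.withUnsafeBufferPointer { ptr in
    // write(_:maxLength:) can write fewer bytes than requested,
    // so a real implementation should loop until everything is sent.
    outputStream.write(ptr.baseAddress!, maxLength: bytes.count)
}
print("sent \(sent) of \(bytes.count) bytes")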
And here is the conversion from [UInt8] back to AVAudioPCMBuffer:
func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {
    // Mono, non-interleaved Float32 at 8000 Hz -- this must match the format on the sending side.
    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame

    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)
    audioBuffer.frameLength = frameLength

    // Reinterpret the raw bytes as Float32 samples and copy them into the buffer's channel.
    let dstLeft = audioBuffer.floatChannelData![0]
    buf.withUnsafeBufferPointer {
        let src = UnsafeRawPointer($0.baseAddress!).bindMemory(to: Float.self, capacity: Int(frameLength))
        dstLeft.initialize(from: src, count: Int(frameLength))
    }
    return audioBuffer
}
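On the receiving side, the reconstructed buffer gets scheduled on an AVAudioPlayerNode, roughly like this (receivedBytes are the bytes read from the InputStream, and playerNode is assumed to be attached to a running AVAudioEngine and connected with the same format):

let buffer = bytesToAudioBuffer(receivedBytes)
playerNode.scheduleBuffer(buffer) {
    // Called once this buffer has been consumed.
}
playerNode.play()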
Can anyone help me compress this data so that it is easier to send over a stream, and then decompress it so it can be played?
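To give an idea of what I mean, I assume something like Apple's Compression framework could be applied to the raw bytes (untested sketch below, using COMPRESSION_ZLIB purely as a guess), but I don't know whether a general-purpose compressor is a good fit for real-time audio, or whether an audio codec would be the better approach:

import Compression

// Untested sketch: compress the raw PCM bytes before writing them to the stream.
func compressBytes(_ bytes: [UInt8]) -> [UInt8] {
    var dst = [UInt8](repeating: 0, count: bytes.count)
    let written = bytes.withUnsafeBufferPointer { src in
        compression_encode_buffer(&dst, dst.count,
                                  src.baseAddress!, src.count,
                                  nil, COMPRESSION_ZLIB)
    }
    // 0 means the output didn't fit (i.e. the data didn't compress); fall back to the original.
    return written == 0 ? bytes : Array(dst.prefix(written))
}

// originalSize has to be known (or sent along with the payload) to size the output buffer.
func decompressBytes(_ bytes: [UInt8], originalSize: Int) -> [UInt8] {
    var dst = [UInt8](repeating: 0, count: originalSize)
    let written = bytes.withUnsafeBufferPointer { src in
        compression_decode_buffer(&dst, dst.count,
                                  src.baseAddress!, src.count,
                                  nil, COMPRESSION_ZLIB)
    }
    return Array(dst.prefix(written))
}

Since each compressed chunk would have a different length, I'd presumably also need to frame the stream (e.g. prefix each chunk with its length), which is another part I'm not sure how to handle.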