
I'm trying to play music from a byte array that comes in over the network in pcmInt16 data format.

// formats
let format1 = AVAudioFormat(commonFormat: AVAudioCommonFormat.pcmFormatFloat32, sampleRate: 48000, channels: 1, interleaved: false)!
let format2 = AVAudioFormat(commonFormat: AVAudioCommonFormat.pcmFormatInt16, sampleRate: 48000, channels: 1, interleaved: false)!

// byte array buffer
var byteArray: [Int16]! // one packet size is 512

...
// 1. create / attach / connect engine
engine.prepare()
try! engine.start()
engine.attach(playerNode)
engine.connect(playerNode, to: engine.mainMixerNode, format: format1)

// 2. fill byteArray with music stream // int16 48kHz 32bit
...

// 3.
var len = 512
let pcmBuffer = AVAudioPCMBuffer(pcmFormat: format2, frameCapacity: AVAudioFrameCount(len))!

// HERE
// How to set the first 512 data from byteArray ?
playerNode.scheduleBuffer(pcmBuffer, completionHandler: nil)

How do I copy the first 512 samples from byteArray into the buffer? I tried something like this, but it's not working:

memcpy(pcmBuffer.audioBufferList.pointee.mBuffers.mData, byteArray[0..<512], len * 2)

szuniverse

2 Answers


The AVAudioMixerNode is good for sample rate conversions, but for broad format changes like Int16 to Float you're probably better off converting the samples yourself. For performance, I suggest using the vDSP routines from the Accelerate framework.

import Cocoa
import AVFoundation
import Accelerate
import PlaygroundSupport


let bufferSize = 512
let bufferByteSize = MemoryLayout<Float>.size * bufferSize

var pcmInt16Data: [Int16] = []
var pcmFloatData = [Float](repeating: 0.0, count: bufferSize) // allocate once and reuse


// one buffer of noise as an example
for _ in 0..<bufferSize {
    let value = Int16.random(in: Int16.min...Int16.max)
    pcmInt16Data.append(value)
}


let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

let audioFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000.0, channels: 1)!

let mixer = engine.mainMixerNode

engine.attach(player)
engine.connect(player, to: mixer, format: audioFormat)

engine.prepare()

do {
    try engine.start()
} catch {
    print("Error info: \(error)")
}

player.play()

if let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: UInt32(bufferSize)) {
    
    let monoChannel = buffer.floatChannelData![0]
    
    // Int16 ranges from -32768 to 32767 -- we want to convert and scale these to Float values between -1.0 and 1.0
    var scale = Float(Int16.max) + 1.0
    vDSP_vflt16(pcmInt16Data, 1, &pcmFloatData, 1, vDSP_Length(bufferSize)) // Int16 to Float
    vDSP_vsdiv(pcmFloatData, 1, &scale, &pcmFloatData, 1, vDSP_Length(bufferSize)) // divide by scale
    
    memcpy(monoChannel, pcmFloatData, bufferByteSize)
    buffer.frameLength = UInt32(bufferSize)
    player.scheduleBuffer(buffer, completionHandler: nil) // load more buffers in the completionHandler
    
}


PlaygroundPage.current.needsIndefiniteExecution = true
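
To keep a stream going, the completion handler is the natural place to schedule the next chunk. Below is a minimal sketch that reuses player and audioFormat from the playground above and assumes a hypothetical nextPCMInt16Chunk() that returns the next packet of Int16 samples from your network code (or nil when the stream ends):

func scheduleNextBuffer() {
    guard let chunk = nextPCMInt16Chunk(), // hypothetical source of the next Int16 packet
          let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat,
                                        frameCapacity: AVAudioFrameCount(chunk.count)) else { return }

    // convert and scale exactly as above
    var scale = Float(Int16.max) + 1.0
    var floats = [Float](repeating: 0.0, count: chunk.count)
    vDSP_vflt16(chunk, 1, &floats, 1, vDSP_Length(chunk.count))
    vDSP_vsdiv(floats, 1, &scale, &floats, 1, vDSP_Length(chunk.count))

    memcpy(buffer.floatChannelData![0], floats, MemoryLayout<Float>.size * chunk.count)
    buffer.frameLength = AVAudioFrameCount(chunk.count)

    // schedule the following chunk once this one has been consumed
    player.scheduleBuffer(buffer) {
        scheduleNextBuffer()
    }
}

Note that the completion handler is called on a background thread, so don't touch UI state from it directly.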

If instead you'd like to play an AVAudioFile, use the AVAudioPlayerNode.scheduleFile() and .scheduleSegment() methods rather than trying to read the Int16 data directly from a WAV/AIFF. You'll want to pay attention to the AVAudioFile.processingFormat property and use that as the format of the connection from the player to the mixer.

import Cocoa
import PlaygroundSupport
import AVFoundation


let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
    
let playEntireFile = true

func playLocalFile() {

    // file needs to be in ~/Documents/Shared Playground Data
    let localURL = playgroundSharedDataDirectory.appendingPathComponent("MyAwesomeMixtape6.aiff")
    guard let audioFile = try? AVAudioFile(forReading: localURL) else { return }
    let audioFormat = audioFile.processingFormat

    let mixer = engine.mainMixerNode

    engine.attach(player)
    engine.connect(player, to: mixer, format: audioFormat)

    engine.prepare()

    do {
        try engine.start()
    } catch {
        print("Error info: \(error)")
    }

    player.play()
    
    if playEntireFile {
        
        player.scheduleFile(audioFile, at: nil, completionHandler: nil)
            
    } else { // play segment
        
        let startTimeSeconds = 5.0
        let durationSeconds = 2.0
        
        let sampleRate = audioFormat.sampleRate
        let startFramePosition = startTimeSeconds * sampleRate
        let durationFrameCount = durationSeconds * sampleRate
        
        player.scheduleSegment(audioFile, startingFrame: AVAudioFramePosition(startFramePosition), frameCount: AVAudioFrameCount(durationFrameCount), at: nil, completionHandler: nil)
        
    }
    
}

playLocalFile()


PlaygroundPage.current.needsIndefiniteExecution = true

For remote files, try AVPlayer.

import Cocoa
import AVFoundation
import PlaygroundSupport


var player: AVPlayer?

func playRemoteFile() {

    guard let remoteURL = URL(string: "https://ondemand.npr.org/anon.npr-mp3/npr/me/2020/03/20200312_me_singapore_wins_praise_for_its_covid-19_strategy_the_us_does_not.mp3"
        ) else { return }
    
    player = AVPlayer(url: remoteURL)

    player?.play()

}

playRemoteFile()

PlaygroundPage.current.needsIndefiniteExecution = true
lentil
  • Thanks, this worked fine. I am still learning Core Audio, and now I am trying to resample the audio from 44.1 kHz to 48 kHz (https://stackoverflow.com/questions/60711929/change-sample-rate-with-audioconverter). Can you help me a little bit here? I attached sample code under the question. Thanks in advance. – szuniverse Mar 16 '20 at 19:13
  • Hey @lentil, I am very interested in the first example of feeding a PCM buffer with data. In my case, I need to start a local file playing at a specified time from the beginning of the track and play it for a specified amount of time. However, at any point a "Play to end" checkbox could be selected, in which case the PCMBuffer would continue to be fed until the end of the file is reached. Any chance you could elaborate on how I could achieve this using your first example? I can create another question, if you'd like. – SouthernYankee65 Sep 15 '20 at 17:29
  • Rather than deal with the samples directly, it might be easier to draw from the second example that uses AVAudioPlayerNode.scheduleSegment. You can keep scheduling segments while the box is checked. Feel free to make another question if you like - easier to post code, etc. And you're more likely to get a broader range of suggestions. – lentil Sep 15 '20 at 21:10
  • And on adding additional segments - you'd add them in the completion handler for the scheduleSegment method. Just need to plan a bit on your approach to dispatch queues and not directly access the UI state from the completion block. – lentil Sep 15 '20 at 21:20

First of all, you should avoid Implicitly Unwrapped Optionals wherever you can.

var byteArray: [Int16] = [] // one packet size is 512

As far as I can see from the code shown, there is no need to make byteArray an Optional at all.


And how do you copy the first 512 samples from byteArray?

Your code would work with a little modification:

pcmBuffer.frameLength = AVAudioFrameCount(len)
memcpy(pcmBuffer.audioBufferList.pointee.mBuffers.mData, byteArray, len * 2)

Or you can work with int16ChannelData:

if let channelData = pcmBuffer.int16ChannelData {
    memcpy(channelData[0], byteArray, len * MemoryLayout<Int16>.stride)
    pcmBuffer.frameLength = AVAudioFrameCount(len)
} else {
    print("bad format")
}

You may want to load parts of your byteArray other than the first 512 samples, but that is another issue.
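
For example, copying the second 512-sample packet would look something like this (a minimal sketch, assuming byteArray already holds at least offset + len samples):

// copy samples 512..<1024 into the buffer
let offset = 512
if let channelData = pcmBuffer.int16ChannelData {
    byteArray.withUnsafeBufferPointer { src in
        memcpy(channelData[0], src.baseAddress! + offset, len * MemoryLayout<Int16>.stride)
    }
    pcmBuffer.frameLength = AVAudioFrameCount(len)
}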

OOPer
  • Thanks! I made an example here: https://drive.google.com/file/d/1apISFQXBEnB-cYKuSFnnxHkwoYLnFa-E In the ViewController.start() method I made 3 different solutions; the first one, with AVAudioFile, works perfectly, but the other two are weird. The sound is faster and higher than it should be. What am I doing wrong? – szuniverse Mar 11 '20 at 21:28
  • It is not clear what's going on, as I cannot see the actual data. One thing that is sure is that a WAV file is not a simple sequence of PCM samples, so the `Data` of the file cannot be in the right _pcmInt16 data format_. If you want to know how to load PCM samples from a WAV file, that is a different issue and you had better start a new thread. – OOPer Mar 12 '20 at 00:17