
Hope everyone is having a lovely day / evening, wherever you may be! :)

Apologies in advance if this question is not formatted or structured properly; this is my first post, and not all of the guided mode seems to be accessible using screen-reading technology.

I am attempting to create a small SceneKit 3D audio demo with a twist: I need to include some audio processing of spatialized audio.

That is, I need to add the Varispeed audio unit (or something that produces a similar effect) to audio being played from an SCNAudioPlayer attached to an SCNNode. I only need to slow down or speed up the audio; no pitch compensation is necessary.
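Just to pin down the effect: AVAudioUnitVarispeed's rate parameter changes playback speed and pitch together, with no pitch compensation, which is exactly what I want. A minimal sketch of the unit on its own (not yet attached to any engine):

import AVFoundation

// Varispeed ties pitch to playback rate; rate is clamped to roughly 0.25...4.0.
let varispeed = AVAudioUnitVarispeed()
varispeed.rate = 0.5   // half speed, and the pitch drops along with it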

I have been scouring Stack Overflow and the rest of the web for over a week now in my spare time trying to locate the magic, but so far it is still eluding me. :)

Just for clarity, I have checked:

How to change audio pitch during playback? (Swift 4)

Attaching AudioUnit effects to SCNAudioSource nodes in SceneKit

and these two, in the hope that something there might remotely help…

AVAudioPlayerNode does not play sound

Changing the volume of an SCNAudioPlayer in real time - Swift

And numerous other pages from Stack Overflow and other forums (Apple's, etc.), but no examples seem to apply directly to my specific use case.

There have been some excellent answers, but so far they all seem to involve creating a separate instance of AVAudioEngine and building a graph manually. What I would like to do is use the existing instance of AVAudioEngine that SceneKit provides and simply add nodes with processed audio to it, and it is just not clear to me whether this is possible.

I can get standard SceneKit 3D positional audio to work just fine, and I can also get an AVAudioPlayerNode connected to the currently instantiated AVAudioEngine and have it processed with the Varispeed audio unit. However, that audio is not spatialized.
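For comparison, this is the standard positional playback that does work for me (a minimal sketch; the file name and saucerNode come from my project):

let source = SCNAudioSource(fileNamed: "art.scnassets/ablue.wav")!
source.isPositional = true   // render in 3D relative to the audio listener
source.loops = true
source.load()                // decode up front; positional audio should not stream
saucerNode.addAudioPlayer(SCNAudioPlayer(source: source))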

I assume this is because (as in the code below) I am connecting the AVAudioPlayerNode to the Varispeed unit and then connecting that to the engine's main mixer node, bypassing SceneKit's spatialization. However, I can find no other examples of how to configure an AVAudioPlayerNode to do what I need.

So even though I create an SCNAudioPlayer from an AVAudioPlayerNode (which is being processed via Varispeed) and then add that SCNAudioPlayer to the SCNNode, the audio is still not rendered in 3D.

The audio file plays fine and can be processed, but it is not spatialized. As a note, the file is mono, which is what SceneKit requires for audio to be spatialized in 3D.
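In case it helps, this is how I check that assumption (a small sketch; url points at the same file my load function reads):

if let file = try? AVAudioFile(forReading: url) {
    // SceneKit spatializes mono sources only; stereo files play un-spatialized.
    print("channels:", file.processingFormat.channelCount)   // expect 1
}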

My other thought was to connect the output of the Varispeed unit to the audio environment node that SceneKit exposes (scnView.audioEnvironmentNode) instead of to the engine's main mixer node, but this does not work for me either. The app simply appears to hang when I attempt to connect the Varispeed unit to the audioEnvironmentNode.

I get no feedback from the console as to what might be going on, and the app throws no exceptions. It seemingly executes up to the point where the Varispeed node is connected to the audioEnvironmentNode and then appears to hang, which offers me no help in solving this. :) I have also tried to verify whether the audioEnvironmentNode is nil, but I get no result either way; the console prints absolutely nothing in regard to it. Go figure…

I do intend to pursue this further but need a bit of hand-holding at this point. :)

My last thought was to see whether I could add an AVAudioPlayerNode to an SCNNode directly, but this does not seem to be possible.

Finally, my code is below. Once again, apologies for any unusual formatting.

Thanks for taking a look, and any help or suggestions will be much appreciated! If I am missing something super obvious here, please be gentle! :) I dare say the Apple docs on the relationship between SceneKit and AVFoundation are not super intuitive, to me at least…

Thanks and Cheers! :)

import SceneKit
import AVFoundation

// Creating an AVAudioPlayerNode with processing via Varispeed
let saucerSoundAudioNode = load(file: "ablue", ofType: "wav")

// Creating an SCNAudioPlayer from the AVAudioPlayerNode
let saucerSoundPlayer = SCNAudioPlayer(avAudioNode: saucerSoundAudioNode)

// Adding the SCNAudioPlayer to an SCNNode
saucerNode.addAudioPlayer(saucerSoundPlayer)

// This function loads audio data and creates an AVAudioPlayerNode.
// It then connects that node to an instance of AVAudioUnitVarispeed,
// connects the Varispeed node to the engine's main mixer node,
// and returns the AVAudioPlayerNode.
func load(file name: String, ofType type: String) -> AVAudioPlayerNode {

    let audioPlayer = AVAudioPlayerNode()
    audioPlayer.volume = 1.0

    let path = Bundle.main.path(forResource: "art.scnassets/" + name, ofType: type)!
    let url = URL(fileURLWithPath: path)

    let file = try? AVAudioFile(forReading: url)

    let buffer = AVAudioPCMBuffer(pcmFormat: file!.processingFormat,
                                  frameCapacity: AVAudioFrameCount(file!.length))

    do {
        try file!.read(into: buffer!)
    } catch {
        print("file could not be read into buffer")
    }

    print("file has been read")

    let engine = scnView.audioEngine
    let pitch = AVAudioUnitVarispeed()

    engine.attach(audioPlayer)
    engine.attach(pitch)

    engine.connect(audioPlayer, to: pitch, format: file?.processingFormat)
    engine.connect(pitch, to: engine.mainMixerNode, format: file?.processingFormat)

    // Replacing engine.mainMixerNode with scnView.audioEnvironmentNode
    // seems to cause the app to hang.
    // The console gives no indication of any error, but execution seems to stop.

    audioPlayer.scheduleBuffer(buffer!, at: nil, options: .loops, completionHandler: nil)

    return audioPlayer
}

1 Answer


I've found the answer to my question, so I wanted to post it here in case others are having the same or similar issues.

It turns out that I needed to explicitly set the audio listener node as the view's pointOfView before I could access the audioEnvironmentNode of the AVAudioEngine instance that SceneKit creates.

Also, I needed to set the camera on the audioListener before the listener is added to the scene, rather than afterward.

According to Apple's documentation, though, if there is only one camera in the scene, which there is in my case, the node that the camera is set on is supposed to automatically become the pointOfView and thereby also default to being the audioListener.

In practice, though, this does not seem to be the case. And since the pointOfView was not associated with the node my camera was attached to, it seemed that I could not access the current audio environment node.
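Concretely, this is the listener setup that made the environment node reachable for me (a minimal sketch; cameraNode, scene, and scnView are my project's names):

let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()      // set the camera before adding the node to the scene
scene.rootNode.addChildNode(cameraNode)

scnView.pointOfView = cameraNode     // explicit, despite the documented single-camera default
scnView.audioListener = cameraNode   // spatialization is rendered relative to this node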

So the code I listed in my question does work, with one minor tweak:

let engine = scnView.audioEngine
let environmentNode = scnView.audioEnvironmentNode
let pitch = AVAudioUnitVarispeed()

engine.attach(audioPlayer)
engine.attach(environmentNode)
engine.attach(pitch)

// Signal chain: player -> varispeed -> environment node (SceneKit's 3D mixer) -> main mixer
engine.connect(audioPlayer, to: pitch, format: file?.processingFormat)
engine.connect(pitch, to: environmentNode, format: file?.processingFormat)
engine.connect(environmentNode, to: engine.mainMixerNode, format: nil)

Once this is done, then as before, all I need to do is create the SCNAudioPlayer from the returned AVAudioPlayerNode and add it to the SCNNode, and all is well! :) The audio is rendered in 3D based on the SCNNode's position and is easily modifiable via the parameters of the connected AVAudioUnit.
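Changing playback speed at runtime is then just a matter of setting the unit's rate (assuming you keep a reference to the Varispeed node, called pitch in my code):

pitch.rate = 0.5   // half speed; the pitch drops with it, since Varispeed does no compensation
pitch.rate = 2.0   // double speed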

I hope this helps others. Have a wonderful day / weekend!

Cheers! :)
