Questions tagged [avaudioengine]

Use this tag when your question is about the AVAudioEngine class, which is part of the AVFoundation framework.

AVAudioEngine is part of the AVFoundation framework for Apple platforms. The class provides some of the framework's more advanced audio processing capabilities by managing a graph of connected audio node objects that generate and process audio signals and handle audio input and output.

The AVAudioEngine API reference is available in Apple's developer documentation.
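
For illustration, a minimal sketch of the node-graph idea described above: attach nodes to an engine, connect them into a chain, and start rendering. The node names and file path below are placeholders.

    import AVFoundation

    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let reverb = AVAudioUnitReverb()

    engine.attach(player)
    engine.attach(reverb)

    // player -> reverb -> main mixer (-> output node)
    engine.connect(player, to: reverb, format: nil)
    engine.connect(reverb, to: engine.mainMixerNode, format: nil)

    do {
        let file = try AVAudioFile(forReading: URL(fileURLWithPath: "/path/to/audio.m4a"))
        try engine.start()
        player.scheduleFile(file, at: nil, completionHandler: nil)
        player.play()
    } catch {
        print("Engine setup failed: \(error)")
    }
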

535 questions
3 votes · 0 answers

How does toggling AVAudioSession active state affect the AVAudioPlayerNode

Every time the AVAudioSession is re-activated (after being deactivated) and the audio engine is restarted (calling stop and play), the output from the AVAudioPlayerNode seems to ignore the audio session category, until explicitly connecting the…
Anirudh R · 31
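
A rough sketch of the cycle this question describes, assuming a playback session and a single player node; the reconnect on reactivation mirrors the workaround hinted at in the excerpt, and the function names are illustrative.

    import AVFoundation

    let session = AVAudioSession.sharedInstance()
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: nil)

    func suspendAudio() {
        player.stop()
        engine.stop()
        try? session.setActive(false, options: .notifyOthersOnDeactivation)
    }

    func resumeAudio() throws {
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
        // Re-establishing the connection after reactivation is the step the
        // question says is needed before the node respects the category again.
        engine.connect(player, to: engine.mainMixerNode, format: nil)
        try engine.start()
        player.play()
    }
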
3 votes · 1 answer

I can start and stop AVAudioEngine but cannot restart it

I am using AVAudioEngine to make some changes to playing music, like changing pitch and gain at certain frequencies (equalizer), and to get audio levels. It all works now. When I stop and try to restart AVAudioEngine I face a problem: it does not start as…
Hope · 2,096
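
For reference, a minimal stop/restart sequence looks like the sketch below, assuming the node graph is already attached and connected; whether a restart succeeds often depends on the session state and graph at the time start() is called again.

    import AVFoundation

    // Minimal restart pattern for an engine and player that are already wired up.
    func restart(_ engine: AVAudioEngine, player: AVAudioPlayerNode) {
        player.stop()
        engine.stop()
        engine.reset()      // tear down the prepared render resources
        engine.prepare()    // preallocate them again before starting
        do {
            try engine.start()
            player.play()
        } catch {
            print("Restart failed: \(error)")
        }
    }
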
3 votes · 1 answer

Is it possible to make an AUv3 extension of type 'aufc' (kAudioUnitType_FormatConverter)?

I wrote a time stretch algorithm that sounds much better than AVAudioUnitTimePitch. I was making an iOS app around it using AVAudioEngine, thinking I could package my algorithm as an AUv3 extension and simply replace AVAudioUnitTimePitch. However,…
Steve M · 9,296
3 votes · 2 answers

AVAudioFile write not working Swift iOS 14

I have the following function: private func getPCMBuffer(utterance: AVSpeechUtterance, completion: @escaping (Result) -> Void ) { speechSynthesizer.write(utterance) { (buffer: AVAudioBuffer)…
Ali · 31
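
For context, a hedged sketch of writing synthesized speech to a file with AVSpeechSynthesizer.write(_:toBufferCallback:); the output URL and utterance are placeholders, and buffers with zero frames (which the callback can deliver to signal the end of the stream) are skipped.

    import AVFoundation

    let synthesizer = AVSpeechSynthesizer()
    let utterance = AVSpeechUtterance(string: "Hello")
    let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent("speech.caf")

    var outputFile: AVAudioFile?
    synthesizer.write(utterance) { buffer in
        guard let pcmBuffer = buffer as? AVAudioPCMBuffer,
              pcmBuffer.frameLength > 0 else { return }   // empty buffer marks the end
        do {
            if outputFile == nil {
                // Create the file lazily so it uses the format the synthesizer delivers.
                outputFile = try AVAudioFile(forWriting: outputURL,
                                             settings: pcmBuffer.format.settings)
            }
            try outputFile?.write(from: pcmBuffer)
        } catch {
            print("write failed: \(error)")
        }
    }
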
3 votes · 0 answers

Swift: How to convert AVAudioEngine's stereo output to mono

I'm trying to convert the output of my AVAudioEngine from stereo to mono. I want to be able to send the converted mono signal to the left or right headphone or speaker using AVAudioPlayerNode's pan property. engine.connect(audioFilePlayer[i], to:…
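
One possible approach, sketched below under the assumption that an intermediate AVAudioMixerNode can downmix its stereo input to a mono output and then be panned via its AVAudioMixing properties; formats and node names are illustrative.

    import AVFoundation

    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let monoMixer = AVAudioMixerNode()   // downmixes its stereo input to a mono output

    engine.attach(player)
    engine.attach(monoMixer)

    let stereoFormat = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!
    let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)!

    engine.connect(player, to: monoMixer, format: stereoFormat)              // stereo into the submixer
    engine.connect(monoMixer, to: engine.mainMixerNode, format: monoFormat)  // mono out of it

    do { try engine.start() } catch { print("start failed: \(error)") }

    // Pan the (now mono) signal hard left or right on its way into the main mixer.
    monoMixer.pan = -1.0   // -1 = left, 1 = right
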
3 votes · 1 answer

AVAudioEngine audio stops when switching the audio output device

I have this simple Swift code that uses an AVAudioEngine + AVAudioPlayerNode to play an audio file on loop. When I start the app the audio plays on the laptop speakers. If I switch my computer's output to a HomePod mini, the audio on the laptop…
Marin Todorov · 6,377
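
For reference, a sketch of the usual recovery: observe AVAudioEngineConfigurationChange, which the engine posts when the output device changes, and restart the engine and player. The class name is illustrative.

    import AVFoundation

    final class LoopPlayer {
        let engine = AVAudioEngine()
        let player = AVAudioPlayerNode()
        private var observer: NSObjectProtocol?

        init(file: AVAudioFile) throws {
            engine.attach(player)
            engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)

            // The engine stops when its configuration changes; restart it on the new device.
            observer = NotificationCenter.default.addObserver(
                forName: .AVAudioEngineConfigurationChange,
                object: engine, queue: .main) { [weak self] _ in
                guard let self = self else { return }
                try? self.engine.start()
                self.player.scheduleFile(file, at: nil, completionHandler: nil)
                self.player.play()
            }

            try engine.start()
            player.scheduleFile(file, at: nil, completionHandler: nil)
            player.play()
        }
    }
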
3 votes · 1 answer

AVAudioEngine Not Getting Microphone Input

First ever Stack Overflow question here, so bear with me please! In the process of designing a larger audio program for macOS, I'm trying to create a test application that can simply take audio from any system audio input and send it to any output. To…
Ben Betts · 31
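
A minimal pass-through sketch for comparison; on macOS, an input format reporting zero channels usually means the app is missing the NSMicrophoneUsageDescription key or, for sandboxed apps, the com.apple.security.device.audio-input entitlement.

    import AVFoundation

    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.inputFormat(forBus: 0)

    if format.channelCount > 0 {
        // Route the system input straight to the output through the main mixer.
        engine.connect(input, to: engine.mainMixerNode, format: format)
        do { try engine.start() } catch { print("start failed: \(error)") }
    } else {
        // 0 channels: microphone access has not been granted to this process.
        print("No input channels; check NSMicrophoneUsageDescription and sandbox entitlements.")
    }
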
3 votes · 1 answer

AVAudioEngine: HALC_ShellObject errors, no audio. What can I even look for?

In an iOS app running on the Mac via Mac Catalyst, I'm using AVAudioEngine to play an audio file from the bundle. That works totally fine on my iOS devices, but not on the Mac. The code is pretty simple. Both my existing project and the fresh…
Benjamin Schmidt · 1,051
3 votes · 1 answer

Stream audio with Swift

I'm developing an application that should record a user's voice and stream it to a custom device via the MQTT protocol. The audio specification for the custom device: little-endian, unsigned, 16-bit LPCM at an 8 kHz sample rate. Packets should be 1000…
SunnyBunny27 · 83
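
A hedged sketch of the capture-and-convert half of this task (the MQTT publishing is out of scope); it assumes the device spec maps to signed 16-bit little-endian PCM, which is the closest common Core Audio format, and converts the microphone tap to 8 kHz mono with AVAudioConverter.

    import AVFoundation

    let engine = AVAudioEngine()
    let input = engine.inputNode
    let inputFormat = input.outputFormat(forBus: 0)

    // Target format from the question: 16-bit LPCM, mono, 8 kHz.
    let targetFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 8_000,
                                     channels: 1, interleaved: true)!
    let converter = AVAudioConverter(from: inputFormat, to: targetFormat)!

    input.installTap(onBus: 0, bufferSize: 4096, format: inputFormat) { buffer, _ in
        let ratio = targetFormat.sampleRate / inputFormat.sampleRate
        let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio)
        guard let outBuffer = AVAudioPCMBuffer(pcmFormat: targetFormat,
                                               frameCapacity: capacity) else { return }

        var consumed = false
        var error: NSError?
        let status = converter.convert(to: outBuffer, error: &error) { _, inputStatus in
            // Feed the tap buffer exactly once per conversion call.
            if consumed { inputStatus.pointee = .noDataNow; return nil }
            consumed = true
            inputStatus.pointee = .haveData
            return buffer
        }
        guard status != .error else { return }
        // outBuffer.int16ChannelData now holds 8 kHz samples ready to be chunked into packets.
    }

    do { try engine.start() } catch { print("start failed: \(error)") }
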
3 votes · 1 answer

How can mp3 data in memory be loaded into an AVAudioPCMBuffer in Swift?

I have a class method to read an mp3 file into an AVAudioPCMBuffer as follows: private(set) var fullAudio: AVAudioPCMBuffer? func initAudio(audioFileURL: URL) -> Bool { var status = true do { let audioFile = try…
gimbal · 63
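
Since AVAudioFile only reads from a URL, one workaround for MP3 data that exists only in memory is to stage it in a temporary file and decode from there; a sketch (the function name is illustrative):

    import AVFoundation

    func pcmBuffer(fromMP3Data data: Data) throws -> AVAudioPCMBuffer? {
        // Write the in-memory MP3 to a temporary file so AVAudioFile can open it.
        let tempURL = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("mp3")
        try data.write(to: tempURL)
        defer { try? FileManager.default.removeItem(at: tempURL) }

        let file = try AVAudioFile(forReading: tempURL)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                            frameCapacity: AVAudioFrameCount(file.length)) else {
            return nil
        }
        try file.read(into: buffer)   // decodes the whole MP3 into PCM
        return buffer
    }
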
3 votes · 0 answers

How to Play PCM 16 bit using AVAudioPlayerNode?

I have an app decoding streamed audio over the network from Opus to PCM; the output data is 16-bit PCM. I decode it successfully, but when I try to play it using AVAudioEngine it fails: I cannot change the sample rate from 44100 to 48000, which Opus…
Ahmed · 112
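
A sketch of one way to play 48 kHz Int16 data through an AVAudioPlayerNode: connect the player with a 48 kHz float format (the main mixer resamples to the hardware rate) and convert each decoded Int16 buffer to float before scheduling. The helper function is hypothetical.

    import AVFoundation

    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attach(player)

    // The decoder produces interleaved 16-bit PCM at 48 kHz (Opus' native rate);
    // the player is connected with a 48 kHz float format and the main mixer
    // resamples to whatever rate the output hardware actually runs at.
    let int16Format = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 48_000,
                                    channels: 1, interleaved: true)!
    let floatFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 1)!
    let converter = AVAudioConverter(from: int16Format, to: floatFormat)!

    engine.connect(player, to: engine.mainMixerNode, format: floatFormat)
    do { try engine.start() } catch { print("start failed: \(error)") }
    player.play()

    // Hypothetical helper: call this for every decoded Opus packet.
    func schedule(decoded int16Buffer: AVAudioPCMBuffer) {
        guard let out = AVAudioPCMBuffer(pcmFormat: floatFormat,
                                         frameCapacity: int16Buffer.frameLength) else { return }
        try? converter.convert(to: out, from: int16Buffer)   // same rate, so the simple variant suffices
        player.scheduleBuffer(out, completionHandler: nil)
    }
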
3 votes · 1 answer

AVAudioEngineImpl::IOUnitConfigurationChanged() Crash

I use AVAudioEngine. In crash reports I see some random EXC_BREAKPOINT crashes with this stack trace: Exception Type: EXC_BREAKPOINT (SIGTRAP) Exception Codes: 0x0000000000000001, 0x00000001896df0e4 Termination Signal: Trace/BPT trap: 5 …
Martin Vandzura · 3,047
3 votes · 1 answer

How to change AVAudioPlayerNode format when user switches headphones

I need to play an arbitrary tone that is generated based on the sample rate of the output device. User is required to connect wired or wireless headphones in order to listen to the audio. Since modern headphones can have native sample rate of 44100…
Roman Samoilenko · 932
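
A sketch of the usual pattern: observe route changes and rebuild the player's connection with the new output sample rate before restarting. The class name and the tone-generation call are hypothetical placeholders.

    import AVFoundation

    final class TonePlayer {
        private let engine = AVAudioEngine()
        private let player = AVAudioPlayerNode()
        private var observer: NSObjectProtocol?

        init() {
            engine.attach(player)
            connectForCurrentRoute()
            observer = NotificationCenter.default.addObserver(
                forName: AVAudioSession.routeChangeNotification,
                object: nil, queue: .main) { [weak self] _ in
                self?.rebuild()
            }
        }

        // Rebuild the player -> mixer connection using the sample rate of the
        // current output route, then restart the (hypothetical) generated tone.
        private func connectForCurrentRoute() {
            let hardwareRate = engine.outputNode.outputFormat(forBus: 0).sampleRate
            let format = AVAudioFormat(standardFormatWithSampleRate: hardwareRate, channels: 2)
            engine.connect(player, to: engine.mainMixerNode, format: format)
            do { try engine.start() } catch { print("start failed: \(error)") }
            // scheduleTone(sampleRate: hardwareRate)   // hypothetical tone-buffer generator
            player.play()
        }

        private func rebuild() {
            player.stop()
            engine.stop()
            connectForCurrentRoute()
        }
    }
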
3 votes · 1 answer

swift AVAudioEngine and AVAudioSinkNode sampleRate convert

I have been having issues with this for a while now, and have written the following Swift file that can be run as the main view controller file for an app. Upon execution, it will play a short blast of a 1 kHz sine wave. It will simultaneously…
samp17 · 547
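
For reference, a minimal AVAudioSinkNode setup; the sink receives samples in the hardware input format and performs no resampling, so any sample-rate conversion has to be applied to the data received in the block.

    import AVFoundation

    let engine = AVAudioEngine()
    let inputFormat = engine.inputNode.inputFormat(forBus: 0)

    let sink = AVAudioSinkNode { (_, frameCount, audioBufferList) -> OSStatus in
        let buffers = UnsafeMutableAudioBufferListPointer(UnsafeMutablePointer(mutating: audioBufferList))
        // Real-time context: inspect or copy `buffers` without allocating or locking.
        _ = (frameCount, buffers.count)
        return noErr
    }

    engine.attach(sink)
    engine.connect(engine.inputNode, to: sink, format: inputFormat)
    do { try engine.start() } catch { print("start failed: \(error)") }
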
3 votes · 1 answer

AVAudioEngine.connect crash on hardware not simulator

var engine:AVAudioEngine! var format = engine.inputNode.inputFormat(forBus: 0) engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format) The call to AVAudioEngine.connect makes my app crash only on hardware, but in the simulator it's…
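
A frequent cause of this on a real device is an input format with zero channels (for example, before microphone permission is granted or a record-capable session is active), which the simulator masks; a defensive sketch:

    import AVFoundation

    let engine = AVAudioEngine()

    // On iOS, activate a record-capable session before touching inputNode.
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord, mode: .default)
        try session.setActive(true)
    } catch {
        print("session error: \(error)")
    }

    let format = engine.inputNode.inputFormat(forBus: 0)
    if format.channelCount > 0 {
        engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)
        do { try engine.start() } catch { print("start failed: \(error)") }
    } else {
        // A 0-channel format is what typically makes connect throw on a real device.
        print("No input channels; request microphone permission before connecting.")
    }
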