Questions tagged [audiounit]

Audio Units are plug-ins for Apple's CoreAudio framework which generate or process audio streams.

Audio Units (commonly referred to simply as "AU") are plug-ins for Apple's CoreAudio system that can generate or process audio streams. Although Audio Units were originally created to compete with Steinberg's VST SDK, the two have since evolved in quite different directions.

Currently, macOS and iOS are the only platforms that support the Audio Unit SDK. Both ship with a number of built-in Audio Units for basic audio stream operations, and it is possible to write custom Audio Units on either platform.


752 questions
1
vote
1 answer

How to resolve "Hardware In Use" issue (error code: 'hwiu')?

I have created an iPhone app with recording (using AudioUnit), conversion, audio editing, and merging parts. I have done everything except conversion. This app will work only on iOS 4 or higher. I tried to convert a .caf file to an .m4a file. But I am getting…
jfalexvijay
  • 3,681
  • 7
  • 42
  • 68
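A minimal sketch of the kind of offline .caf to .m4a (AAC) conversion the question describes, using ExtAudioFile from AudioToolbox. This is not the asker's code; it assumes a packed 16-bit PCM client format and at most two channels.

```c
#include <AudioToolbox/AudioToolbox.h>

// Minimal offline .caf -> .m4a (AAC) conversion; error handling trimmed.
static OSStatus ConvertCafToM4a(CFURLRef srcURL, CFURLRef dstURL) {
    ExtAudioFileRef src = NULL, dst = NULL;
    OSStatus err = ExtAudioFileOpenURL(srcURL, &src);
    if (err) return err;

    // Read the source's data format so we can match channel count / sample rate.
    AudioStreamBasicDescription srcFormat = {0};
    UInt32 size = sizeof(srcFormat);
    ExtAudioFileGetProperty(src, kExtAudioFileProperty_FileDataFormat, &size, &srcFormat);

    // Destination: AAC in an .m4a container.
    AudioStreamBasicDescription dstFormat = {0};
    dstFormat.mFormatID = kAudioFormatMPEG4AAC;
    dstFormat.mSampleRate = srcFormat.mSampleRate;
    dstFormat.mChannelsPerFrame = srcFormat.mChannelsPerFrame;
    err = ExtAudioFileCreateWithURL(dstURL, kAudioFileM4AType, &dstFormat,
                                    NULL, kAudioFileFlags_EraseFile, &dst);
    if (err) { ExtAudioFileDispose(src); return err; }

    // Both files exchange data with us in linear PCM (the "client" format).
    AudioStreamBasicDescription pcm = {0};
    pcm.mFormatID = kAudioFormatLinearPCM;
    pcm.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    pcm.mSampleRate = srcFormat.mSampleRate;
    pcm.mChannelsPerFrame = srcFormat.mChannelsPerFrame;
    pcm.mBitsPerChannel = 16;
    pcm.mBytesPerFrame = 2 * pcm.mChannelsPerFrame;
    pcm.mBytesPerPacket = pcm.mBytesPerFrame;
    pcm.mFramesPerPacket = 1;
    ExtAudioFileSetProperty(src, kExtAudioFileProperty_ClientDataFormat, sizeof(pcm), &pcm);
    ExtAudioFileSetProperty(dst, kExtAudioFileProperty_ClientDataFormat, sizeof(pcm), &pcm);

    // Pump PCM from source to destination; ExtAudioFile encodes to AAC on write.
    enum { kFrames = 4096 };
    char buffer[kFrames * 2 * 2];               // assumes <= 2 channels of 16-bit PCM
    AudioBufferList list = { .mNumberBuffers = 1 };
    for (;;) {
        list.mBuffers[0].mNumberChannels = pcm.mChannelsPerFrame;
        list.mBuffers[0].mDataByteSize = sizeof(buffer);
        list.mBuffers[0].mData = buffer;
        UInt32 frames = kFrames;
        err = ExtAudioFileRead(src, &frames, &list);
        if (err || frames == 0) break;          // 0 frames == end of file
        err = ExtAudioFileWrite(dst, frames, &list);
        if (err) break;
    }
    ExtAudioFileDispose(src);
    ExtAudioFileDispose(dst);
    return err;
}
```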
1
vote
0 answers

Using AVAudioUnitSampler as Audio Unit v3 Extension – distorted sound

I have a music app that uses AVAudioUnitSampler to generate sounds. I'm trying to make an Audio Unit Extension, so my app can be used in host apps like GarageBand. I implement AUViewController and return the underlying…
Pavel Alexeev
  • 6,026
  • 4
  • 43
  • 51
1
vote
1 answer

Core Audio sound metering on multiple file players

I have an AUGraph setup with a couple of File Player audio units feeding into a MultiChannelMixer unit, which then feeds into a Remote I/O output. This setup has worked just fine. Now I've been struggling to add a callback in such a way that I can…
Leonhard Printz
  • 355
  • 2
  • 16
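One way to get per-player levels without adding a render callback at all, assuming the MultiChannelMixer's built-in metering is acceptable: enable kAudioUnitProperty_MeteringMode and poll the mixer's level parameters from a timer. A sketch:

```c
#include <AudioToolbox/AudioToolbox.h>

// Enable the MultiChannelMixer's built-in metering on one input bus, then poll
// its level parameters from a UI timer; no extra render callback needed.
static OSStatus EnableMixerMetering(AudioUnit mixer, UInt32 bus) {
    UInt32 on = 1;
    return AudioUnitSetProperty(mixer, kAudioUnitProperty_MeteringMode,
                                kAudioUnitScope_Input, bus, &on, sizeof(on));
}

// Average power in dB (roughly -160 .. 0) for mixer input bus `bus`.
static Float32 MixerAveragePowerDB(AudioUnit mixer, UInt32 bus) {
    AudioUnitParameterValue level = -160.0f;
    AudioUnitGetParameter(mixer, kMultiChannelMixerParam_PostAveragePower,
                          kAudioUnitScope_Input, bus, &level);
    return level;
}
```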
1
vote
1 answer

RemoteIO on iOS11 with AirPods

In iOS 10 or below, when we had an AVAudioSession set with the playAndRecord category, AirPods used to be picked up as the default input & output when configuring RemoteIO. I could suppress output by silencing the samples in the callback, but starting with…
Deepak Sharma
  • 5,577
  • 7
  • 55
  • 131
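For the "silencing the samples in the callback" part, a sketch of a RemoteIO render callback that zeroes the output buffers and flags them as silence; the PlaybackState struct and its muted flag are illustrative, not from the question:

```c
#include <AudioToolbox/AudioToolbox.h>
#include <string.h>

// Illustrative state shared with the callback; `muted` is toggled elsewhere.
typedef struct { volatile int muted; } PlaybackState;

// Render callback on RemoteIO's output bus: when muted, hand back zeroed
// buffers and tell the unit the output is silence.
static OSStatus MutingRenderCallback(void *inRefCon,
                                     AudioUnitRenderActionFlags *ioActionFlags,
                                     const AudioTimeStamp *inTimeStamp,
                                     UInt32 inBusNumber,
                                     UInt32 inNumberFrames,
                                     AudioBufferList *ioData) {
    PlaybackState *state = (PlaybackState *)inRefCon;
    if (state->muted) {
        for (UInt32 i = 0; i < ioData->mNumberBuffers; i++)
            memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
        *ioActionFlags |= kAudioUnitRenderAction_OutputIsSilence;
        return noErr;
    }
    // ... otherwise fill ioData with real samples here ...
    return noErr;
}
```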
1
vote
1 answer

The codec could not be accessed. (-66672)

I am trying to convert a caf file to an m4a file using AudioUnit. I have implemented the code to convert. When I tried to run the application, I got the following error message: "couldn't set destination client format (-66672)". I got the sample code…
jfalexvijay
  • 3,681
  • 7
  • 42
  • 68
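Failures like 'hwiu' and -66672 are often reported when the hardware AAC codec is busy or unavailable. One commonly suggested workaround (hedged, since the asker's exact cause isn't shown in the excerpt) is to request the software encoder on the destination ExtAudioFile before setting its client format:

```c
#include <AudioToolbox/AudioToolbox.h>

// Ask ExtAudioFile to encode with Apple's software AAC codec instead of the
// hardware codec, which can only serve one client at a time. Must be set on
// the destination file BEFORE kExtAudioFileProperty_ClientDataFormat.
static OSStatus UseSoftwareAACEncoder(ExtAudioFileRef destFile) {
    UInt32 codec = kAppleSoftwareAudioCodecManufacturer;
    return ExtAudioFileSetProperty(destFile,
                                   kExtAudioFileProperty_CodecManufacturer,
                                   sizeof(codec), &codec);
}
```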
1
vote
1 answer

OSX CoreAudio query for bluetooth device manufacturer returns NULL - is there any way for me to then reference this device for AudioComponentFindNext?

I am querying all active input devices in macOS and then trying to use AudioUnit to play audio through the Bluetooth device (if connected). I have a Bluetooth device that returns a UID and device name but fails to return a device manufacturer…
spartygw
  • 3,289
  • 2
  • 27
  • 51
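On macOS the usual pattern is not to look up a device-specific component with AudioComponentFindNext, but to open the generic AUHAL output unit and point it at the AudioDeviceID already obtained from the HAL, regardless of whether the manufacturer string came back NULL. A sketch:

```c
#include <AudioToolbox/AudioToolbox.h>
#include <CoreAudio/CoreAudio.h>

// Open the generic HAL output unit and route it to a specific AudioDeviceID
// (e.g. the Bluetooth output found via the HAL), then initialize it.
static OSStatus MakeOutputUnitForDevice(AudioDeviceID deviceID, AudioUnit *outUnit) {
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_HALOutput,
        .componentManufacturer = kAudioUnitManufacturer_Apple,
    };
    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    if (comp == NULL) return kAudioUnitErr_NoConnection;

    OSStatus err = AudioComponentInstanceNew(comp, outUnit);
    if (err != noErr) return err;

    // Route this output unit to the chosen device.
    err = AudioUnitSetProperty(*outUnit, kAudioOutputUnitProperty_CurrentDevice,
                               kAudioUnitScope_Global, 0, &deviceID, sizeof(deviceID));
    if (err != noErr) return err;

    return AudioUnitInitialize(*outUnit);
}
```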
1
vote
0 answers

Using 3rd party audio units (.component) in application

Can I use 3rd-party audio units with the .component extension in an application by copying the AU into the project? I know that I can copy the .component to ~/Library/Audio/Plug-Ins/Components/ and then get all components by AudioComponentDescription descr =…
Dmitrii
  • 85
  • 1
  • 7
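For reference, a sketch of enumerating whatever Audio Unit components the system has registered. This is also why copying a .component into the app bundle alone is usually not enough: it normally has to be installed in a plug-in folder (or registered explicitly) before it appears in this list.

```c
#include <AudioToolbox/AudioToolbox.h>

// Enumerate every effect Audio Unit the system has registered and print its
// name. Components installed in ~/Library/Audio/Plug-Ins/Components (or the
// system-wide folder) show up here; a .component merely bundled inside an
// app does not.
static void ListEffectUnits(void) {
    AudioComponentDescription wildcard = { .componentType = kAudioUnitType_Effect };
    AudioComponent comp = NULL;
    while ((comp = AudioComponentFindNext(comp, &wildcard)) != NULL) {
        CFStringRef name = NULL;
        if (AudioComponentCopyName(comp, &name) == noErr && name != NULL) {
            CFShow(name);              // e.g. "Apple: AUDelay"
            CFRelease(name);
        }
    }
}
```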
1
vote
0 answers

Stop AudioUnit speech

I'm implementing a speech synthesizer using Audio Unit, based on the Core Audio examples. Everything works as expected, except that StopSpeech and StopSpeechAt appear to do nothing. Here are the speak and stop methods: void Synthesizer::speak( const…
Ian
  • 2,078
  • 1
  • 17
  • 27
1
vote
0 answers

iOS AudioUnit callback hiccup on going to background mode

I'm writing an application to play an RTP audio stream on iOS devices. It works well in both foreground and background modes, but while switching to the background it gets stuck for about a second. Trying to investigate that issue, I found that during…
1
vote
2 answers

Effect AudioUnit Only Calling Render Callback Once

What I'm trying to accomplish is to process an array of audio data through a Core Audio effect unit and get the manipulated data back (without playing it, i.e. offline). I've hit a wall and it's probably something very basic that I'm not…
Brian Toth
  • 529
  • 4
  • 14
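A common cause of "the render callback fires only once" in offline processing is calling AudioUnitRender a single time, or not advancing the timestamp between calls. A sketch of an offline pull loop, assuming the effect unit is initialized and already has an input render callback installed:

```c
#include <AudioToolbox/AudioToolbox.h>

// Pull `totalFrames` of audio through an initialized effect unit offline.
// `scratch` must describe buffers large enough for `framesPerSlice` frames.
static OSStatus RenderOffline(AudioUnit effect, UInt32 totalFrames,
                              UInt32 framesPerSlice, AudioBufferList *scratch) {
    AudioTimeStamp ts = {0};
    ts.mFlags = kAudioTimeStampSampleTimeValid;
    ts.mSampleTime = 0;

    for (UInt32 done = 0; done < totalFrames; done += framesPerSlice) {
        UInt32 frames = totalFrames - done;
        if (frames > framesPerSlice) frames = framesPerSlice;

        AudioUnitRenderActionFlags flags = 0;
        OSStatus err = AudioUnitRender(effect, &flags, &ts, 0, frames, scratch);
        if (err != noErr) return err;

        // ... copy `frames` processed frames out of `scratch` here ...

        ts.mSampleTime += frames;   // without this, every call renders the same slice
    }
    return noErr;
}
```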
1
vote
0 answers

When is mChannelLayoutTag not kAudioChannelLayoutTag_UseChannelDescriptions

I was wondering if someone could tell me when mChannelLayoutTag in AudioChannelLayout is not kAudioChannelLayoutTag_UseChannelDescriptions, if the AudioChannelLayout is returned from AudioUnitGetProperty with…
CMChang
  • 51
  • 5
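In practice the returned layout often carries a bare tag such as kAudioChannelLayoutTag_Stereo rather than kAudioChannelLayoutTag_UseChannelDescriptions; in that case the tag can be expanded into explicit descriptions with AudioFormatGetProperty. A sketch (the kAudioChannelLayoutTag_UseChannelBitmap case would need kAudioFormatProperty_ChannelLayoutForBitmap instead):

```c
#include <AudioToolbox/AudioToolbox.h>
#include <stdlib.h>

// Copy an AU's output channel layout; if it comes back as a bare tag,
// expand the tag into explicit channel descriptions. Caller frees the result.
static AudioChannelLayout *CopyExpandedLayout(AudioUnit unit) {
    UInt32 size = 0;
    Boolean writable = false;
    if (AudioUnitGetPropertyInfo(unit, kAudioUnitProperty_AudioChannelLayout,
                                 kAudioUnitScope_Output, 0, &size, &writable) != noErr)
        return NULL;

    AudioChannelLayout *layout = calloc(1, size);
    if (AudioUnitGetProperty(unit, kAudioUnitProperty_AudioChannelLayout,
                             kAudioUnitScope_Output, 0, layout, &size) != noErr) {
        free(layout);
        return NULL;
    }
    if (layout->mChannelLayoutTag == kAudioChannelLayoutTag_UseChannelDescriptions)
        return layout;                       // descriptions already present

    // Expand the tag into concrete channel descriptions.
    UInt32 expandedSize = 0;
    AudioFormatGetPropertyInfo(kAudioFormatProperty_ChannelLayoutForTag,
                               sizeof(AudioChannelLayoutTag),
                               &layout->mChannelLayoutTag, &expandedSize);
    AudioChannelLayout *expanded = calloc(1, expandedSize);
    AudioFormatGetProperty(kAudioFormatProperty_ChannelLayoutForTag,
                           sizeof(AudioChannelLayoutTag),
                           &layout->mChannelLayoutTag, &expandedSize, expanded);
    free(layout);
    return expanded;
}
```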
1
vote
1 answer

Objective-C/Swift usage in AudioUnit Render callback

Apple advises against the use of Objective-C and Swift in AudioUnit input or render callbacks. So the callbacks are written mostly in C, where we can quickly copy data and pass that data to a secondary thread for processing. To share variables between C…
Deepak Sharma
  • 5,577
  • 7
  • 55
  • 131
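One common pattern for the "share variables between C and Objective-C/Swift" part: keep the shared state in a plain C struct passed through inRefCon and read it with atomics inside the callback. A sketch, assuming Float32 samples are already present in ioData when this runs (e.g. in a post-render notify); the RenderShared fields are illustrative:

```c
#include <AudioToolbox/AudioToolbox.h>
#include <stdatomic.h>

// Shared state is a plain C struct: the Objective-C/Swift side writes with
// atomic stores, the render callback only does atomic loads; no locks, no
// allocation, no Objective-C objects on the render thread.
typedef struct {
    _Atomic float gain;     // written from the UI thread
    _Atomic int   bypass;   // 1 = output silence
} RenderShared;

static OSStatus SharedStateRenderCallback(void *inRefCon,
                                          AudioUnitRenderActionFlags *ioActionFlags,
                                          const AudioTimeStamp *inTimeStamp,
                                          UInt32 inBusNumber,
                                          UInt32 inNumberFrames,
                                          AudioBufferList *ioData) {
    RenderShared *shared = (RenderShared *)inRefCon;
    float gain = atomic_load_explicit(&shared->gain,   memory_order_relaxed);
    int bypass = atomic_load_explicit(&shared->bypass, memory_order_relaxed);

    // Apply the locally copied values to the Float32 samples in ioData.
    for (UInt32 b = 0; b < ioData->mNumberBuffers; b++) {
        float *samples = (float *)ioData->mBuffers[b].mData;
        UInt32 n = ioData->mBuffers[b].mDataByteSize / sizeof(float);
        for (UInt32 i = 0; i < n; i++)
            samples[i] = bypass ? 0.0f : samples[i] * gain;
    }
    return noErr;
}
```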
1
vote
0 answers

Format settings for iOS multimixer AU when using a Bluetooth endpoint

Hi Core Audio/AU community, I have hit a roadblock during development. My current AUGraph is set up as 2 mono streams -> Mixer unit -> RemoteIO unit on an iOS platform. I am using the mixer to mix two mono streams into stereo interleaved. However, the…
SamB
  • 21
  • 2
1
vote
1 answer

ExtAudioFileRead is too slow. How to make it faster?

I've written my own audio library. (Silly, I know but I enjoy it.) Right now, I'm using it to play stereo files from the iPod library on iOS. It works well except that sometimes the call to ExtAudioFileRead() takes longer than 23ms. Since my…
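One standard fix is to never call ExtAudioFileRead on the render thread at all: a background thread keeps a FIFO topped up and the render callback only copies out of it. A sketch; RingBuffer, RingBufferSpaceFrames, and RingBufferWrite are placeholders for whatever FIFO the library already has, not a real API:

```c
#include <AudioToolbox/AudioToolbox.h>
#include <unistd.h>

// Hypothetical FIFO (e.g. a TPCircularBuffer-style structure the library
// already owns); these three names are placeholders, not a real API.
typedef struct RingBuffer RingBuffer;
UInt32 RingBufferSpaceFrames(RingBuffer *rb);
void   RingBufferWrite(RingBuffer *rb, const AudioBufferList *abl, UInt32 frames);

// Background-thread loop: keep the FIFO topped up so the render callback never
// touches ExtAudioFileRead, and the occasional slow disk read stops mattering.
static void ReaderLoop(ExtAudioFileRef file, RingBuffer *rb, AudioBufferList *scratch,
                       UInt32 scratchCapacityFrames, UInt32 scratchCapacityBytes) {
    for (;;) {
        if (RingBufferSpaceFrames(rb) < scratchCapacityFrames) {
            usleep(5 * 1000);            // FIFO is comfortably full; check again in 5 ms
            continue;
        }
        // ExtAudioFileRead shrinks the byte sizes on short reads, so reset them.
        for (UInt32 i = 0; i < scratch->mNumberBuffers; i++)
            scratch->mBuffers[i].mDataByteSize = scratchCapacityBytes;

        UInt32 frames = scratchCapacityFrames;
        if (ExtAudioFileRead(file, &frames, scratch) != noErr || frames == 0)
            break;                       // error or end of file
        RingBufferWrite(rb, scratch, frames);
    }
}
```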
1
vote
0 answers

Using AudioUnit AUConverter to upsample causes the playback speed to increase

My current AUGraph is set up as Mixer unit -> RemoteIO unit on an iOS platform. I am using the mixer to mix two mono streams into stereo interleaved. Everything works fine when the sample rates of the streams and the hardware match, i.e. 44100 Hz. When the…
SamB
  • 21
  • 2
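For the converter to actually resample rather than just relabel the stream (which is what makes playback run fast), its input format has to describe the data's real rate and its output format the new rate. A hedged sketch, assuming an AUConverter unit is the one doing the sample-rate conversion:

```c
#include <AudioToolbox/AudioToolbox.h>

// Give the AUConverter two different stream formats: the input ASBD describes
// the data's real rate, the output ASBD the graph/hardware rate. If both sides
// claim the new rate, nothing is resampled and playback simply runs fast.
static OSStatus ConfigureConverterForSRC(AudioUnit converter,
                                         AudioStreamBasicDescription inFormat, // e.g. 44100 Hz
                                         Float64 outSampleRate) {              // e.g. 48000 Hz
    OSStatus err = AudioUnitSetProperty(converter, kAudioUnitProperty_StreamFormat,
                                        kAudioUnitScope_Input, 0,
                                        &inFormat, sizeof(inFormat));
    if (err != noErr) return err;

    AudioStreamBasicDescription outFormat = inFormat;   // same layout, new rate
    outFormat.mSampleRate = outSampleRate;
    return AudioUnitSetProperty(converter, kAudioUnitProperty_StreamFormat,
                                kAudioUnitScope_Output, 0,
                                &outFormat, sizeof(outFormat));
}
```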