Questions tagged [audiounit]

Audio Units (commonly referred to simply as "AU") are plug-ins for Apple's Core Audio system that can generate or process audio streams. Although Audio Units were originally created to compete with Steinberg's VST SDK, the two technologies have since evolved in quite different directions.

Currently, OS X and iOS are the only platforms that support the Audio Unit SDK. Both ship with a number of built-in Audio Units for basic audio stream operations, and it is also possible to write custom Audio Units on either platform.

752 questions
9 votes, 1 answer

How do you set the input level (gain) on the built-in input (OSX Core Audio / Audio Unit)?

I've got an OS X app that records audio data using an Audio Unit. The Audio Unit's input can be set to any available source with inputs, including the built-in input. The problem is, the audio that I get from the built-in input is often clipped,…
jnpdx • 45,847
8 votes, 5 answers

How to make a simple EQ AudioUnit (bass, mid, treble) with iOS?

Does anyone know how to make a simple EQ audio unit (3 bands - low, mid, hi) with iOS? I know how to add an iPod EQ Audio Unit to my AU Graph, but it only gives you access to presets and I need proper control of the EQ. I've looked around for some…
André • 671
8 votes, 0 answers

Routing AVSpeechSynthesizer to AU Host in iOS

Is it possible to grab/save/route the output of the AVSpeechSynthesizer so that it can be manipulated as audio, run through audio processing (kAudioUnitSubType_Reverb2), and then played?
grigb • 1,151
8 votes, 1 answer

iOS: Mute right channel volume

I'm trying to mute the right channel for all audio apart from an audio stream that I control. I am using a number of libraries playing audio including OpenEars for Text-to-speech and I would like all of them to only play out the left headphone…
Josh • 211
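If routing everything through a mixer unit and panning it hard left (e.g. via the mixer's pan parameter) isn't an option, the brute-force alternative is to zero the right-channel samples in a render callback. A sketch on an interleaved stereo buffer (the function name is invented; RemoteIO can also deliver non-interleaved buffers, in which case you'd zero the second buffer instead):

```c
/* Mute the right channel of an interleaved stereo float buffer in
   place. Interleaved layout is L R L R ..., so the right-channel
   samples sit at the odd indices. */
static void mute_right_channel(float *stereo_interleaved, int frames)
{
    for (int i = 0; i < frames; ++i)
        stereo_interleaved[2 * i + 1] = 0.0f;
}
```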
7 votes, 1 answer

How to set the reverb level and time on kAudioUnitSubType_Reverb2

I've managed to add a reverb unit to my graph, more or less like so: AudioComponentDescription auEffectUnitDescription; auEffectUnitDescription.componentType = kAudioUnitType_Effect; auEffectUnitDescription.componentSubType =…
morgancodes • 25,055
7 votes, 0 answers

AVAudioPlayer volume low with VoiceProcessingIO

Using kAudioUnitSubType_VoiceProcessingIO combined with AVAudioPlayer leads to the audio playback volume being pretty low. When switching to kAudioUnitSubType_RemoteIO, the playback volume is back at a properly high level. It depends on…
Martin Mlostek • 2,755
7 votes, 2 answers

AURenderCallback in Swift

I am creating an application that uses Audio Units, and while there are many examples of code in Objective-C (including Apple's own aurioTouch and others) I am attempting to code the entire thing in Swift. I have been able to set up my AUGraph and…
7 votes, 1 answer

iPhone: Change playback speed with Audio Units

What are the different ways to change the playback speed of audio on the iPhone, when using Audio Units? What are the advantages / disadvantages of each solution? I have a mixer unit and an IO unit. Do I need to add another unit (eg. converter…
Tom Ilsinszki • 613
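One option is inserting a converter-type unit such as AUVarispeed (kAudioUnitSubType_Varispeed) between the mixer and the IO unit, which changes pitch along with speed; time-pitch units keep pitch constant instead (which converter subtypes exist has varied across iOS versions). Under the hood, varispeed is just resampling, which a render callback can also do by hand. A naive linear-interpolation sketch (function name invented; a production resampler would use a proper interpolation filter):

```c
/* Naive varispeed: read `in` at `rate` times normal speed by linear
   interpolation. rate > 1 is faster/higher-pitched, rate < 1 slower/
   lower-pitched. Returns the number of output frames produced. */
static int varispeed(const float *in, int in_len,
                     float *out, int out_cap, double rate)
{
    int n = 0;
    for (double pos = 0.0; pos + 1.0 < in_len && n < out_cap; pos += rate) {
        int i = (int)pos;
        double frac = pos - i;   /* blend between in[i] and in[i+1] */
        out[n++] = (float)((1.0 - frac) * in[i] + frac * in[i + 1]);
    }
    return n;
}
```

At rate 2.0 an 8-frame ramp yields 4 frames; at rate 0.5 it yields 14, with interpolated values in between the input samples.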
7 votes, 2 answers

iOS how to play midi notes?

I have searched and have already built an OS X app that can play MIDI notes, but when I tried the same in iOS, nothing happened. Here is the core code: AUGraph graph; AudioUnit synthUnit; AUNode synthNode,…
Smeegol • 2,014
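A frequent cause of silence here (an educated guess, not a confirmed diagnosis of this question) is that the iOS sampler unit has no sound bank loaded, unlike the OS X DLS synth which ships with one; loading a SoundFont via kMusicDeviceProperty_SoundBankURL is the usual fix. Whatever the synth, the note itself is three MIDI bytes passed to MusicDeviceMIDIEvent(synthUnit, status, data1, data2, 0); a sketch of composing them (helper name invented):

```c
/* Build a MIDI note-on message: status 0x90 | channel, then the note
   number and velocity (both 7-bit). Velocity 0 means note-off. */
static void midi_note_on(int channel, int note, int velocity,
                         unsigned char msg[3])
{
    msg[0] = (unsigned char)(0x90 | (channel & 0x0F));
    msg[1] = (unsigned char)(note & 0x7F);      /* 60 = middle C */
    msg[2] = (unsigned char)(velocity & 0x7F);
}
```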
7 votes, 1 answer

iOS AudioUnit settings to save mic input to raw PCM file

I'm currently working on a VOIP project for iOS. I use AudioUnits to get data from the mic and play sounds. My main app is written in C# (Xamarin) and uses a C++ library for faster audio and codec processing. To test the input/output result I'm…
Boardwish • 495
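For verifying the input path, a common trick is to dump the interleaved 16-bit samples from the input render callback straight into a headerless file, then open it in an audio editor as raw data (signed 16-bit, native little-endian on iOS hardware, matching sample rate and channel count). A sketch of the dump helper (name and path handling are assumptions for this sketch; in a real app you'd write into the app's Documents directory):

```c
#include <stdio.h>
#include <stdint.h>

/* Append one callback's worth of interleaved 16-bit PCM to a raw
   file. Returns the number of samples actually written. */
static size_t append_pcm16(const char *path, const int16_t *samples,
                           size_t count)
{
    FILE *f = fopen(path, "ab");          /* append: one file per session */
    if (!f) return 0;
    size_t written = fwrite(samples, sizeof(int16_t), count, f);
    fclose(f);
    return written;
}
```

Calling this from every callback is fine for a debug build, but file I/O on the render thread violates real-time constraints; for production, hand the buffers to a writer thread instead.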
7 votes, 3 answers

Setting sample rate on AUHAL

I'm using the Audio Unit framework to develop a VOIP app on Mac OS X. In my program, I set up an input AUHAL and use the default stream format (44.1 kHz, 32 bits/channel) to capture audio from the mic. In this case, my program works fine. Here is the…
Jun Liu • 133
7 votes, 2 answers

the amazing audio engine how to apply filters to microphone input

I'm trying to make a karaoke app that records the background music from a file together with the microphone. I also want to add filter effects to the microphone input. I can do everything stated above using The Amazing Audio Engine SDK, but I can't figure out how to…
user513790 • 1,225
7 votes, 0 answers

Is the AudioFilePlayer audio unit sandbox compatible?

I've run into a problem using the AudioFilePlayer audio unit with app sandboxing enabled on OS X 10.8. I have an AUGraph with only two nodes, consisting of an AudioFilePlayer unit connected to a DefaultOutput unit. The goal (right now) is to simply…
Andrew Madsen • 21,309
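A plausible culprit (an assumption based on how the sandbox works, not a confirmed diagnosis) is file access rather than the AudioFilePlayer unit itself: the unit opens the file by path, and a sandboxed process is denied path-based access to files the user hasn't granted. Beyond obtaining the file through NSOpenPanel or a security-scoped bookmark before scheduling it, the app's entitlements need to allow the access; a minimal fragment with the standard sandbox keys:

```xml
<key>com.apple.security.app-sandbox</key>
<true/>
<key>com.apple.security.files.user-selected.read-only</key>
<true/>
```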
7 votes, 1 answer

iOS AudioUnits pass through

I am trying to write an iOS application that will pass the sound received from the microphone to the speaker without any changes. I've read Apple's docs and guides and chose the first pattern from this guide. But nothing happens - silence. As you can see…
Max Komarychev • 2,836
7 votes, 2 answers

Understanding Remote I/O AudioStreamBasicDescription (ASBD)

I need help understanding the following ASBD. It's the default ASBD assigned to a fresh instance of RemoteIO (I got it by executing AudioUnitGetProperty(..., kAudioUnitProperty_StreamFormat, ...) on the RemoteIO audio unit, right after allocating…
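The ASBD's size fields are redundant but must agree: for linear PCM, mFramesPerPacket is 1, mBytesPerFrame is bitsPerChannel/8 times the channel count when interleaved, but only bitsPerChannel/8 when kAudioFormatFlagIsNonInterleaved is set (each channel then lives in its own buffer, which is why a two-channel default RemoteIO format can report mBytesPerFrame of 4). A sketch with a stand-in struct mirroring the real AudioStreamBasicDescription field names (the fill helper is invented):

```c
#include <stdbool.h>
#include <stdint.h>

/* Stand-in for AudioStreamBasicDescription (same field names; the
   real struct carries format ID and flags instead of a bool). */
typedef struct {
    double   mSampleRate;
    uint32_t mChannelsPerFrame;
    uint32_t mBitsPerChannel;
    uint32_t mBytesPerFrame;
    uint32_t mBytesPerPacket;
    uint32_t mFramesPerPacket;
    bool     nonInterleaved;   /* models kAudioFormatFlagIsNonInterleaved */
} ASBD;

/* Fill the redundant fields of a linear-PCM description consistently. */
static void fill_pcm(ASBD *d, double rate, uint32_t channels,
                     uint32_t bits, bool nonInterleaved)
{
    d->mSampleRate       = rate;
    d->mChannelsPerFrame = channels;
    d->mBitsPerChannel   = bits;
    d->nonInterleaved    = nonInterleaved;
    /* non-interleaved: one channel per buffer, so a "frame" in each
       buffer holds a single channel's sample */
    d->mBytesPerFrame    = (bits / 8) * (nonInterleaved ? 1 : channels);
    d->mFramesPerPacket  = 1;                 /* always 1 for linear PCM */
    d->mBytesPerPacket   = d->mBytesPerFrame; /* frames/packet * bytes/frame */
}
```

Getting these fields out of agreement is a classic source of kAudioUnitErr_FormatNotSupported and distorted output.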