Questions tagged [audiounit]

Audio Units are plug-ins for Apple's CoreAudio framework which generate or process audio streams.

Audio Units (commonly referred to as simply "AU") are plug-ins for Apple's CoreAudio system which are capable of generating or processing audio streams. Although Audio Units were originally created to compete with Steinberg's VST SDK, they have evolved in quite different directions.

Currently, Mac OS X and iOS are the only platforms that support the Audio Unit SDK. Both ship with a number of built-in Audio Units for basic audio stream operations, and it is also possible to write custom Audio Units on either platform.

752 questions
1
vote
1 answer

Difference between kAudioSessionProperty_OverrideAudioRoute & kAudioSessionProperty_OverrideCategoryDefaultToSpeaker

I'm confused by the audio route override in iOS, and I don't understand Apple's documentation on the difference between kAudioSessionProperty_OverrideAudioRoute and kAudioSessionProperty_OverrideCategoryDefaultToSpeaker. So what's…
ZijingWu
  • 3,350
  • 3
  • 25
  • 40
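
A minimal sketch of the difference the question asks about, using the (now-deprecated) C Audio Session API; error handling omitted:

    #include <AudioToolbox/AudioToolbox.h>

    void OverrideExamples(void) {
        // One-shot override: redirects only the *current* route and is
        // cleared automatically whenever the route changes (e.g. when
        // headphones are plugged in).
        UInt32 route = kAudioSessionOverrideAudioRoute_Speaker;
        AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute,
                                sizeof(route), &route);

        // Category-level override: while PlayAndRecord is active, default
        // output goes to the speaker instead of the receiver, and the
        // setting persists across route changes until you clear it.
        UInt32 toSpeaker = 1;
        AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryDefaultToSpeaker,
                                sizeof(toSpeaker), &toSpeaker);
    }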
1
vote
1 answer

Audio Unit for play and record without filters

I'm developing an app that sets up an audio unit to record and play, using a callback for audio processing. I use a callback because I have to implement my own filters in C. I'm using an Audio Unit with kAudioUnitType_Output and…
DrCachetes
  • 954
  • 1
  • 9
  • 30
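
A sketch of the usual shape of such a callback on a RemoteIO unit: pull the microphone samples with AudioUnitRender, filter them in place, and leave the result in ioData. The filter step here is a hypothetical placeholder:

    #include <AudioToolbox/AudioToolbox.h>

    // inRefCon is assumed to hold the RemoteIO unit itself.
    static OSStatus RecordPlayCallback(void *inRefCon,
                                       AudioUnitRenderActionFlags *ioActionFlags,
                                       const AudioTimeStamp *inTimeStamp,
                                       UInt32 inBusNumber,
                                       UInt32 inNumberFrames,
                                       AudioBufferList *ioData) {
        AudioUnit ioUnit = (AudioUnit)inRefCon;

        // Bus 1 is the microphone on RemoteIO; render input into ioData.
        OSStatus status = AudioUnitRender(ioUnit, ioActionFlags, inTimeStamp,
                                          1, inNumberFrames, ioData);
        if (status != noErr) return status;

        // Apply custom C filters in place here (hypothetical step);
        // leaving ioData untouched gives plain record-and-play.
        return noErr;
    }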
1
vote
2 answers

iOS: iLBC codec using Audio Units

Right now I am getting PCM audio in my Audio Unit render proc, which writes the incoming buffer data into a circular buffer to be used somewhere else. Now I would like to get iLBC audio data, so I changed the AudioStreamBasicDescription mFormatID to…
DeveloBär
  • 673
  • 8
  • 20
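
For reference, a sketch of describing iLBC with an ASBD: set the identifying fields and let Core Audio fill in the packet layout. Note that the RemoteIO unit itself only accepts linear PCM, so an AudioConverter has to sit between the render proc and the iLBC side:

    #include <AudioToolbox/AudioToolbox.h>

    AudioStreamBasicDescription MakeiLBCFormat(void) {
        AudioStreamBasicDescription ilbc = {0};
        ilbc.mFormatID         = kAudioFormatiLBC;
        ilbc.mSampleRate       = 8000.0;  // iLBC is an 8 kHz codec
        ilbc.mChannelsPerFrame = 1;

        // Ask Core Audio to fill in frame/packet sizes for the codec.
        UInt32 size = sizeof(ilbc);
        AudioFormatGetProperty(kAudioFormatProperty_FormatInfo,
                               0, NULL, &size, &ilbc);
        return ilbc;
    }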
1
vote
2 answers

AudioUnit + Opus codec = crackle issue

I am creating a VoIP app for iOS in Objective-C. Currently I am trying to build the audio part: recording the audio data from the microphone, encoding it with Opus, decoding it, and then playing it back. For recording and playback I use an AudioUnit. Also I made a…
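
A common cause of crackle in this setup is feeding the encoder partial frames: Opus encodes fixed-size frames (e.g. 960 samples for 20 ms at 48 kHz), while an AudioUnit callback delivers whatever slice size the hardware chose. A sketch of buffering to whole frames, assuming mono 16-bit input and libopus; EncodeWhenReady and FRAME_SAMPLES are illustrative names:

    #include <opus/opus.h>
    #include <string.h>

    #define FRAME_SAMPLES 960   // 20 ms at 48 kHz, mono

    static opus_int16 fifo[FRAME_SAMPLES * 4];
    static int fifoCount = 0;

    // Feed callback samples in; returns encoded bytes once a full frame
    // is available, 0 otherwise. (Illustrative: no bounds checks, not
    // thread-safe.)
    int EncodeWhenReady(OpusEncoder *enc, const opus_int16 *in, int n,
                        unsigned char *packet, opus_int32 maxBytes) {
        memcpy(fifo + fifoCount, in, n * sizeof(opus_int16));
        fifoCount += n;
        if (fifoCount < FRAME_SAMPLES)
            return 0;                        // not enough for a frame yet
        int bytes = opus_encode(enc, fifo, FRAME_SAMPLES, packet, maxBytes);
        fifoCount -= FRAME_SAMPLES;
        memmove(fifo, fifo + FRAME_SAMPLES, fifoCount * sizeof(opus_int16));
        return bytes;
    }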
1
vote
1 answer

Using AudioBufferList with Swift once again

Referring to Using AudioBufferList with Swift, I found the following solution here on Stack Overflow for playing some sound with an audio unit. My problem here is that I'm not able to put actual data in my buffer, like sine-wave data. I tried it with…
easysaesch
  • 159
  • 1
  • 14
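
The question is in Swift, but the underlying task is the same in any language: write Float32 samples into each mData pointer of the AudioBufferList. A C sketch of a render callback generating a 440 Hz sine, assuming non-interleaved Float32 at 44.1 kHz:

    #include <AudioToolbox/AudioToolbox.h>
    #include <math.h>

    static double gPhase = 0.0;

    static OSStatus RenderSine(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData) {
        const double step = 2.0 * M_PI * 440.0 / 44100.0;
        for (UInt32 b = 0; b < ioData->mNumberBuffers; b++) {
            Float32 *out = (Float32 *)ioData->mBuffers[b].mData;
            double phase = gPhase;  // same phase for every channel
            for (UInt32 i = 0; i < inNumberFrames; i++) {
                out[i] = (Float32)sin(phase);
                phase += step;
            }
        }
        gPhase = fmod(gPhase + inNumberFrames * step, 2.0 * M_PI);
        return noErr;
    }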
1
vote
1 answer

How to record, modify and playback simultaneously?

I'm working on an iOS app. The app is meant to simulate a hearing aid: it should first record the sound, then modify it (e.g. filtering or spectral enhancement), and finally play it back. The whole process should happen in real time,…
Sie Kensou
  • 65
  • 6
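
The standard single-unit approach is a RemoteIO unit with input enabled, so one render callback can pull microphone samples, modify them, and write them straight to the output. A sketch of the I/O-enabling step (input is off by default on bus 1; output is on by default on bus 0):

    #include <AudioToolbox/AudioToolbox.h>

    OSStatus EnableMicAndSpeaker(AudioUnit ioUnit) {
        UInt32 one = 1;
        // Bus 1, input scope: the microphone side of RemoteIO.
        OSStatus status = AudioUnitSetProperty(ioUnit,
            kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input, 1,
            &one, sizeof(one));
        if (status != noErr) return status;
        // Bus 0, output scope: the speaker side (on by default; set anyway).
        return AudioUnitSetProperty(ioUnit,
            kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, 0,
            &one, sizeof(one));
    }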
1
vote
2 answers

iOS: Changing sample rate dynamically in Audio Unit

Is it possible to change/set the sample rate in the middle of a running AudioSession/AudioUnit without stopping and restarting it (just like the audio route)? I have an active audio session whose sample rate is 44.1…
Partho Biswas
  • 2,290
  • 1
  • 24
  • 39
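
In practice an I/O unit's stream format cannot change while the unit is initialized, so "dynamic" usually means a brief stop/reconfigure/restart cycle. A hedged sketch of that cycle:

    #include <AudioToolbox/AudioToolbox.h>

    OSStatus ChangeSampleRate(AudioUnit ioUnit, Float64 newRate) {
        AudioOutputUnitStop(ioUnit);
        AudioUnitUninitialize(ioUnit);

        // Fetch the client format on the input bus's output scope (the
        // format mic data is delivered in) and change only its rate.
        AudioStreamBasicDescription fmt;
        UInt32 size = sizeof(fmt);
        AudioUnitGetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                             kAudioUnitScope_Output, 1, &fmt, &size);
        fmt.mSampleRate = newRate;
        AudioUnitSetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                             kAudioUnitScope_Output, 1, &fmt, sizeof(fmt));

        AudioUnitInitialize(ioUnit);
        return AudioOutputUnitStart(ioUnit);
    }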
1
vote
0 answers

CoreAudio: Creating kAudioUnitSubType_Reverb2 presets

I have a basic workflow for an iOS app in which .aupreset files are created for various AudioUnits, such as AUSampler, delay, etc., and are then loaded in my iOS app. I need a reverb effect and see that the reverb available for the desktop (which AULab…
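
For what it's worth, an .aupreset is just a property list, and loading one into any audio unit (Reverb2 included) goes through kAudioUnitProperty_ClassInfo. A sketch:

    #include <AudioToolbox/AudioToolbox.h>

    OSStatus LoadPreset(AudioUnit unit, CFURLRef presetURL) {
        CFDataRef data = NULL;
        SInt32 err = 0;
        // Deprecated but still common in Core Audio sample code.
        if (!CFURLCreateDataAndPropertiesFromResource(kCFAllocatorDefault,
                presetURL, &data, NULL, NULL, &err))
            return err;

        CFPropertyListRef plist = CFPropertyListCreateWithData(
            kCFAllocatorDefault, data, kCFPropertyListImmutable, NULL, NULL);
        CFRelease(data);
        if (!plist) return -1;

        // Hand the preset dictionary to the unit as its class info.
        OSStatus status = AudioUnitSetProperty(unit,
            kAudioUnitProperty_ClassInfo, kAudioUnitScope_Global, 0,
            &plist, sizeof(plist));
        CFRelease(plist);
        return status;
    }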
1
vote
1 answer

VoIP limiting the number of frames in rendercallback

I am currently developing a VoIP application, and one of the libraries I am using requires me to send the frames in the input callback. The requirement is that I must send a sample count, which is defined as the number of samples in a frame. This…
cvu
  • 482
  • 2
  • 6
  • 20
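
You can't force an exact callback size, but you can request one via the session's preferred I/O buffer duration and then cope with whatever the hardware actually grants. A sketch with the C session API:

    #include <AudioToolbox/AudioToolbox.h>

    void RequestCallbackSize(void) {
        // e.g. 0.02 s at a 16 kHz session rate asks for ~320-frame
        // slices; the hardware may round, so the render callback still
        // has to tolerate other sizes (hence the usual ring buffer).
        Float32 duration = 0.02f;
        AudioSessionSetProperty(
            kAudioSessionProperty_PreferredHardwareIOBufferDuration,
            sizeof(duration), &duration);
    }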
1
vote
3 answers

How to play sound only in one ear i.e. left or right at a time using AudioUnit

AudioComponentDescription defaultOutputDescription;
defaultOutputDescription.componentType = kAudioUnitType_Output;
defaultOutputDescription.componentSubType = kAudioUnitSubType_RemoteIO;
defaultOutputDescription.componentManufacturer =…
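
Given the RemoteIO setup the excerpt begins, one simple approach is to zero the unwanted channel's buffer inside the render callback (assuming stereo, non-interleaved output); panning via kMultiChannelMixerParam_Pan on a mixer unit is the alternative. A sketch:

    #include <AudioToolbox/AudioToolbox.h>
    #include <string.h>

    // channel 0 = left, 1 = right for a non-interleaved stereo list.
    static void MuteChannel(AudioBufferList *ioData, UInt32 channel) {
        if (channel < ioData->mNumberBuffers)
            memset(ioData->mBuffers[channel].mData, 0,
                   ioData->mBuffers[channel].mDataByteSize);
    }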
1
vote
1 answer

Unable to Save Performance Parameters in AUSampler

I'm trying to connect a performance parameter to control the amplifier gain of an AUSampler in AU Lab, but I'm unable to save the parameter. When I click over to another tab I get a message that says: You have a partially created performance parameter.…
Youngin
  • 261
  • 4
  • 14
1
vote
1 answer

How to get an AudioUnit property value in Xamarin.iOS

I'm working on an audio debug feature and trying to get the AudioUnit.AudioUnitPropertyIDType.Latency property value of my audio unit using Xamarin.iOS. Unfortunately I don't see a related method to retrieve the property value, like…
Mando
  • 11,414
  • 17
  • 86
  • 167
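
The value is reachable through the plain C API that Xamarin.iOS binds; if the managed wrapper lacks a convenience method, P/Invoking the underlying call (sketched here in C) returns the latency as a Float64 in seconds:

    #include <AudioToolbox/AudioToolbox.h>

    Float64 GetLatencySeconds(AudioUnit unit) {
        Float64 latency = 0;
        UInt32 size = sizeof(latency);
        // kAudioUnitProperty_Latency lives on the global scope.
        AudioUnitGetProperty(unit, kAudioUnitProperty_Latency,
                             kAudioUnitScope_Global, 0, &latency, &size);
        return latency;
    }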
1
vote
1 answer

Core Audio Swift Equalizer adjusts all bands at once?

I am having trouble setting up a kAudioUnitSubType_NBandEQ in Swift. Here is my code to initialize the EQ: var cd:AudioComponentDescription = AudioComponentDescription(componentType: OSType(kAudioUnitType_Effect),componentSubType:…
Paul Lehn
  • 3,202
  • 1
  • 24
  • 29
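
A frequent cause of this symptom is addressing every band with the same parameter ID: NBandEQ's per-band parameter IDs are offset by the band index. A C sketch (the same constants are available from Swift):

    #include <AudioToolbox/AudioToolbox.h>

    OSStatus SetBandGain(AudioUnit eq, UInt32 band, Float32 gainDB) {
        // kAUNBandEQParam_Gain is the ID for band 0; band N's gain
        // parameter is kAUNBandEQParam_Gain + N.
        return AudioUnitSetParameter(eq, kAUNBandEQParam_Gain + band,
                                     kAudioUnitScope_Global, 0, gainDB, 0);
    }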
1
vote
1 answer

iOS audio system. Start & stop or just start?

I have an app where audio recording is the main and most important part. However, the user can switch to a table view controller where all recordings are displayed and no recording is performed. The question is which approach is better: "start & stop…
galarius
  • 108
  • 9
1
vote
1 answer

Mac OS X: Audio frequency shift by change of sample rate?

I want to change the frequency of a voice recording by changing the sample rate on Mac OS X. This is a research project aimed at people who stutter. It's essential that the latency is very low – this is, for instance, why I'm not considering Fast…
m-ga
  • 11
  • 3
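
On OS X the built-in AUVarispeed unit does exactly this with low latency: it resamples playback so pitch and speed shift together. A sketch of setting the rate, assuming varispeedUnit is an instantiated kAudioUnitSubType_Varispeed unit already wired into a chain:

    #include <AudioToolbox/AudioToolbox.h>

    // rate 2.0 doubles pitch and speed; 0.5 halves them.
    void ShiftBySampleRate(AudioUnit varispeedUnit, Float32 rate) {
        AudioUnitSetParameter(varispeedUnit, kVarispeedParam_PlaybackRate,
                              kAudioUnitScope_Global, 0, rate, 0);
    }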