Questions tagged [audiounit]

Audio Units are plug-ins for Apple's CoreAudio framework which generate or process audio streams.

Audio Units (commonly referred to as simply "AU") are plug-ins for Apple's CoreAudio system which are capable of generating or processing audio streams. Although Audio Units were originally created to compete with Steinberg's VST SDK, they have evolved in quite different directions.

Currently, macOS and iOS are the only platforms which support the Audio Unit SDK. Both ship with a number of built-in Audio Units for basic audio stream operations, and it is also possible to write custom Audio Units on both platforms.
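To make the generate/process idea concrete: the heart of a processing Audio Unit is a render function that fills an output buffer from an input buffer. A minimal sketch in plain C (the real API uses the AURenderCallback signature and AudioBufferList structures; the function name and shape here are illustrative only):

```c
#include <stddef.h>

/* Illustrative sketch of the work an effect Audio Unit does per render
 * cycle: read input samples, transform them, write output. A real unit
 * receives these buffers through an AudioBufferList. */
static void process_gain(const float *in, float *out, size_t frames, float gain) {
    for (size_t i = 0; i < frames; ++i)
        out[i] = in[i] * gain;   /* the "processing" step, here just gain */
}
```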


752 questions
0 votes, 1 answer

can I use AVAudioPlayer to make an equalizer player?

I want to make an equalizer for a music player which can apply EQ settings such as bass and treble, and I want to change the sound by setting the frequency: 250 Hz, 1000 Hz, 16000 Hz. (void)setEQ:(@"250Hz"); (void)setEQ:(@"1000Hz…
arrfu
0 votes, 0 answers

Trouble using AUViewController in Objective-C

I'm creating an iOS Audio Unit Extension, based on Apple's FilterDemo example (there is no Xcode project template yet). The example project is written in Swift, but I'm trying to replace its Swift ViewController class with an Objective-C…
Bram Bos
0 votes, 0 answers

AudioUnit generating noise with 8000 sample rate in Xamarin.iOS (MonoTouch)

I am using the AudioUnit class for recording and playback. During recording I can hear the sound. The problem is that a sample rate of 44100 works fine, but a sample rate of 8000 generates noise. After recording with 8000 sample…
Ashish Kumar
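One plausible source of the noise described above: 44100 / 8000 = 5.5125, a non-integer ratio, so any path that naively drops every Nth sample will mis-track the stream. Tracking the fractional read position with linear interpolation is the minimum fix, sketched below (illustrative names; a real converter also needs an anti-aliasing low-pass before downsampling):

```c
#include <stddef.h>

/* Fractional-position resampler: ratio = input_rate / output_rate,
 * e.g. 44100.0 / 8000.0 = 5.5125. Returns the number of output
 * samples produced. */
size_t resample_linear(const float *in, size_t in_len,
                       float *out, size_t out_cap, double ratio) {
    size_t n = 0;
    for (; n < out_cap; ++n) {
        double pos  = n * ratio;           /* exact read position */
        size_t i    = (size_t)pos;
        if (i + 1 >= in_len) break;        /* need two points to interpolate */
        double frac = pos - (double)i;
        out[n] = (float)((1.0 - frac) * in[i] + frac * in[i + 1]);
    }
    return n;
}
```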
0 votes, 1 answer

AudioUnit V3: determine sample rate in render callback

How can one determine the sample rate in the render callback? It appears I could override shouldChangeToFormat(_:forBus:), but it seems that such a "should" function shouldn't be used to query state. And each bus has its own render format, but…
Taylor
0 votes, 1 answer

BAD_ACCESS on RemoteIO callback only when headphone jack plugged in

I have this following render callback for a remoteIO audio unit. Simply accessing the 0th element of the ioData parameter results in a crash. Very simply put, this works with no headphone jack connection but as soon as I plug a jack into my…
Alex Bollbach
0 votes, 1 answer

Accessing BPM and time signature of Plug-In Host

I am working on an audio plugin and would like to map LFOs to various parameters. How does the plug-in access the DAW's BPM value and time signature? Does the host need to expose this through VST or AU protocols or how should a plug-in access…
some_id
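For AU v2 plug-ins, hosts typically expose tempo through kAudioUnitProperty_HostCallbacks (a HostCallbackInfo struct whose callbacks report the current beat, tempo, and musical time); VST has its own time-info mechanism. Once the BPM is known, mapping it to sample time for a tempo-synced LFO is plain arithmetic, as in this sketch (names are illustrative):

```c
/* Samples per quarter-note beat at a given tempo. */
double samples_per_beat(double sample_rate, double bpm) {
    return sample_rate * 60.0 / bpm;
}

/* Per-sample phase increment for an LFO that cycles once per beat;
 * accumulate this each render frame and wrap at 1.0. */
double lfo_phase_inc(double sample_rate, double bpm) {
    return 1.0 / samples_per_beat(sample_rate, bpm);
}
```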
0 votes, 1 answer

Changing setPreferredIOBufferDuration at Runtime results in Core Audio Error -50

I am writing an Audio Unit (Remote IO) based app that displays waveforms at a given buffer size. The app initially starts off with a preferred buffer duration of 0.0001, which results in very small buffer frame sizes (I think it's 14 frames). Then at…
0 votes, 2 answers

AudioUnit (Mac) AudioUnitRender internal buffer clash

I recently designed a Sound recorder on a mac using AudioUnits. It was designed to behave like a video security system, recording continuously, with a graphics display of power levels for playback browsing. I've noticed that every 85 minutes…
kirkgcm
0 votes, 1 answer

Core Audio - Remote IO confusion

I am having trouble interpreting the behavior of the Remote IO audio unit callbacks in iOS. I am setting up a Remote IO unit with two callbacks, one as an input callback and one as a "render" callback. I am following a very similar remoteIO setup as…
Alex Bollbach
0 votes, 0 answers

iOS - Generate tone at specific frequency and volume

I have to make an ear-testing app like an audiogram. I have to calibrate the headphones at a specific frequency and volume, and generate a beep tone at a specific frequency and volume whose output level matches the headphone calibration. Could you please…
0 votes, 0 answers

EXC_BAD_ACCESS in Core Audio - writing mic data to file w/ Extended AudioFile Services

I am attempting to write incoming mic audio to a file. Because the audio samples are delivered 4096 frames (the set frame rate for my project) at a time in a time-critical callback I cannot simply write the bytes to a file with AudioFileWriteBytes.…
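The standard workaround for file I/O in a time-critical callback is to have the callback only copy samples into a ring buffer, while a lower-priority thread drains it and calls the Extended Audio File write functions. A minimal single-producer/single-consumer sketch (illustrative; real cross-thread use needs atomic head/tail indices):

```c
#include <stddef.h>

#define RB_SIZE 8192            /* a power of two simplifies wrapping */

typedef struct {
    float  data[RB_SIZE];
    size_t head;                /* total samples written */
    size_t tail;                /* total samples read */
} RingBuffer;

/* Called from the render callback: copy only, no allocation or I/O. */
size_t rb_write(RingBuffer *rb, const float *src, size_t n) {
    size_t written = 0;
    while (written < n && rb->head - rb->tail < RB_SIZE) {
        rb->data[rb->head % RB_SIZE] = src[written++];
        rb->head++;
    }
    return written;
}

/* Called from the writer thread, which then does the actual file I/O. */
size_t rb_read(RingBuffer *rb, float *dst, size_t n) {
    size_t got = 0;
    while (got < n && rb->tail < rb->head) {
        dst[got++] = rb->data[rb->tail % RB_SIZE];
        rb->tail++;
    }
    return got;
}
```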
0 votes, 2 answers

What Framework and Header file contain the constant for kAudioUnitSubType_RemoteIO

I keep seeing samples that use kAudioUnitSubType_RemoteIO for the Apple audio unit api. I am wondering however what framework/header file contains this constant.
benstpierre
0 votes, 1 answer

AVAudioEngine offline render: Silent output only when headphones connected

I've been working on an app that makes an audio pipeline through AVAudioEngine and then renders to a file. I've been using this code example's approach, adapted for my own needs. The problem is that if headphones are connected to the device, the…
skattyadz
0 votes, 1 answer

Scheduled audio unit parameter events are delayed

I have a music sequencing app for iOS that uses AUSamplers and effects audio units. The main playback loop is implemented using a render notify callback to send note on & off events to the samplers. I have notes successfully playing on beat by…
Jayson
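One common cause of late parameter events: AudioUnitSetParameter takes a final buffer-offset-in-frames argument, and passing 0 lands the event at the start of the next render buffer, i.e. up to one buffer late. Given an absolute target sample time, the in-buffer offset can be computed as in this sketch (illustrative names):

```c
/* Where a scheduled event falls relative to the current render buffer. */
typedef struct {
    long offset;          /* frames from the start of this buffer */
    int  in_this_buffer;  /* 1 if the event belongs to this render cycle */
} EventSlot;

EventSlot event_offset(long event_sample, long buf_start, long buf_frames) {
    EventSlot s = {0, 0};
    if (event_sample >= buf_start && event_sample < buf_start + buf_frames) {
        s.offset = event_sample - buf_start;
        s.in_this_buffer = 1;
    }
    return s;
}
```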
0 votes, 1 answer

iOS add audio effect before render callback processing?

In my app, I'm doing audio processing in the render callback (input only, no output). Here is how I initialize the audio : -(void) initAudio { OSStatus status; NewAUGraph(&graph); AudioComponentDescription desc; desc.componentType =…
jcr