
I am currently trying to record a video/audio stream using the new iOS 4 AVFoundation library. All is working well when using the preset AVCaptureDeviceInput devices for the mic and camera, but I wish to use my own audio stream instead of one coming directly from the mic.

Basically, my app uses Audio Unit Remote IO to do some processing on the input from the mic, and it is this post-processed buffer I wish to include in my AVCaptureSession rather than the direct feed from the mic.

Does anyone know how to achieve this, or whether it is even possible?

Thanks

Toby
  • I am not familiar with using AudioUnits... Assuming you are using AVAssetWriter to write your movie you could construct a CMSampleBufferRef in your audio render callback. Then you could pass this to the delegate used to write to AVAssetWriter. You need to keep in mind the timestamps to keep the audio in sync with the video. This should also occur in real time, or very very close to it. There may be an easier way depending on your processing needs. You could always just process the mic audio directly from the CMSampleBufferRef. – Steve McFarlin Mar 08 '11 at 19:48
  • I was using AVCaptureFileOutput, but now I see AVAssetWriter gives you much more control over the sample buffers. You can also convert between a Core Audio AudioBufferList and a CMSampleBufferRef using CMSampleBufferSetDataBufferFromAudioBufferList, so I will give that a try (see the sketch after these comments). – Toby Mar 09 '11 at 09:31
  • Did you have any luck using the AVAssetWriter? Do you have any sample code? All I need to do is capture Audio that is being played by AVAudioPlayer. – jangelo42 Feb 06 '13 at 07:47
  • I'm afraid I shelved this project for the time being; I couldn't work it out. – Toby Mar 30 '13 at 08:10
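
For anyone landing here later, below is a minimal sketch of the route discussed in the comments: wrapping the post-processed AudioBufferList from a Remote IO render callback in a CMSampleBufferRef and appending it to an AVAssetWriterInput. This is an assumption-laden outline, not a confirmed working solution; the names `SetUpFormatDescription`, `AppendProcessedAudio`, `audioWriterInput`, and `presentationTime` are hypothetical, and error handling is omitted.

    #import <AVFoundation/AVFoundation.h>
    #import <CoreMedia/CoreMedia.h>
    #import <AudioToolbox/AudioToolbox.h>

    static CMAudioFormatDescriptionRef formatDescription = NULL;

    // Call once, with the AudioStreamBasicDescription your Remote IO unit outputs.
    static void SetUpFormatDescription(const AudioStreamBasicDescription *asbd)
    {
        CMAudioFormatDescriptionCreate(kCFAllocatorDefault, asbd,
                                       0, NULL,   // no channel layout
                                       0, NULL,   // no magic cookie
                                       NULL,      // no extensions
                                       &formatDescription);
    }

    // Call with the processed buffers from (or just after) the render callback.
    static void AppendProcessedAudio(AVAssetWriterInput *audioWriterInput,
                                     const AudioBufferList *bufferList,
                                     CMItemCount numFrames,
                                     CMTime presentationTime,
                                     Float64 sampleRate)
    {
        if (![audioWriterInput isReadyForMoreMediaData])
            return; // dropping frames is simpler than queueing in a sketch

        // Per-sample timing at the unit's sample rate. Deriving presentationTime
        // from the same clock as the video is what keeps audio and video in sync.
        CMSampleTimingInfo timing = {
            .duration = CMTimeMake(1, (int32_t)sampleRate),
            .presentationTimeStamp = presentationTime,
            .decodeTimeStamp = kCMTimeInvalid
        };

        // Create the sample buffer with no data yet (dataReady = false)...
        CMSampleBufferRef sampleBuffer = NULL;
        CMSampleBufferCreate(kCFAllocatorDefault,
                             NULL, false, NULL, NULL,
                             formatDescription,
                             numFrames, 1, &timing, 0, NULL,
                             &sampleBuffer);

        // ...then copy the AudioBufferList contents in, which marks it ready.
        CMSampleBufferSetDataBufferFromAudioBufferList(sampleBuffer,
                                                       kCFAllocatorDefault,
                                                       kCFAllocatorDefault,
                                                       0,
                                                       bufferList);

        [audioWriterInput appendSampleBuffer:sampleBuffer];
        CFRelease(sampleBuffer);
    }

Note that the render callback runs on a real-time thread, so in practice you would likely copy the buffers out and do the appending elsewhere rather than blocking inside the callback, as Steve's comment about "real time, or very very close to it" hints.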

0 Answers