
I am working on loading a (local) movie into AVPlayer and applying processing to the audio track with an audioTapProcessor. So far I've found great GitHub examples here, here, and here. I'm using the "tap cookie" approach used in the last link and in an answer to this previous question.

Audio and video playback are working fine. However, my tapPrepare and tapProcess callbacks are never called, although tapInit and tapFinalize are, so I'm doing something both right and wrong. The relevant code is attached -- any help is appreciated!

import Foundation
import AVFoundation
import AudioToolbox
import MediaToolbox
import CoreAudioTypes

class PlayerViewController: UIViewController {
    
    class TapCookie {
        weak var content: PlayerViewController?
        deinit {
            print("TapCookie deinit")   // appears after tapFinalize
        }
    }
    // MARK: Properties
    var playerAsset: AVURLAsset?
    var playerItem: AVPlayerItem! = nil
    var audioProcessingFormat: AudioStreamBasicDescription?
    private var tracksObserver: NSKeyValueObservation?
    
    // MARK: Button to trigger actions
    @IBAction func selectVideo(_ sender: Any) {
        // starts doing stuff:
        // - select a video file from device, extract movieURL string ...
        playerAsset = AVURLAsset(url: movieURL)
        playerItem = AVPlayerItem(url: movieURL)
        //... then send asset to AVPlayer (not shown)
        
        // set up audioProcessingTap
        tracksObserver = playerItem.observe(\AVPlayerItem.tracks, options: [.initial, .new]) {
            [unowned self] item, change in
            installTap(playerItem: playerItem)
        }
    }
    func installTap(playerItem: AVPlayerItem) {
        let cookie = TapCookie()
        cookie.content = self
                
        var callbacks = MTAudioProcessingTapCallbacks(
            version: kMTAudioProcessingTapCallbacksVersion_0,
            clientInfo: UnsafeMutableRawPointer(Unmanaged.passRetained(cookie).toOpaque()),
            init: tapInit,
            finalize: tapFinalize,
            prepare: tapPrepare,
            unprepare: tapUnprepare,
            process: tapProcess)
        
        var tap: Unmanaged<MTAudioProcessingTap>?
        let err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap)
        assert(noErr == err)
        // tapInit successfully called after MTAudioProcessingTapCreate
                
        let audioMix = AVMutableAudioMix()
        let audioTrack = playerItem.asset.tracks(withMediaType: AVMediaType.audio).first! //use first audio track
        let inputParams = AVMutableAudioMixInputParameters(track: audioTrack)
        
        inputParams.audioTapProcessor = tap?.takeRetainedValue()
        audioMix.inputParameters = [inputParams]
        playerItem.audioMix = audioMix
    }
    // MARK: install tap callbacks

    let tapInit: MTAudioProcessingTapInitCallback = {
        (tap, clientInfo, tapStorageOut) in
        
        tapStorageOut.pointee = clientInfo
        print("tapInit tap: \(tap)\n clientInfo: \(String(describing: clientInfo))\n tapStorageOut: \(tapStorageOut)\n")
    }
    // tapPrepare not called !!
    let tapPrepare: MTAudioProcessingTapPrepareCallback = {
        (tap, maxFrames, processingFormat) in
        print("tapPrepare tap: \(tap), maxFrames: \(maxFrames)\n processingFormat: \(processingFormat)")
        
        let cookie = Unmanaged<TapCookie>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).takeUnretainedValue()
        
        cookie.content!.audioProcessingFormat = AudioStreamBasicDescription(mSampleRate: processingFormat.pointee.mSampleRate,
                                                                mFormatID: processingFormat.pointee.mFormatID,
                                                                   mFormatFlags: processingFormat.pointee.mFormatFlags,
                                                                   mBytesPerPacket: processingFormat.pointee.mBytesPerPacket,
                                                                   mFramesPerPacket: processingFormat.pointee.mFramesPerPacket,
                                                                   mBytesPerFrame: processingFormat.pointee.mBytesPerFrame,
                                                                   mChannelsPerFrame: processingFormat.pointee.mChannelsPerFrame,
                                                                   mBitsPerChannel: processingFormat.pointee.mBitsPerChannel,
                                                                   mReserved: processingFormat.pointee.mReserved)
    }
    
    let tapUnprepare: MTAudioProcessingTapUnprepareCallback = {
        (tap) in
        print("tapUnprepare \(tap)")

    }
    
    // tapProcess not called !!
    let tapProcess: MTAudioProcessingTapProcessCallback = {
        (tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut) in
        print("tapProcess \(tap)\n \(numberFrames)\n \(flags)\n \(bufferListInOut)\n \(numberFramesOut)\n \(flagsOut)\n")
        
        let status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut)
        if noErr != status {
            print("get audio: \(status)")
        }
        
        let cookie = Unmanaged<TapCookie>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).takeUnretainedValue()
        guard let cookieContent = cookie.content else {
            print("Tap callback: cookie content was deallocated!")
            return
        }
            // process audio here...
    }
    
    let tapFinalize: MTAudioProcessingTapFinalizeCallback = {
        (tap) in
        print("tapFinalize \(tap)")
        // release cookie
        Unmanaged<TapCookie>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).release()
    }
}
eefanatic

1 Answer


You need to create an AVPlayer

player = AVPlayer(playerItem: playerItem)

and then at some point start it playing:

player.play()

Then the prepare and process callbacks will be called.
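
Putting it together, here is a minimal sketch of the wiring inside the view controller. The startPlayback helper and the player/playerLayer properties are illustrative, not from the question; the AVPlayerLayer hosting matches what the asker describes in the comments below.

var player: AVPlayer!
var playerLayer: AVPlayerLayer!

func startPlayback(playerItem: AVPlayerItem) {
    // Create the player from the same AVPlayerItem whose audioMix carries the tap.
    player = AVPlayer(playerItem: playerItem)

    // Host the video in an AVPlayerLayer (any hosting approach works).
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = view.bounds
    view.layer.addSublayer(playerLayer)

    // tapPrepare and tapProcess are only invoked once playback actually starts.
    player.play()
}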

Rhythmic Fistman
  • Thanks -- I do indeed have an AVPlayer and use its .play() function, but I don't see prepare and process executing. As I mentioned, the video playback works fine, so I didn't include the actual AVPlayer code because I figured it would be intuitive. Sorry for the confusion! – eefanatic Apr 07 '22 at 16:48
  • And just for more context, I'm hosting the AVPlayer in a UIView/AVPlayerLayer, and I added a play/pause button that changes state based on AVPlayer.timeControlStatus observation. – eefanatic Apr 07 '22 at 16:55
  • Do the callbacks get called in the simulator? – Rhythmic Fistman Apr 07 '22 at 17:51
  • I didn't see them in the simulator at first, but then I made these changes: removed the top-level `var playerItem: AVPlayerItem` and all references to it; removed the tracks observer for `AVPlayerItem.tracks` (I don't think it's needed, but please correct me if I'm wrong); and passed `AVPlayer.currentItem` (which is an AVPlayerItem) into `installTap()`. Now I am seeing Prepare and Process happening (see the sketch after these comments). Is it expected that tapProcess will be executed many, many times continuously? – eefanatic Apr 07 '22 at 19:41
  • What global playerItem? There is no mention of that in the question. – Rhythmic Fistman Apr 07 '22 at 19:43
  • referring to the top-level `var playerItem` I was using to store the AVPlayerItem info. But it looks like it was redundant since I'm setting up the AVPlayer with an AVURLAsset, and I can access the PlayerItem from inside the AVPlayer (with currentItem property) – eefanatic Apr 07 '22 at 19:56
  • Pick either the player item or that asset, not both – Rhythmic Fistman Apr 07 '22 at 20:09
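
Below is a sketch of the arrangement described in the comments above (illustrative only; movieURL and installTap(playerItem:) are the ones from the question): let AVPlayer own the item, and install the tap on player.currentItem rather than on a separately stored AVPlayerItem.

let playerAsset = AVURLAsset(url: movieURL)
let player = AVPlayer(playerItem: AVPlayerItem(asset: playerAsset))

// Install the tap on the item the player actually plays.
if let currentItem = player.currentItem {
    installTap(playerItem: currentItem)
}

player.play()

// Note: tapProcess runs once per audio buffer for the duration of playback,
// so seeing it fire many times continuously is expected.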