
Edit 2: When I don't pause the audio engine and instead pause only the player node, then resume the player node, there is no UI lag at all: playback is smooth with no crackling. But I have to pause the audio engine for the MPRemoteCommandCenter playback controls to update, so I am somewhat stuck.
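
For reference, this is roughly the variant I mean; `pauseWithoutStoppingEngine` is just an illustrative name and is not part of the sample code below:

func pauseWithoutStoppingEngine() {
    // Pause only the AVAudioPlayerNode; the engine (and the audio hardware) keeps running.
    player.pause()
    displayLink?.isPaused = true
    // engine.pause() is deliberately omitted here.
}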

Edit: The delay is much more apparent when using Bluetooth headphones such as AirPods.

I have attached the entire code below. It is a very basic app that reproduces the lag issue I am having in my bigger app. You can copy the entire code below into a SwiftUI project and it will run.

I am unable to find the source of the UI lag. Pressing the pause button is perfectly smooth. But when I pause, wait a few seconds (especially 5 seconds or longer), and then press play, I can see that it takes some time for the player node to resume.

Furthermore, I could hear some crackling sounds when I connect more audio units such as reverb and pitch.
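
In the bigger app, the effect chain looks roughly like this (the `pitchNode` and `reverb` properties are declared in `AudioManager` below but are left unconnected in this minimal sample, so the exact chain there may differ slightly):

engine.attach(player)
engine.attach(pitchNode)
engine.attach(reverb)
// player -> time pitch -> reverb -> main mixer
engine.connect(player, to: pitchNode, format: file?.processingFormat)
engine.connect(pitchNode, to: reverb, format: file?.processingFormat)
engine.connect(reverb, to: engine.mainMixerNode, format: file?.processingFormat)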

Could someone explain what is causing the UI lag and the crackling?

I profiled the code below, and quite a number of micro hangs were registered. My bigger app registered much longer hangs.

import SwiftUI
import AVFoundation
import MediaPlayer

struct PlayerButtonStyle: ButtonStyle {
    func makeBody(configuration: Configuration) -> some View {
        configuration.label
            .font(.system(size: 35))
            .foregroundColor(.white)
            .padding(15)
            .background(configuration.isPressed ? .gray.opacity(0.3) : .clear)
            .clipShape(Circle())
            .scaleEffect(configuration.isPressed ? 0.8 : 1.0)
    }
}

struct ContentView: View {
    @State private var isPlaying = false
    
    var body: some View {
        VStack(spacing: 50) {
            
            Button {
                withAnimation {
                    if isPlaying {
                        AudioManager.shared.pause()
                        isPlaying = false
                    } else {
                        AudioManager.shared.play()
                        isPlaying = true
                    }
                }
            } label: {
                if isPlaying {
                    Image(systemName: "pause.fill")
                } else {
                    Image(systemName: "play.fill")
                }
            }
            .buttonStyle(PlayerButtonStyle())

            Scrubber()
        }
        .padding()
        // Inject the view model so Scrubber's @EnvironmentObject resolves.
        .environmentObject(AudioManager.shared.viewModel)
    }
}

struct Scrubber: View {
    @EnvironmentObject var viewModel: ViewModel
    @State private var sliderValue = 0.0
    
    var body: some View {
        Slider(value: $sliderValue, in: 0...(AudioManager.shared.file?.duration ?? 0))
            .onReceive(viewModel.$sliderValue) { _ in
                sliderValue = viewModel.sliderValue
            }
    }
}

class ViewModel: ObservableObject {
    @Published var sliderValue = 0.0
}

extension AVAudioFile {
    //Duration
    var duration: TimeInterval {
        let sampleRateSong = Double(processingFormat.sampleRate)
        let lengthSongSeconds = Double(length) / sampleRateSong
        return lengthSongSeconds
    }
}

extension AVAudioPlayerNode {
    //Current Time
    var currentTime: TimeInterval {
        if let nodeTime = lastRenderTime, let playerTime = playerTime(forNodeTime: nodeTime) {
            return Double(playerTime.sampleTime) / outputFormat(forBus: 0).sampleRate
        }
        return 0
    }
}

class AudioManager {
    static let shared = AudioManager()
    
    var viewModel = ViewModel()
    
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let pitchNode = AVAudioUnitTimePitch()
    let reverb = AVAudioUnitReverb()
    
    var file: AVAudioFile?
    var displayLink: CADisplayLink?
    
    init() {
        setupSlider()
        prepareToPlay()
        enableBackgroundPlay()
        setupNowPlaying()
        setupMediaPlayerControls()
    }
    
    func setupSlider() {
        displayLink = CADisplayLink(target: self, selector: #selector(updateScrubber))
        displayLink?.add(to: .current, forMode: .default)
        displayLink?.preferredFrameRateRange = CAFrameRateRange(minimum: 2, maximum: 3)
        displayLink?.isPaused = true
    }
    
    @objc func updateScrubber() {
        viewModel.sliderValue = player.currentTime
    }
    
    
    func prepareToPlay() {
        let url = Bundle.main.url(forResource: "Stringed Disco", withExtension: "mp3")!
        do {
            file = try AVAudioFile(forReading: url)
            
            engine.attach(player)
            engine.connect(player, to: engine.outputNode, format: file?.processingFormat)

            
            if let file = file {
                player.scheduleFile(file, at: nil, completionCallbackType: .dataPlayedBack) { _ in
                    
                }
            }
            
            try engine.start()
            displayLink?.isPaused = false
            
        } catch {
            print(error.localizedDescription)
        }
    }
    
    func play() {
        do {
            try engine.start()
        } catch {
            print(error.localizedDescription)
        }
        
        player.play()
        displayLink?.isPaused = false
    }
    
    func pause() {
        player.pause()
        engine.pause()
        displayLink?.isPaused = true
    }
    
    func enableBackgroundPlay() {
        let session = AVAudioSession.sharedInstance()
        do {
            try session.setCategory(.playback, mode: .default)
            try session.setActive(true)
        } catch {
            print(error.localizedDescription)
        }
    }
    
    func setupNowPlaying() {
        // Define Now Playing Info
        var nowPlayingInfo = [String : Any]()
        nowPlayingInfo[MPMediaItemPropertyTitle] = "Stringed Disco"
        nowPlayingInfo[MPMediaItemPropertyArtist] = "Kevin MacLeod"

        nowPlayingInfo[MPNowPlayingInfoPropertyElapsedPlaybackTime] = viewModel.sliderValue
        nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = file?.duration
        
        // Set the metadata
        MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
    }
    
    func setupMediaPlayerControls() {
        //Set up player controls
        let commandCentre = MPRemoteCommandCenter.shared()

        commandCentre.playCommand.addTarget { [unowned self] event in
            play()
                        
            return .success
        }
        
        commandCentre.pauseCommand.addTarget { [unowned self] event in
            pause()
            
            return .success
        }
    }
}
  • Have you tried putting `AudioManager.shared.play()` in a background thread? – Ghulam Mustafa Dec 14 '22 at 09:57
  • Yes, I did try that. It makes the UI responsive, but there is still a noticeable delay when resuming the AVAudioEngine. Furthermore, a lot of people discourage using background threads for AVFoundation work, so I'm not sure what to do. – SwiftEnthusiast Dec 14 '22 at 10:11
  • Most of this looks pretty good. I would try getting rid of pausing and restarting the engine. That shuts down the hardware, which you then need to start back up, which is probably what's adding a couple of seconds. (Apple recommends pausing the engine to save battery, and that's correct, but the trade-off is some lagging when you have to power it back up. If it were free, Apple would power down the hardware automatically when it wasn't playing anything.) – Rob Napier Dec 14 '22 at 15:27
  • When you said you attach reverb and pitch, I assume you mean you do that when the engine is stopped, though. You will get distortion if you modify the graph while it's running. (If you want to turn off an audio unit, set its `bypass` to true rather than removing it from the graph.) – Rob Napier Dec 14 '22 at 15:34
  • Thank you very much for your reply. The issue then is that if I don't pause the AVAudioEngine, the MPRemoteCommandCenter controls (specifically play and pause) won't update at all. Apparently I have to pause the AVAudioEngine for them to update correctly, since it seems its output node is what is connected to MPRemoteCommandCenter. – SwiftEnthusiast Dec 14 '22 at 16:08
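
A minimal sketch of the bypass approach mentioned in the comments above, assuming the `reverb` unit from the AudioManager code has already been attached and connected into the running graph:

// Instead of disconnecting the reverb node while the engine is running,
// toggle its bypass flag so the node graph stays intact.
reverb.bypass = true   // effect is skipped without modifying the graph
// ...later...
reverb.bypass = false  // re-enable the effect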
