
I want to analyze the microphone input frequency and then play the correct note nearest to the detected frequency. I did that with AudioKit.

This is working right now, but since I implemented AudioKit to get the frequency feature, the sound that plays after the frequency detection sometimes crackles during playback. This started after I implemented AudioKit; everything was fine before that...

var mic: AKMicrophone!
var tracker: AKFrequencyTracker!
var silence: AKBooster!

    func initFrequencyTracker() {
        AKSettings.channelCount = 2
        AKSettings.audioInputEnabled = true
        AKSettings.defaultToSpeaker = true
        AKSettings.allowAirPlay = true
        AKSettings.useBluetooth = true
        AKSettings.allowHapticsAndSystemSoundsDuringRecording = true
        mic = AKMicrophone()
        tracker = AKFrequencyTracker(mic)
        silence = AKBooster(tracker, gain: 0)
    }

    func deinitFrequencyTracker() {
        AKSettings.audioInputEnabled = false
        plotTimer.invalidate()
        do {
            try AudioKit.stop()
            AudioKit.output = nil
        } catch {
            print(error)
        }
    }

    func initPlotTimer() {
        AudioKit.output = silence
        do {
            try AKSettings.setSession(category: .playAndRecord, with: [.defaultToSpeaker, .allowBluetooth, .allowAirPlay, .allowBluetoothA2DP])
            try AudioKit.start()
        } catch {
            AKLog("AudioKit did not start!")
        }
        setupPlot()
        plotTimer = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: #selector(updatePlotUI), userInfo: nil, repeats: true)
    }

    func setupPlot() {
        let plot = AKNodeOutputPlot(mic, frame: audioInputPlot.bounds)
        plot.translatesAutoresizingMaskIntoConstraints = false
        plot.alpha = 0.3
        plot.plotType = .rolling
        plot.shouldFill = true
        plot.shouldCenterYAxis = false
        plot.shouldMirror = true
        plot.color = UIColor(named: uiFarbe)
        audioInputPlot.addSubview(plot)

        // Pin the AKNodeOutputPlot to the audioInputPlot
        var constraints = [plot.leadingAnchor.constraint(equalTo: audioInputPlot.leadingAnchor)]
        constraints.append(plot.trailingAnchor.constraint(equalTo: audioInputPlot.trailingAnchor))
        constraints.append(plot.topAnchor.constraint(equalTo: audioInputPlot.topAnchor))
        constraints.append(plot.bottomAnchor.constraint(equalTo: audioInputPlot.bottomAnchor))
        constraints.forEach { $0.isActive = true }
    }

    @objc func updatePlotUI() {
        if tracker.amplitude > 0.3 {
            let trackerFrequency = Float(tracker.frequency)

            guard trackerFrequency < 7_000 else {
                // A bit of a hack: modern MacBooks can report super high frequencies
                return
            }



            var frequency = trackerFrequency
            while frequency > Float(noteFrequencies[noteFrequencies.count - 1]) {
                frequency /= 2.0
            }
            while frequency < Float(noteFrequencies[0]) {
                frequency *= 2.0
            }

            var minDistance: Float = 10_000.0
            var index = 0

            for i in 0..<noteFrequencies.count {
                let distance = fabsf(Float(noteFrequencies[i]) - frequency)
                if distance < minDistance {
                    index = i
                    minDistance = distance
                }
                print(minDistance, distance)
            }
            //                let octave = Int(log2f(trackerFrequency / frequency))

            frequencyLabel.text = String(format: "%0.1f", tracker.frequency)

            if frequencyTranspose(note: notesToTanspose[index]) != droneLabel.text {
                momentaneNote = frequencyTranspose(note: notesToTanspose[index])
                droneLabel.text = momentaneNote
                stopSinglePlayer()
                DispatchQueue.main.asyncAfter(deadline: .now() + 0.03, execute: {
                    self.prepareSinglePlayerFirstForStart(note: self.momentaneNote)
                    self.startSinglePlayer()
                })
            }

        }
    }

    func frequencyTranspose(note: String) -> String {
        var indexNote = notesToTanspose.firstIndex(of: note)!
        let chosenInstrument = UserDefaults.standard.object(forKey: "whichInstrument") as! String
        if chosenInstrument == "Bb" {
            if indexNote + 2 >= notesToTanspose.count {
                indexNote -= 12
            }
            return notesToTanspose[indexNote + 2]
        } else if chosenInstrument == "Eb" {
            if indexNote - 3 < 0 {
                indexNote += 12
            }
            return notesToTanspose[indexNote - 3]
        } else {
            return note
        }
    }

1 Answer


It appears that your implementation can be improved by putting the multithreading principles of iOS into practice. I'm not an expert on the subject, but look at the statement: "the sound which plays after the frequency detection cracks sometimes during playback".

Note that the timing of the "crack" is random or unpredictable, and that it happens during computation.

So, move code that doesn't need to run on the main thread to a background thread (https://developer.apple.com/documentation/DISPATCH)
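
As an illustrative sketch (not your exact code), the nearest-note search in `updatePlotUI` could be factored into a pure helper and run on a background queue, hopping back to main only for UI work. Names like `analysisQueue` and `nearestNoteIndex` are assumptions for this example:

```swift
import Foundation

// Hypothetical background queue for the pitch-matching math.
let analysisQueue = DispatchQueue(label: "frequency.analysis", qos: .userInitiated)

// Pure helper: fold a frequency into the note table's octave and find
// the index of the closest note. Safe to call from any thread because
// it touches no shared state.
func nearestNoteIndex(to trackerFrequency: Float, in noteFrequencies: [Float]) -> Int {
    var frequency = trackerFrequency
    while frequency > noteFrequencies[noteFrequencies.count - 1] { frequency /= 2.0 }
    while frequency < noteFrequencies[0] { frequency *= 2.0 }

    var minDistance: Float = .greatestFiniteMagnitude
    var index = 0
    for i in 0..<noteFrequencies.count {
        let distance = abs(noteFrequencies[i] - frequency)
        if distance < minDistance {
            index = i
            minDistance = distance
        }
    }
    return index
}

// In updatePlotUI the heavy part would then become roughly:
// analysisQueue.async {
//     let index = nearestNoteIndex(to: trackerFrequency, in: noteFrequencies)
//     DispatchQueue.main.async {
//         // update frequencyLabel / droneLabel and restart the player here
//     }
// }
```

This also removes the `print` calls from the hot loop, which are themselves a common source of stalls in per-tick code.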


While refactoring, you can test this theory by changing how often the Timer callback fires: reduce the interval to 0.05, for example, so the computation runs more often and the crackles get worse; conversely, if you increase the interval to, say, 0.2, you'll probably hear fewer random crackles.

Now, this is easier said than done when it comes to concurrency, but that's what you need to improve.
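
To confirm that per-tick work on the main thread is the culprit, a hypothetical diagnostic is to time the callback body; if it regularly takes more than a few milliseconds, it is competing with audio and UI deadlines. The `timed` helper below is an assumption, not part of AudioKit:

```swift
import Foundation

// Hypothetical helper: measure how long a block of work takes and
// print the elapsed time in milliseconds, returning the block's result.
func timed<T>(_ label: String, _ work: () -> T) -> T {
    let start = CFAbsoluteTimeGetCurrent()
    let result = work()
    let elapsedMs = (CFAbsoluteTimeGetCurrent() - start) * 1000
    print("\(label) took \(String(format: "%.2f", elapsedMs)) ms")
    return result
}

// Example usage with a stand-in computation:
let sum = timed("sum") { (0..<1_000).reduce(0, +) }
```

Wrapping the note-matching loop in something like this inside the timer callback would show whether its cost correlates with the audible glitches.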

punkbit
  • It looks like the problem is caused by the input or output devices. I'm not yet able to specify them correctly. I found that the cracks only appear when I use AirPods. Maybe AudioKit switches to the AirPods mic? – lololarumpel Jun 06 '20 at 16:54
  • @lololarumpel you'll have to disable it in the settings and then test without it, but it seems to me that you're hitting a common issue in audio programming, or real-time audio programming. I'd advise following my suggestions, as you'll certainly find evidence that what's generated is "audio glitches": if you compute everything on the main thread and try to interact with the UI, you'll hear it straight away (that's one way to replicate it). You can read about some rules of audio dev here (http://www.rossbencina.com/code/real-time-audio-programming-101-time-waits-for-nothing) – punkbit Jun 06 '20 at 21:04