
I have code that records the screen along with audio from the microphone. When I call RPScreenRecorder.shared().startCapture, the capture handler delivers sample buffers containing video and audio data. There are three buffer types: video, audio from the app, and audio from the microphone. On the first launch of the app everything works as expected and I receive all three buffer types, including microphone audio. However, if I close the app and open it again, I receive every type except the microphone audio. I really need this audio.

import Foundation
import AVFoundation
import ReplayKit

class ScreenRecorder {

    public static let shared = ScreenRecorder()
    private var isRecording = false


    func startRecording() {

        guard !isRecording else {
            return
        }

        do {
            try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.ambient)
        } catch let error as NSError {
            print(error.localizedDescription)
        }

        do {
            try AVAudioSession.sharedInstance().setActive(true)
        } catch let error as NSError {
            print(error.localizedDescription)
        }

        RPScreenRecorder.shared().isMicrophoneEnabled = true
        RPScreenRecorder.shared().startCapture(handler: { (cmSampleBuffer, rpSampleBufferType, error) in

            switch rpSampleBufferType {
            case .video:
                print("___video")
            case .audioApp:
                print("___audio app")
            case .audioMic:
                print("___audio mic")
            @unknown default:
                break
            }
        }) { error in
            if let error = error {
                print(error.localizedDescription)
            } else {
                self.isRecording = true
            }
        }
    }

    func stopRecording() {
        RPScreenRecorder.shared().stopCapture { error in
            print("stop capture")
            self.isRecording = false
        }
    }
}

I expect to receive all three types of sample buffers, including the microphone audio, even after the app has been closed and reopened.
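
For reference, one way I could narrow this down (a minimal diagnostic sketch; logCaptureState() is just a hypothetical helper I would call right before startCapture on the first launch and again after relaunching the app, then compare the output):

import AVFoundation
import ReplayKit

// Hypothetical helper: logs microphone permission and ReplayKit recorder state
// so the first launch can be compared with a relaunch.
func logCaptureState() {
    let session = AVAudioSession.sharedInstance()
    let recorder = RPScreenRecorder.shared()

    // Microphone access status for the audio session
    switch session.recordPermission {
    case .granted:      print("record permission: granted")
    case .denied:       print("record permission: denied")
    case .undetermined: print("record permission: undetermined")
    @unknown default:   print("record permission: unknown")
    }

    // ReplayKit state before starting a new capture
    print("recorder available: \(recorder.isAvailable)")
    print("already recording: \(recorder.isRecording)")
    print("microphone enabled flag: \(recorder.isMicrophoneEnabled)")
}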
