I'm struggling with an issue: I have an AVCaptureSession with a preview layer, and I also want to provide haptic feedback when the user taps buttons. If I add an audio input to my AVCaptureSession, I can't produce haptic feedback at all. I tried adding the audio input right before recording starts and removing it immediately after it stops, but modifying the capture session configuration (which I do on a serial queue) causes the video preview to hiccup (it's disrupted for a fraction of a second). And I still have no idea how Snapchat and Instagram pull this off. One of my guesses was that they somehow configure AVAudioSession, but I couldn't figure it out.

My capture session initialization is pretty standard, so I won't paste it all. The notable parts are that I set

    captureSession.automaticallyConfiguresApplicationAudioSession = false

and modify the shared AVAudioSession like so:

    AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord,
                                                mode: AVAudioSessionModeVideoRecording,
                                                options: [.mixWithOthers])

Here is my attempt to toggle the audio input:

func addAudioInput() {
    self.sessionQueue.async { [unowned self] in
        self.captureSession.beginConfiguration()
        // Grab the default microphone and attach it if the session accepts it.
        let microphone = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)
        if let audioInput = try? AVCaptureDeviceInput(device: microphone), self.captureSession.canAddInput(audioInput) {
            self.captureSession.addInput(audioInput)
        }
        self.captureSession.commitConfiguration()
    }
}

func removeAudioInput() {
    self.sessionQueue.async { [unowned self] in
        // Find the microphone input among the session's inputs and detach it.
        if let audioInput = self.captureSession.inputs.first(where: { ($0 as? AVCaptureDeviceInput)?.device.deviceType == .builtInMicrophone }) as? AVCaptureDeviceInput {
            self.captureSession.beginConfiguration()
            self.captureSession.removeInput(audioInput)
            self.captureSession.commitConfiguration()
        }
    }
}
Dan Karbayev

2 Answers

I feel dumb for not figuring this out sooner. The solution was to create two AVCaptureSessions: one for video capture (session 1) and one for audio capture (session 2). Session 1 is always running, so it keeps feeding the preview layer. Session 2 is started only when recording begins and stopped immediately after it ends. Each session has its own AVCaptureOutput, and both outputs deliver their sample buffers to the same delegate, which in turn uses an AVAssetWriter to write those buffers to a video file. That's all.
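Since the comments ask for sample code, here is a minimal sketch of the two-session approach described above. All names here (TwoSessionCapture, sessionQueue, etc.) are my own, not from the answer; the AVAssetWriter setup is omitted for brevity; and it uses the current `AVCaptureDevice.default(for:)` API rather than the Swift 3 names in the question:

```swift
import AVFoundation

final class TwoSessionCapture: NSObject,
                               AVCaptureVideoDataOutputSampleBufferDelegate,
                               AVCaptureAudioDataOutputSampleBufferDelegate {
    let videoSession = AVCaptureSession()   // always running, feeds the preview layer
    let audioSession = AVCaptureSession()   // started only while recording
    private let sessionQueue = DispatchQueue(label: "capture.session")
    private let dataQueue = DispatchQueue(label: "capture.data")

    func setup() {
        // Session 1: camera input + video data output.
        if let camera = AVCaptureDevice.default(for: .video),
           let cameraInput = try? AVCaptureDeviceInput(device: camera),
           videoSession.canAddInput(cameraInput) {
            videoSession.addInput(cameraInput)
        }
        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self, queue: dataQueue)
        if videoSession.canAddOutput(videoOutput) { videoSession.addOutput(videoOutput) }

        // Session 2: microphone input + audio data output.
        if let mic = AVCaptureDevice.default(for: .audio),
           let micInput = try? AVCaptureDeviceInput(device: mic),
           audioSession.canAddInput(micInput) {
            audioSession.addInput(micInput)
        }
        let audioOutput = AVCaptureAudioDataOutput()
        audioOutput.setSampleBufferDelegate(self, queue: dataQueue)
        if audioSession.canAddOutput(audioOutput) { audioSession.addOutput(audioOutput) }

        // Only the video session runs up front, so haptics keep working.
        sessionQueue.async { self.videoSession.startRunning() }
    }

    func startRecording() { sessionQueue.async { self.audioSession.startRunning() } }
    func stopRecording()  { sessionQueue.async { self.audioSession.stopRunning() } }

    // Buffers from both outputs land here; hand them to an AVAssetWriter,
    // appending to the video or audio writer input depending on the media type.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // assetWriterInput.append(sampleBuffer)
    }
}
```

Because the audio session is never attached to the preview session, starting and stopping it doesn't touch the video session's configuration, so the preview never hiccups.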

Dan Karbayev
  • If you have a code sample, please update the answer, dude – Suresh Velusamy Sep 09 '17 at 16:56
  • Please share sample code on how to do this; I'm just a little curious how to set up the audio capture session. – Asif Bilal Mar 14 '19 at 13:49
  • I did exactly this. Create two capture session vars at the top of the file, one for audio, one for the camera. On view load, set up both capture sessions' inputs and outputs accordingly, but only start the camera session; on the record-video event, start the audio session. When you're done, stop it. Edit: this is also how you get just a green dot in the nav bar on iOS 14. Having one combined session gave me a green dot that turned yellow, then back to green. – blackops Oct 18 '20 at 18:55

I created two sessions as suggested here, one for video streaming and another for recording with audio. But my application stops receiving camera frames when it starts recording at 1080p camera resolution.

Any suggestions to fix this?

Thanks in advance. Sharif.