
I'm trying to build an app that captures frames from the camera at a specific frame rate, processes them with OpenCV, and then saves them to the device.

What I'm stuck on at the moment is that AVCaptureVideoDataOutputSampleBufferDelegate doesn't appear to respect the AVCaptureDevice.activeVideoMinFrameDuration or AVCaptureDevice.activeVideoMaxFrameDuration settings.

captureOutput fires far more often than the 2 frames per second that the settings above should produce.
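
For reference, this is roughly how I'm measuring the delivered rate (a minimal diagnostic sketch that just logs the gap between presentation timestamps in the delegate instead of processing the frame):

var lastTimestamp: CMTime?

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // Log the interval between consecutive frames to see the rate actually being delivered
    let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    if let last = lastTimestamp {
        let delta = CMTimeGetSeconds(CMTimeSubtract(timestamp, last))
        print("Frame interval: \(delta)s (~\(1.0 / delta) fps)")
    }
    lastTimestamp = timestamp
}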

Do you happen to know how one could achieve this, with or without the delegate?

ViewController:

override func viewDidLoad() {
    super.viewDidLoad()

}

override func viewDidAppear(animated: Bool) {
    super.viewDidAppear(animated)
    setupCaptureSession()
}

func setupCaptureSession() {

    let session : AVCaptureSession = AVCaptureSession()
    session.sessionPreset = AVCaptureSessionPreset1280x720

    let videoDevices : [AVCaptureDevice] = AVCaptureDevice.devices() as! [AVCaptureDevice]

    for device in videoDevices {
        if device.position == AVCaptureDevicePosition.Back {
            let captureDevice : AVCaptureDevice = device

            do {
                try captureDevice.lockForConfiguration()
                // Request 2 fps (a frame duration of 1/2 second per frame)
                captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
                captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
                captureDevice.unlockForConfiguration()

                let input : AVCaptureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)

                if session.canAddInput(input) {
                    session.addInput(input)
                }

                let output : AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()

                let dispatch_queue : dispatch_queue_t = dispatch_queue_create("streamoutput", nil)
                output.setSampleBufferDelegate(self, queue: dispatch_queue)

                session.addOutput(output)

                session.startRunning()

                let previewLayer = AVCaptureVideoPreviewLayer(session: session)
                previewLayer.connection.videoOrientation = .LandscapeRight

                let previewBounds : CGRect = CGRectMake(0,0,self.view.frame.width/2,self.view.frame.height+20)
                previewLayer.backgroundColor = UIColor.blackColor().CGColor
                previewLayer.frame = previewBounds
                previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                self.imageView.layer.addSublayer(previewLayer)

                self.previewMat.frame = CGRectMake(previewBounds.width, 0, previewBounds.width, previewBounds.height)

            } catch _ {

            }
            break
        }
    }

}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    self.wrapper.processBuffer(self.getUiImageFromBuffer(sampleBuffer), self.previewMat)
}
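
In case it's relevant, getUiImageFromBuffer is just a standard CMSampleBuffer-to-UIImage conversion, roughly along these lines (a simplified sketch of what I'm doing; the real version hands the result to the OpenCV wrapper):

func getUiImageFromBuffer(sampleBuffer: CMSampleBuffer) -> UIImage {
    // Video data output sample buffers carry a pixel buffer; simplified here with a force-unwrap
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    let ciImage = CIImage(CVPixelBuffer: pixelBuffer)

    // Render the CIImage to a CGImage and wrap it as a UIImage
    let context = CIContext()
    let cgImage = context.createCGImage(ciImage, fromRect: ciImage.extent)
    return UIImage(CGImage: cgImage)
}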

1 Answer

So I've figured out the problem.

In the header comments in AVCaptureDevice.h, above the activeVideoMinFrameDuration property, it states:

On iOS, the receiver's activeVideoMinFrameDuration resets to its default value under the following conditions:

  • The receiver's activeFormat changes
  • The receiver's AVCaptureDeviceInput's session's sessionPreset changes
  • The receiver's AVCaptureDeviceInput is added to a session

The last bullet point was causing my problem, so reordering things as follows solved it for me:

        do {

            let input : AVCaptureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)

            if session.canAddInput(input) {
                session.addInput(input)
            }

            // Configure the frame rate AFTER the input has been added to the session,
            // otherwise activeVideoMin/MaxFrameDuration reset to their default values.
            try captureDevice.lockForConfiguration()
            captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
            captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
            captureDevice.unlockForConfiguration()

            let output : AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()

            let dispatch_queue : dispatch_queue_t = dispatch_queue_create("streamoutput", nil)
            output.setSampleBufferDelegate(self, queue: dispatch_queue)

            session.addOutput(output)
The KEY part here is the comment about when the value RESETS! I too was getting bitten because I was setting the frame rate BEFORE adding the device to a session! – EricWasTaken Oct 21 '17 at 18:05