I have two camera feeds coming into an OS X app and I am trying to save them using AVCaptureMovieFileOutput. It doesn't take long before the videos are out of sync: after a one-minute test they can be off by one to five seconds, and after an hour-long test they are off by around 20 seconds. I feel there must be some simple solution to keeping both outputs in sync. We have tried using the same device for both sessions and outputs and we get the same issue. We also tried forcing the frame rate down to 15 fps, with no luck.

Setting Outputs

func assignDeviceToPreview(captureSession: AVCaptureSession, device: AVCaptureDevice, previewView: NSView, index: Int){

    captureSession.stopRunning()

    captureSession.beginConfiguration()

    //clearing out old inputs
    for input in captureSession.inputs {
        let i = input as! AVCaptureInput
        captureSession.removeInput(i)
    }

    let output = self.outputs[index]
    output.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

    //removing old outputs
    for o in captureSession.outputs{
        if let oc = o as? AVCaptureStillImageOutput{
            captureSession.removeOutput(oc)
            print("removed image out")
        }
    }

    //Adding input
    do {

        try captureSession.addInput(AVCaptureDeviceInput(device:device))

        let camViewLayer = previewView.layer!
        camViewLayer.backgroundColor = CGColorGetConstantColor(kCGColorBlack)

        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.frame = camViewLayer.bounds
        previewLayer.autoresizingMask = [.LayerWidthSizable, .LayerHeightSizable]

        camViewLayer.addSublayer(previewLayer)

        let overlayPreview = overlayPreviews[index]
        overlayPreview.frame.origin = CGPoint.zero

        previewView.addSubview(overlayPreview)

        //adding output
        captureSession.addOutput(output)

        if captureSession == session2{
            //the second capture session also carries the audio track
            let audio = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)
            try captureSession.addInput(AVCaptureDeviceInput(device: audio))
        }

    } catch {
        print("Failed to add webcam as AV input")
    }

    captureSession.commitConfiguration()
    captureSession.startRunning()
}

Start Recording

func startRecording(){

    startRecordingTimer()

    let base = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0]
    let appFolder = "Sessions"
    let sessionFolder = "session_" + session.UUID

    let path = base+"/"+appFolder+"/"+sessionFolder

    do{
        try NSFileManager.defaultManager().createDirectoryAtPath(path, withIntermediateDirectories: true, attributes: nil)
    }catch{
        print("issue creating folder")
    }

    for fileOutput in fileOutputs{

        let fileName = "cam\(String(fileOutputs.indexOf(fileOutput)!))" + ".mov"

        let fileURL = NSURL.fileURLWithPathComponents([path, fileName])
        fileURLs.append(fileURL!)
        print(fileURL?.absoluteString)

        var captureConnection = fileOutput.connections.first as? AVCaptureConnection
        //cap this connection at 15 fps by pinning both frame-duration limits to 1/15 s
        captureConnection!.videoMinFrameDuration = CMTimeMake(1, 15)
        captureConnection!.videoMaxFrameDuration = CMTimeMake(1, 15)

        if fileOutput == movieFileOutput1{
            fileOutput.setOutputSettings([AVVideoScalingModeKey: AVVideoScalingModeResize, AVVideoCodecKey: AVVideoCodecH264, AVVideoWidthKey: 1280, AVVideoHeightKey: 720], forConnection: captureConnection)
        }else{
            fileOutput.setOutputSettings([AVVideoScalingModeKey: AVVideoScalingModeResizeAspect, AVVideoCodecKey: AVVideoCodecH264, AVVideoWidthKey: 640, AVVideoHeightKey: 360], forConnection: captureConnection)
        }
        //re-fetch the connection to confirm the settings were applied
        captureConnection = fileOutput.connections.first as? AVCaptureConnection
        print(fileOutput.outputSettingsForConnection(captureConnection))

        fileOutput.startRecordingToOutputFileURL(fileURL, recordingDelegate: self)

        print("start recording")
    }

}
Skyler Lauren

1 Answer


For precise timing control, I think you'll need to look into using the lower-level AVAssetWriter API. It lets you control the writing and timing of individual frames.

Using AVAssetWriter.startSession(atSourceTime: CMTime) you can precisely control when recording begins for each camera.
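For illustration, here is a minimal sketch of that idea (untested, Swift 3 syntax; the helper and URL names are placeholders, and it assumes the capture buffers are timestamped against the host clock):

    import AVFoundation
    import CoreMedia

    // One AVAssetWriter per camera; both sessions are started at the same
    // source time so their timelines share a common origin.
    func makeWriter(url: URL) throws -> AVAssetWriter {
        let writer = try AVAssetWriter(outputURL: url, fileType: AVFileTypeQuickTimeMovie)
        let input = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: [
            AVVideoCodecKey: AVVideoCodecH264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720
        ])
        input.expectsMediaDataInRealTime = true
        writer.add(input)
        return writer
    }

    func startBothWriters(url1: URL, url2: URL) throws {
        let writers = [try makeWriter(url: url1), try makeWriter(url: url2)]

        // Take a single timestamp from the host clock and hand it to both
        // writers, so neither file's timeline starts ahead of the other.
        let startTime = CMClockGetTime(CMClockGetHostTimeClock())
        for writer in writers {
            writer.startWriting()
            writer.startSession(atSourceTime: startTime)
        }
    }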

During the writing process, you can use an AVCaptureVideoDataOutputSampleBufferDelegate to manipulate each CMSampleBuffer as it's generated and adjust its timing info, keeping the two videos in sync. See https://developer.apple.com/reference/coremedia/1669345-cmsamplebuffer for reference on adjusting the timing portion of a CMSampleBuffer.
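Something along these lines could work (again untested, Swift 3 syntax; `writerInput` and `sessionStartTime` are hypothetical properties you'd configure elsewhere). Each buffer's presentation timestamp is rebased onto a shared zero-based timeline with CMSampleBufferCreateCopyWithNewTiming before it's appended:

    import AVFoundation
    import CoreMedia

    // Hypothetical delegate for one camera; a second instance would feed the
    // other writer. With this rebasing, startSession(atSourceTime:) would be
    // called with kCMTimeZero on both writers.
    class CameraWriter: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

        var writerInput: AVAssetWriterInput!   // configured elsewhere
        var sessionStartTime = kCMTimeInvalid  // same reference time for both cameras

        func captureOutput(_ captureOutput: AVCaptureOutput!,
                           didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                           from connection: AVCaptureConnection!) {
            guard writerInput.isReadyForMoreMediaData else { return }

            // Read the buffer's original timing, then shift its presentation
            // timestamp onto the shared timeline.
            var timing = CMSampleTimingInfo()
            CMSampleBufferGetSampleTimingInfo(sampleBuffer, 0, &timing)
            timing.presentationTimeStamp = CMTimeSubtract(timing.presentationTimeStamp, sessionStartTime)

            var retimed: CMSampleBuffer?
            CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sampleBuffer, 1, &timing, &retimed)

            if let retimed = retimed {
                writerInput.append(retimed)
            }
        }
    }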

That said, I've never tried this and it's not a certainty that it will work, but I'm confident you'll get close to what you're trying to achieve if you go down this path.

Tim Bull
  • Thanks. I will look into this and get back to you. – Skyler Lauren Sep 19 '16 at 13:12
  • 1
    Thank you so much. Took a lot of Googleing and trial and error but I did get it to a point where I can manual control where I write an individual frame and based that calculation on real time instead of what the buffer thinks is correct. 1 hr 23 min and 58 seconds for both videos. Now off to figure out the audio portion. – Skyler Lauren Sep 27 '16 at 18:48
  • Great, glad that helped! – Tim Bull Sep 28 '16 at 18:11
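
For reference, here is a minimal sketch of the wall-clock approach described in the comment above (untested, Swift 3 syntax, all names hypothetical): each frame's presentation time is derived from elapsed real time rather than from the buffer's own timestamp.

    import Foundation
    import CoreMedia

    // Record the shared wall-clock start of the recording.
    let recordingStart = Date()

    // Derive a frame's presentation time from elapsed real time so that both
    // writers advance on the same real-world timeline, regardless of what the
    // individual buffers report.
    func presentationTime(for frameDate: Date) -> CMTime {
        let elapsed = frameDate.timeIntervalSince(recordingStart)
        return CMTimeMakeWithSeconds(elapsed, 600) // 600 is a common video timescale
    }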