
I've been trying to get my videos mirrored for a while now, but I can't seem to fix it. I've also tried the following approach, taken from a similar question:

    if let connection = movieFileOutput.connection(with: .video) {
        if connection.isVideoMirroringSupported {
            connection.isVideoMirrored = self.videoDeviceInput?.device.position == .front
        }
    }

    if let connection = videoDataOutput.connection(with: .video) {
        if connection.isVideoMirroringSupported {
            connection.isVideoMirrored = self.videoDeviceInput?.device.position == .front
        }
    }

The video gets mirrored in the app, but when I download the video to my Photos app it's unmirrored again. How do I tackle this problem?

How my overall code looks (a stripped-down sketch of these steps follows the list):

  1. automaticallyConfiguresApplicationAudioSession is set to false.
  2. Configure the camera by getting the device input and adding it to the AVCaptureSession.
  3. Set up the movie output, e.g. setting the bitrate and, for iPads, configuring the orientation. Here I should also set the .isVideoMirrored property.
  4. Set the session preset.
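
A stripped-down sketch of those steps (illustrative only; bitrate, iPad orientation handling and error handling are omitted, and captureSession, movieFileOutput and videoDeviceInput are properties of my view controller):

    captureSession.beginConfiguration()

    // Step 1: manage the audio session myself
    captureSession.automaticallyConfiguresApplicationAudioSession = false

    // Step 2: camera input
    if let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front),
       let videoInput = try? AVCaptureDeviceInput(device: camera),
       captureSession.canAddInput(videoInput) {
        captureSession.addInput(videoInput)
        videoDeviceInput = videoInput
    }

    // Step 3: movie output, then mirroring on its video connection
    if captureSession.canAddOutput(movieFileOutput) {
        captureSession.addOutput(movieFileOutput)
    }
    if let connection = movieFileOutput.connection(with: .video),
       connection.isVideoMirroringSupported {
        connection.isVideoMirrored = videoDeviceInput?.device.position == .front
    }

    // Step 4: session preset
    captureSession.sessionPreset = .high

    captureSession.commitConfiguration()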

Addition of audio session setup:

    guard let audioDevice = AVCaptureDevice.default(
      .builtInMicrophone,
      for: .audio,
      position: .unspecified
    ) else { return }

    guard let audioInput = try? AVCaptureDeviceInput(device: audioDevice),
      captureSession.canAddInput(audioInput) else { return }

    guard captureSession.inputs.contains(audioInput) == false else { return }

    captureSession.addInput(audioInput)
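
Since automaticallyConfiguresApplicationAudioSession is false, the shared AVAudioSession also has to be configured and activated by the app itself. This isn't my exact code, just a rough sketch of that kind of setup (the category, mode and options are placeholders):

    func activateAudioSession() {
        let session = AVAudioSession.sharedInstance()
        do {
            // Placeholder category/mode/options, not my real values
            try session.setCategory(.playAndRecord,
                                    mode: .videoRecording,
                                    options: [.defaultToSpeaker, .allowBluetooth])
            try session.setActive(true)
        } catch {
            print("Failed to activate audio session: \(error)")
        }
    }
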
Frank

1 Answer


I used this workaround to solve the same situation; check it out, you may find it useful:

extension CameraViewController: AVCaptureFileOutputRecordingDelegate {
    func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
        if let error = error {
            print("Error recording movie: \(error.localizedDescription)")
        } else {
            processMovie()
        }
    }

    func processMovie() {
        // Re-wrap the recorded movie in a composition so the video track's
        // preferredTransform can be overridden before exporting.
        let asset = AVAsset(url: CameraViewController.movieURL)
        let composition = AVMutableComposition()
        let assetVideoTrack = asset.tracks(withMediaType: .video).last!
        let compositionVideoTrack = composition.addMutableTrack(withMediaType: AVMediaType.video,
                                                                preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))
        try? compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero, duration: asset.duration),
                                                    of: assetVideoTrack,
                                                    at: CMTime.zero)
        if self.currentCameraPosition == .rear {
            // Rear camera: keep the original transform.
            compositionVideoTrack?.preferredTransform = assetVideoTrack.preferredTransform
        }
        if self.currentCameraPosition == .front {
            // Front camera: flip horizontally and rotate so the exported file
            // plays back mirrored, matching what the user saw in the preview.
            compositionVideoTrack?.preferredTransform = CGAffineTransform(scaleX: -1.0, y: 1.0).rotated(by: CGFloat(Double.pi / 2))
        }

        // Export the composition; the mirrored transform is baked into the
        // exported file's metadata.
        if let exporter = AVAssetExportSession(asset: composition,
                                               presetName: AVAssetExportPresetHighestQuality) {
            exporter.outputURL = CameraViewController.exportMovieURL
            exporter.outputFileType = AVFileType.mov
            exporter.shouldOptimizeForNetworkUse = true
            exporter.exportAsynchronously {
                DispatchQueue.main.async {
                    self.performSegue(withIdentifier: "ShowVideo", sender: nil)
                }
            }
        }
    }
}
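
Once the export completes, make sure it is the exported file at CameraViewController.exportMovieURL (not the raw recording) that gets saved to the photo library, otherwise Photos will still show the unmirrored original. A minimal sketch of that last step using PhotosKit (not part of my original code; it assumes photo-library add permission has already been granted):

import Photos

func saveExportedMovieToPhotos() {
    // Sketch only: save the exported, mirrored file to the user's library.
    // Requires the NSPhotoLibraryAddUsageDescription key in Info.plist.
    PHPhotoLibrary.shared().performChanges({
        _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(
            atFileURL: CameraViewController.exportMovieURL)
    }) { _, error in
        if let error = error {
            print("Could not save video to Photos: \(error.localizedDescription)")
        }
    }
}
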
Diego Jiménez
    Thanks! I will investigate this in our project. Marking this as the answer since I feel it will point me in the right direction. – Frank Jan 20 '22 at 13:46