
I'm trying to get the last frame from a video. The last frame, not the last second (because I have very fast videos; one second can contain several different scenes).

I've written this code for testing:

private func getLastFrame(from item: AVPlayerItem) -> UIImage? {
    let imageGenerator = AVAssetImageGenerator(asset: item.asset)

    imageGenerator.requestedTimeToleranceAfter = kCMTimeZero
    imageGenerator.requestedTimeToleranceBefore = kCMTimeZero

    let composition = AVVideoComposition(propertiesOf: item.asset)
    let time = CMTimeMakeWithSeconds(item.asset.duration.seconds, composition.frameDuration.timescale)

    do {
        let cgImage = try imageGenerator.copyCGImage(at: time, actualTime: nil)
        return UIImage(cgImage: cgImage)
    } catch {
        print("\(error)")
        return nil
    }
}

But I always receive this error when I try to execute it:

Domain=AVFoundationErrorDomain Code=-11832 "Cannot Open" UserInfo={NSUnderlyingError=0x170240180 {Error Domain=NSOSStatusErrorDomain Code=-12431 "(null)"}, NSLocalizedFailureReason=This media cannot be used., NSLocalizedDescription=Cannot Open}

If I remove the requestedTimeTolerance settings (so they stay at their default, infinite value), everything is okay, but I always receive a brighter image than what I see in the video (maybe because it is not the latest frame that was captured? Or does the CGImage → UIImage conversion have some troubles?)

Questions:

  1. Why do I receive an error when zero tolerance is specified? How can I get exactly the last frame?
  2. Why might the captured image be brighter than in the video? For example, if I write this code:

    self.videoLayer.removeFromSuperlayer()
    self.backgroundImageView.image = getLastFrame(from: playerItem)
    

I see "brightness jump" (video was darker, image is brighter).

Update 1

I found a related issue: AVAssetImageGenerator fails at copying image, but that question is not solved.
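
Update 2 (just an idea I want to try, not verified): maybe with zero tolerance the timestamp equal to the full duration simply has no frame behind it, so I plan to request a time one frame earlier instead. A rough sketch of what I mean (the function name and the minFrameDuration-based offset are only my guesses):

private func getFrameBeforeEnd(from item: AVPlayerItem) -> UIImage? {
    let asset = item.asset
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.requestedTimeToleranceAfter = kCMTimeZero
    imageGenerator.requestedTimeToleranceBefore = kCMTimeZero

    // Guess: the timestamp equal to the full duration may have no frame behind it
    // when zero tolerance is requested, so step back one frame from the end.
    guard let track = asset.tracks(withMediaType: AVMediaTypeVideo).first else { return nil }
    let time = CMTimeSubtract(asset.duration, track.minFrameDuration)

    do {
        let cgImage = try imageGenerator.copyCGImage(at: time, actualTime: nil)
        return UIImage(cgImage: cgImage)
    } catch {
        print("\(error)")
        return nil
    }
}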
