
I am trying to understand how CMTime and fps work in a video file. I want to present each and every frame of a video in an AVPlayer using a for loop. I know this task can easily be done with AVPlayer's play method, but I want to know how exactly the frames are shown. I created a for loop that tries to present each frame one by one by repeatedly calling AVPlayer's seekToTime: method. I was able to develop a solution, but it's not showing all the frames and the video looks jerky.

This is my code:

for(float t=0; t < asset.duration.value; t++)
{
    CMTime tt = CMTimeMake(t, asset.duration.timescale);
    [self.player seekToTime:tt];
    NSLog(@"%lld, %d",tt.value, tt.timescale);
}

Here, player is the instance of AVPlayer and asset is the video asset whose frames I am trying to present. I have also tried using CMTimeMakeWithSeconds(t, asset.duration.timescale), but that didn't work either.

Please give your suggestion. Thank you.

maven25

1 Answer


Frames per second is generally not a constant in a video file - frames happen when they happen. If you want to find out when that is you can inspect the frames' presentation timestamps using the AVAssetReader and AVAssetReaderTrackOutput classes:

let reader = try! AVAssetReader(asset: asset)

let videoTrack = asset.tracksWithMediaType(AVMediaTypeVideo)[0]
let trackReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings:nil)

reader.addOutput(trackReaderOutput)
reader.startReading()

while let sampleBuffer = trackReaderOutput.copyNextSampleBuffer() {
    // pts shows when this frame is presented
    // relative to the start of the video file
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
}
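
If you want to keep those timestamps around (for example, to drive a scrubber), you can collect them as the reader runs. A sketch in the same Swift 2-era syntax as the snippet above; `frameTimes` is a name I've made up:

```swift
// Collect each frame's presentation timestamp as the reader
// delivers sample buffers (`frameTimes` is a hypothetical name).
var frameTimes: [CMTime] = []

while let sampleBuffer = trackReaderOutput.copyNextSampleBuffer() {
    frameTimes.append(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
}
// frameTimes now lists when each frame appears. Note the reader
// returns frames in decode order, which can differ from
// presentation order when B-frames are present, so sort with
// CMTimeCompare if you need strict presentation order.
```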

In your code, you are incorrectly sampling the video timeline. Here's how you would sample it at 30 fps (as mentioned above, this will probably not correspond to actual frame boundaries):

for (CMTime t = kCMTimeZero; CMTimeCompare(t, asset.duration) < 0;  t = CMTimeAdd(t, CMTimeMake(1, 30))) {
    [player seekToTime:t];
}
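
Note also that plain seekToTime: is allowed to snap to a nearby keyframe, which adds to the jerkiness you're seeing. Passing zero tolerances forces a frame-accurate seek. A Swift sketch of the same 30 fps loop (frame-accurate seeks are slower, since the decoder may have to decode forward from the previous keyframe):

```swift
// Frame-accurate version of the sampling loop above
// (Swift 2-era syntax, matching the reader snippet).
var t = kCMTimeZero
while CMTimeCompare(t, asset.duration) < 0 {
    // Zero tolerances force the player to land exactly on t
    // instead of snapping to the nearest keyframe.
    player.seekToTime(t, toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)
    t = CMTimeAdd(t, CMTimeMake(1, 30))
}
```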

Rhythmic Fistman
  • I am trying to create a smooth video scrubber like the one in the iOS Photos app. I think AVAssetReader should do my work. Can you suggest some resources where I can start reading about AVAssetReader? The only resource I can find is on Apple's website, about processing individual frames while capturing video. I am looking to get individual frames for videos selected from the camera roll. Thanks. – maven25 Sep 25 '16 at 19:40