I have an app that needs to render frames from a video/movie into a CGBitmapContext with an arbitrary CGAffineTransform applied. I'd like a decent frame rate, at least 20 fps.
I've tried using an AVURLAsset with [AVAssetImageGenerator copyCGImageAtTime:actualTime:error:], and, as the documentation for that method warns, it's quite slow, sometimes dropping me down to 5 fps.
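For reference, here's roughly what I'm doing now, simplified. `movieURL`, `frameTime`, `bitmapContext`, `transform`, and the width/height all come from my own code:

```objc
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
AVAssetImageGenerator *generator =
    [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
// Zero tolerance so I get the exact frame, which presumably makes it slower.
generator.requestedTimeToleranceBefore = kCMTimeZero;
generator.requestedTimeToleranceAfter  = kCMTimeZero;

// This call decodes synchronously; it's where all the time goes.
CGImageRef image = [generator copyCGImageAtTime:frameTime
                                     actualTime:NULL
                                          error:NULL];
CGContextSaveGState(bitmapContext);
CGContextConcatCTM(bitmapContext, transform);
CGContextDrawImage(bitmapContext, CGRectMake(0, 0, width, height), image);
CGContextRestoreGState(bitmapContext);
CGImageRelease(image);
```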
What is a better way to do this? I'm thinking I could set up an AVPlayer with an AVPlayerLayer and then use [CALayer renderInContext:] with my transform. Would that work? Or does an AVPlayerLayer stop updating when it notices it isn't being displayed on screen?
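A sketch of what I have in mind (untested; `movieURL`, `bitmapContext`, `transform`, and the width/height are placeholders from my setup, and the player/layer would be created once while the render runs per frame):

```objc
// One-time setup:
AVPlayer *player = [AVPlayer playerWithURL:movieURL];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = CGRectMake(0, 0, width, height);
[player play];

// Per frame: snapshot the layer into my bitmap context with my transform.
// renderInContext: is a CALayer method; I'm not sure the layer's contents
// actually advance if it's never attached to an on-screen layer tree.
CGContextSaveGState(bitmapContext);
CGContextConcatCTM(bitmapContext, transform);
[playerLayer renderInContext:bitmapContext];
CGContextRestoreGState(bitmapContext);
```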
Any other suggestions?