
I have come to the conclusion that the video has to be rendered as a texture on a rectangle.

These are the steps I came up with to achieve that:

  1. Take the current frame from the video:
    • I've used AVPlayerItemVideoOutput and then called copyPixelBufferForItemTime.

Here is a simple ViewController that I've used to work on this (I can see that it is working, although the image view is not refreshed - I think I'm not using it correctly).

import UIKit
import AVFoundation
import CoreImage

class ViewController: UIViewController {

    var player: AVPlayer!
    var playerItem: AVPlayerItem!
    var output: AVPlayerItemVideoOutput!
    var displayLink: CADisplayLink!
    var context: CIContext!

    @IBOutlet weak var image: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()

        context = CIContext()
        let asset = AVURLAsset(URL: NSURL(string: "http://vevoplaylist-live.hls.adaptive.level3.net/vevo/ch1/appleman.m3u8")!, options: nil)
        asset.loadValuesAsynchronouslyForKeys(["tracks"]) {

            var error: NSError?
            let status = asset.statusOfValueForKey("tracks", error: &error)

            if status == AVKeyValueStatus.Loaded {
                // Ask the output for BGRA pixel buffers so each frame can be
                // converted to a CGImage (or, later, a Metal texture) directly.
                let settings: [String: AnyObject] = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
                let output = AVPlayerItemVideoOutput(pixelBufferAttributes: settings)
                let playerItem = AVPlayerItem(asset: asset)
                playerItem.addOutput(output)
                let player = AVPlayer(playerItem: playerItem)

                self.player = player
                self.playerItem = playerItem
                self.output = output

                // This completion handler can run on a background queue, and a
                // CADisplayLink has to be scheduled from the main run loop.
                dispatch_async(dispatch_get_main_queue()) {
                    self.displayLink = CADisplayLink(target: self, selector: Selector("handleTimer"))
                    self.displayLink.frameInterval = 1
                    self.displayLink.addToRunLoop(NSRunLoop.mainRunLoop(), forMode: NSRunLoopCommonModes)
                    // play() was missing: without it the player stays paused,
                    // no new frames arrive, and the image view never refreshes.
                    player.play()
                }
            } else {
                print("Failed to load the tracks: \(error)")
            }
        }
    }

    func handleTimer() {
        let itemTime = playerItem.currentTime()
        // Only copy a pixel buffer when the output actually has a new frame.
        guard output.hasNewPixelBufferForItemTime(itemTime) else { return }

        if let buffer = output.copyPixelBufferForItemTime(itemTime, itemTimeForDisplay: nil) {
            let ciImage = CIImage(CVPixelBuffer: buffer)
            let cgImage = context.createCGImage(ciImage, fromRect: ciImage.extent)
            // The display link fires on the main run loop, so the image view
            // can be updated directly here.
            self.image.image = UIImage(CGImage: cgImage)
        }
    }
}

  2. The second step is to use the pixel buffer to generate a texture.
    This step involves a lot of code, and the best samples I have found are the Ray Wenderlich tutorial http://www.raywenderlich.com/93997/ios-8-metal-tutorial-swift-part-3-adding-texture (for Swift 1.2) and the code sample from Warren Moore https://github.com/warrenm/slug-swift-metal/tree/master/SwiftMetalDemo (a rough sketch of what I mean follows below).
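
This is roughly what I have in mind for that step: a minimal sketch, in the same Swift 2-era syntax as the code above, that uses a CVMetalTextureCache to wrap the pixel buffer in an MTLTexture without a CPU copy. VideoTextureSource and makeTextureFromPixelBuffer are just placeholder names of mine, not code from the linked samples, and it assumes the kCVPixelFormatType_32BGRA output settings from the view controller.

import Metal
import CoreVideo

class VideoTextureSource {

    let device: MTLDevice
    var textureCache: CVMetalTextureCache?

    init(device: MTLDevice) {
        self.device = device
        // The cache hands out Metal textures that are backed directly by
        // the pixel buffers coming out of AVPlayerItemVideoOutput.
        CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)
    }

    func makeTextureFromPixelBuffer(pixelBuffer: CVPixelBuffer) -> MTLTexture? {
        guard let cache = textureCache else { return nil }

        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)

        var cvTexture: CVMetalTexture?
        // .BGRA8Unorm matches the kCVPixelFormatType_32BGRA attributes
        // requested from AVPlayerItemVideoOutput above.
        let status = CVMetalTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault, cache, pixelBuffer, nil,
            .BGRA8Unorm, width, height, 0, &cvTexture)

        guard let unwrapped = cvTexture where status == kCVReturnSuccess else { return nil }
        return CVMetalTextureGetTexture(unwrapped)
    }
}

The cache would be created once and reused for every frame; the texture returned for each pixel buffer could then be produced in handleTimer (instead of the CGImage conversion) and sampled by a fragment shader on the quad.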

I haven't found another way to do this (displaying a live-streamed video on a rectangle in Metal). Is there a better way?
