
AVPlayerLooper accepts a template AVPlayerItem and an AVQueuePlayer as setup parameters; it then internally manipulates the items of the queue, so the player's currentItem is constantly changing.

This works perfectly with AVPlayerLayer, which accepts the looped player as a parameter and just renders it. But how can I use it with AVPlayerItemVideoOutput, which has to be attached to a particular AVPlayerItem, when the player holds multiple items internally? How do I reproduce what AVPlayerLayer does internally?

AVPlayerLooper setup example from the docs:

NSString *videoFile = [[NSBundle mainBundle] pathForResource:@"example" ofType:@"mov"];
NSURL *videoURL = [NSURL fileURLWithPath:videoFile];

_playerItem = [AVPlayerItem playerItemWithURL:videoURL];
_player = [AVQueuePlayer queuePlayerWithItems:@[_playerItem]];
_playerLooper = [AVPlayerLooper playerLooperWithPlayer:_player templateItem:_playerItem];
_playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
_playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:_playerLayer];
[_player play];

This is how AVPlayerItemVideoOutput is supposed to be used:

[item addOutput:_videoOutput];
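
where _videoOutput is typically created with pixel-buffer attributes beforehand; the BGRA format below is just an illustrative choice, not something AVFoundation requires:

NSDictionary *attrs = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
_videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];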

The only workaround I came up with is to observe changes to the player's currentItem and, on each change, detach the video output from the old item and attach it to the new one, as in the example below. But this apparently breaks the gapless playback I'm trying to achieve.

- (void)observeValueForKeyPath:(NSString*)path
                      ofObject:(id)object
                        change:(NSDictionary*)change
                       context:(void*)context {
   if (context == currentItemContext) {
      AVPlayerItem* newItem = [change objectForKey:NSKeyValueChangeNewKey];
      AVPlayerItem* oldItem = [change objectForKey:NSKeyValueChangeOldKey];
      if(oldItem.status == AVPlayerItemStatusReadyToPlay) {
          [oldItem removeOutput:_videoOutput];
      }
      if(newItem.status == AVPlayerItemStatusReadyToPlay) {
          [newItem addOutput:_videoOutput];
      }
      [self removeItemObservers:oldItem];
      [self addItemObservers:newItem];
   }
}
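
For completeness, the observation itself is registered roughly like this (the currentItemContext pointer matches the context checked above):

static void *currentItemContext = &currentItemContext;

[_player addObserver:self
          forKeyPath:@"currentItem"
             options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
             context:currentItemContext];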

For more context: I'm trying to come up with a fix for Flutter's video_player plugin, https://github.com/flutter/flutter/issues/72878

The plugin's code can be found here: https://github.com/flutter/plugins/blob/172338d02b177353bf517e5826cf6a25b5f0d887/packages/video_player/video_player/ios/Classes/FLTVideoPlayerPlugin.m

nt4f04und

1 Answer


You can do this by subclassing AVQueuePlayer (yay OOP) and creating and adding AVPlayerItemVideoOutputs there, as needed. I've never seen multiple AVPlayerItemVideoOutputs before, but memory consumption seems reasonable and everything works.

#import <AVFoundation/AVFoundation.h>

@interface OutputtingQueuePlayer : AVQueuePlayer
@end

@implementation OutputtingQueuePlayer

// AVPlayerLooper inserts its looped item copies through this method, so each
// item gets a video output attached before it starts playing.
- (void)insertItem:(AVPlayerItem *)item afterItem:(nullable AVPlayerItem *)afterItem
{
    if (item.outputs.count == 0) {
        NSLog(@"Creating AVPlayerItemVideoOutput");
        AVPlayerItemVideoOutput *videoOutput = [[AVPlayerItemVideoOutput alloc] initWithOutputSettings:nil]; // or whatever
        [item addOutput:videoOutput];
    }
    [super insertItem:item afterItem:afterItem];
}
@end

The current output is accessed like so:

AVPlayerItemVideoOutput *videoOutput = _player.currentItem.outputs.firstObject;
CVPixelBufferRef pixelBuffer = [videoOutput copyPixelBufferForItemTime:_player.currentTime itemTimeForDisplay:nil];
// do something with pixelBuffer here
CVPixelBufferRelease(pixelBuffer);
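
In practice, when you pull frames periodically (on a display-link tick or in a texture callback), it's probably worth guarding the copy, since a new frame isn't guaranteed to be available every time; roughly:

AVPlayerItemVideoOutput *videoOutput = _player.currentItem.outputs.firstObject;
CMTime time = _player.currentTime;
if ([videoOutput hasNewPixelBufferForItemTime:time]) {
    CVPixelBufferRef pixelBuffer = [videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:nil];
    // use the frame, then release it
    CVPixelBufferRelease(pixelBuffer);
}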

The configuration then becomes:

_playerItem = [AVPlayerItem playerItemWithURL:videoURL];
_player = [OutputtingQueuePlayer queuePlayerWithItems:@[_playerItem]];
_playerLooper = [AVPlayerLooper playerLooperWithPlayer:_player templateItem:_playerItem];
_playerLayer = [AVPlayerLayer playerLayerWithPlayer:_player];
[self.view.layer addSublayer:_playerLayer];
[_player play];

Rhythmic Fistman