
I have made a video player that analyzes the audio and video tracks of the currently playing video in real time. The videos are stored on the iOS device (in the app's Documents directory).

This all works fine. I use an MTAudioProcessingTap to get at the audio samples and run an FFT on them, and I analyze the video by copying the pixel buffers at the current playback time (the AVPlayer's currentTime property). As I said, this works fine.

But now I want to support AirPlay. AirPlay itself is not difficult, but my taps stop working as soon as AirPlay is toggled and the video is playing on the Apple TV. Somehow the MTAudioProcessingTap stops processing and the pixel buffers all come back empty... I can't get at the data.

Is there any way to get at this data?

To get the pixel buffers, I fire an event every few milliseconds and retrieve the player's currentTime. Then:

CVPixelBufferRef imageBuffer = [videoOutput copyPixelBufferForItemTime:time itemTimeForDisplay:nil];
if (imageBuffer == NULL) {
    return; // no frame available for this item time
}
CVPixelBufferLockBaseAddress(imageBuffer, 0);

uint8_t *tempAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);

size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);

// ... analyze the pixel data here, while the buffer is still locked ...

CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
CVPixelBufferRelease(imageBuffer); // copyPixelBufferForItemTime: returns a retained buffer

where tempAddress points to the pixel data and videoOutput is an instance of AVPlayerItemVideoOutput.
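For reference, the output and the timer are wired up roughly like this (a sketch; the property names and the setUpVideoOutputForItem:/displayLinkFired: methods are just placeholders for my own code):

- (void)setUpVideoOutputForItem:(AVPlayerItem *)playerItem {
    // Attach an AVPlayerItemVideoOutput to the item so frames can be copied out
    NSDictionary *attributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
    [playerItem addOutput:self.videoOutput];

    // Drive the sampling from the display refresh instead of a plain timer
    CADisplayLink *displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(displayLinkFired:)];
    [displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)displayLinkFired:(CADisplayLink *)sender {
    CMTime time = self.player.currentTime;
    if ([self.videoOutput hasNewPixelBufferForItemTime:time]) {
        // run the copyPixelBufferForItemTime: snippet from above
    }
}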

For audio, I use:

AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];

// Create a processing tap for the input parameters
MTAudioProcessingTapCallbacks callbacks;

callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
callbacks.clientInfo = (__bridge void *)(self);
callbacks.init = init;
callbacks.prepare = prepare;
callbacks.process = process;
callbacks.unprepare = unprepare;
callbacks.finalize = finalize;

MTAudioProcessingTapRef tap;
OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                          kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
if (err || !tap) {
    NSLog(@"Unable to create the Audio Processing Tap");
    return;
}

inputParams.audioTapProcessor = tap;
CFRelease(tap); // the property retains the tap, so balance the Create above

// Create a new AVAudioMix and assign it to our AVPlayerItem
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[inputParams];
playerItem.audioMix = audioMix;
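The init/prepare/process/unprepare/finalize callbacks referenced above are plain C functions. For reference, a minimal process callback that just pulls the source samples through looks roughly like this (a sketch, error handling trimmed):

static void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
                    MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
                    CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
{
    // Pull the audio from the source; the samples land in bufferListInOut
    OSStatus status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                         flagsOut, NULL, numberFramesOut);
    if (status != noErr) {
        return;
    }
    // bufferListInOut now holds *numberFramesOut frames; feed them to the FFT here
}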

Regards, Niek


2 Answers


Unfortunately, in my experience it's not possible to get at the audio/video data during AirPlay: playback happens on the Apple TV, so the iOS device never has that information.

I had the same issue with getting SMPTE subtitle data out of the timedMetadata, which stops being reported during AirPlay.
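What you can at least do is detect when playback has moved to the Apple TV and pause the analysis. A sketch using AVPlayer's KVO-observable externalPlaybackActive property (iOS 6+); where you register the observer is up to you:

// e.g. in your player setup code:
[self.player addObserver:self
              forKeyPath:@"externalPlaybackActive"
                 options:NSKeyValueObservingOptionNew
                 context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"externalPlaybackActive"]) {
        BOOL external = [change[NSKeyValueChangeNewKey] boolValue];
        if (external) {
            // taps and pixel buffers go quiet here; stop the FFT/frame analysis
        }
    }
}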

CMash

Here is a solution:

This is how I implement AirPlay. I use this code only for audio in my app; I don't know if you can adapt it for video, but you can try ;)

In AppDelegate.m:

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {

    [RADStyle applyStyle];
    [radiosound superclass];
    [self downloadZip];

    // Configure the audio session for playback (the C calls below use the
    // older pre-iOS 7 AudioSession API alongside AVAudioSession)
    NSError *sessionError = nil;
    [[AVAudioSession sharedInstance] setDelegate:self];
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
    [[AVAudioSession sharedInstance] setActive:YES error:nil];

    UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);

    UInt32 audioRouteOverride = kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);

    // Start receiving remote-control (lock screen / AirPlay) events
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];

    return YES;
}

And if you use AirPlay, it's nice to implement the lock-screen controls: artwork, play/stop, title, etc.

In the DetailViewController of your player, use this code:

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];

    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:(self.saved)[@"image"]]];

    MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];

    if (imageData == nil) {
        // Fall back to a bundled image when the artwork download fails
        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"lockScreen.png"]];
        infoCenter.nowPlayingInfo = @{MPMediaItemPropertyTitle: saved[@"web"],
                                      MPMediaItemPropertyArtist: saved[@"title"],
                                      MPMediaItemPropertyArtwork: albumArt};
    } else {
        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:[UIImage imageWithData:imageData]];
        infoCenter.nowPlayingInfo = @{MPMediaItemPropertyTitle: saved[@"link"],
                                      MPMediaItemPropertyArtist: saved[@"title"],
                                      MPMediaItemPropertyArtwork: albumArt};
    }
}
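To make the lock-screen play/pause buttons actually do something, you also need the remote-control event handler in the same view controller. A minimal sketch (self.audioPlayer here is just a placeholder for your own player object):

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        switch (event.subtype) {
            case UIEventSubtypeRemoteControlPlay:
                [self.audioPlayer play];
                break;
            case UIEventSubtypeRemoteControlPause:
                [self.audioPlayer pause];
                break;
            case UIEventSubtypeRemoteControlTogglePlayPause:
                // toggle between play and pause here
                break;
            default:
                break;
        }
    }
}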

Hope this code can help you ;)

BlackSheep