I'm developing an iOS Obj-C app that uses the Google Cast framework.
I am trying to get the lock screen controls working via MPNowPlayingInfoCenter.
I have added the required background modes (audio) to my app to support backgrounding.
I am initialising the cast device manager like so:
self.chromeCastDeviceManager = [[GCKDeviceManager alloc] initWithDevice:self.selectedChromeCastDevice clientPackageName:[info objectForKey:@"CFBundleIdentifier"] ignoreAppStateNotifications:YES];
When I launch the media on the receiver, I try to set up the MPNowPlayingInfoCenter
once I get a successful playback response:
NSInteger result = [_castControlChannel play];
if (result != kGCKInvalidRequestID) {
    // play request was sent successfully
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    Class playingInfoCenter = NSClassFromString(@"MPNowPlayingInfoCenter");
    if (playingInfoCenter) {
        NSDictionary *songInfo = [NSDictionary dictionaryWithObjectsAndKeys:
                                  @"Test artist", MPMediaItemPropertyArtist,
                                  @"Test title", MPMediaItemPropertyTitle,
                                  @"Test Album", MPMediaItemPropertyAlbumTitle,
                                  nil];
        [[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:songInfo];
    }
}
I'm starting to believe that MPNowPlayingInfoCenter only provides lock screen
controls for media sessions that are played locally via AVKit/AVFoundation.
Am I correct in saying this?
If that is true, I could probably create an AVKit/AVFoundation player alongside the play call
to the Chromecast, then force that AVPlayer to be hidden and silent. I could then catch the
remote control events from the lock screen and redirect them to the Chromecast?
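For reference, here's a rough sketch of what I mean, assuming a bundled silent audio file (I've called it "silence.m4a" here) and hypothetical properties like self.silentPlayer — not tested, just the idea:

```objc
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>

- (void)startSilentPlaybackForLockScreen {
    // A playback-category audio session makes iOS treat the app as the
    // active "now playing" app, which enables lock screen controls.
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
    [[AVAudioSession sharedInstance] setActive:YES error:nil];

    // Play a silent, bundled asset so there is a real local media session.
    NSURL *silenceURL = [[NSBundle mainBundle] URLForResource:@"silence" withExtension:@"m4a"];
    self.silentPlayer = [AVPlayer playerWithURL:silenceURL]; // hypothetical property
    self.silentPlayer.volume = 0.0;
    [self.silentPlayer play];

    // Receive lock screen / remote control events.
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (BOOL)canBecomeFirstResponder {
    return YES; // required to receive remote control events
}

// Forward the lock screen controls to the Chromecast.
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type != UIEventTypeRemoteControl) return;
    switch (event.subtype) {
        case UIEventSubtypeRemoteControlPlay:
            [_castControlChannel play];
            break;
        case UIEventSubtypeRemoteControlPause:
            [_castControlChannel pause];
            break;
        default:
            break;
    }
}
```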
I assume the YouTube app on iOS does it this way? Neither Netflix nor Spotify supports the lock screen controls when casting to a Chromecast.