I'm using the following code to concatenate multiple AVURLAssets:
AVMutableComposition *movie = [AVMutableComposition composition];
CMTime offset = kCMTimeZero;
for (AVURLAsset *asset in assets) {
    AVMutableCompositionTrack *compositionVideoTrack = [movie addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *compositionAudioTrack = [movie addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
    AVAssetTrack *assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
    CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    NSError *error = nil;
    if (![compositionVideoTrack insertTimeRange:timeRange ofTrack:assetVideoTrack atTime:offset error:&error]) {
        NSLog(@"Error adding video track - %@", error);
    }
    if (![compositionAudioTrack insertTimeRange:timeRange ofTrack:assetAudioTrack atTime:offset error:&error]) {
        NSLog(@"Error adding audio track - %@", error);
    }
    offset = CMTimeAdd(offset, asset.duration);
}
The resulting composition plays through to the combined duration of all the original assets, and the audio plays correctly all the way through, but only the video from the first asset plays; it then pauses on its final frame for the remainder of playback.
Any thoughts on what I've done wrong?
Reordering the original assets makes no difference: whichever asset comes first, its video plays, and all of the audio plays.
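To rule out missing source tracks, a quick check over the same `assets` array (a debugging sketch, not part of the composition code above) could log each asset's track counts and duration before inserting:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: confirm every source asset actually has video and audio tracks.
for (AVURLAsset *asset in assets) {
    CFStringRef durationDesc = CMTimeCopyDescription(NULL, asset.duration);
    NSLog(@"%@ - video tracks: %lu, audio tracks: %lu, duration: %@",
          asset.URL.lastPathComponent,
          (unsigned long)[asset tracksWithMediaType:AVMediaTypeVideo].count,
          (unsigned long)[asset tracksWithMediaType:AVMediaTypeAudio].count,
          (__bridge NSString *)durationDesc);
    if (durationDesc) CFRelease(durationDesc);
}
```

(When I run this, every asset reports at least one video track, so the inserts themselves aren't being handed nil tracks.)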