I previously asked this question: Take snapshot of a UIView except some buttons, to learn how to export a picture with UIViews on top of it.
Now I would like to know how to export a video with UIViews on top of it. This is the code I am using:
func createFinalVideo() {
    let composition = AVMutableComposition()
    let vidAsset = AVURLAsset(URL: myURL, options: nil)

    // Get the video and audio tracks of the recorded asset
    let vtrack = vidAsset.tracksWithMediaType(AVMediaTypeVideo)
    let videoTrack: AVAssetTrack = vtrack[0]
    let vid_timerange = CMTimeRangeMake(kCMTimeZero, vidAsset.duration)
    let audioTrack = vidAsset.tracksWithMediaType(AVMediaTypeAudio)[0]

    let compositionVideoTrack: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))
    let compositionAudioTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID(kCMPersistentTrackID_Invalid))

    do {
        try compositionVideoTrack.insertTimeRange(vid_timerange, ofTrack: videoTrack, atTime: kCMTimeZero)
        try compositionAudioTrack.insertTimeRange(vid_timerange, ofTrack: audioTrack, atTime: kCMTimeZero)
    }
    catch let error as NSError {
        print("Failed to insert the time ranges: \(error)")
        return
    }
    compositionVideoTrack.preferredTransform = videoTrack.preferredTransform

    // Layer meant to hold the UIViews shown over the video
    // (note that it is never attached to parentlayer below)
    let contentLayer = CALayer()
    contentLayer.addSublayer(self.myTextView.layer)
    contentLayer.frame = CGRectMake(0, 0, self.view.bounds.width, self.view.bounds.height)

    // Watermark effect, just for testing
    let titleLayer = CATextLayer()
    titleLayer.string = "DO YOU HEAR THE PEOPLE SING?"
    titleLayer.fontSize = 18
    titleLayer.foregroundColor = UIColor.redColor().CGColor
    titleLayer.alignmentMode = kCAAlignmentCenter
    titleLayer.frame = CGRectMake(20, 10, self.view.bounds.width - 40, 20)
    titleLayer.displayIfNeeded()

    let parentlayer = CALayer()
    parentlayer.frame = CGRectMake(0, 0, self.view.bounds.width, self.view.bounds.height)
    parentlayer.addSublayer(self.playerLayer)
    parentlayer.addSublayer(titleLayer)
    self.view.layer.addSublayer(parentlayer)

    let layercomposition = AVMutableVideoComposition()
    layercomposition.frameDuration = CMTimeMake(1, 30)
    layercomposition.renderSize = CGSize(width: self.view.bounds.width, height: self.view.bounds.height)
    layercomposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: playerLayer, inLayer: parentlayer)

    // Instruction for the watermark
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration)
    let videotrack = composition.tracksWithMediaType(AVMediaTypeVideo)[0]
    let layerinstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videotrack)
    instruction.layerInstructions = [layerinstruction]
    layercomposition.instructions = [instruction]

    // Create a new file to receive the exported data
    let dirPaths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
    let docsDir = dirPaths[0] as NSString
    let movieFilePath = docsDir.stringByAppendingPathComponent("completeFinalMovie.mov")
    let movieDestinationUrl = NSURL(fileURLWithPath: movieFilePath)

    // Delete any leftover file from a previous export; on the first run there is
    // nothing to delete, so only try to remove the file if it actually exists
    if NSFileManager.defaultManager().fileExistsAtPath(movieFilePath) {
        do {
            try NSFileManager.defaultManager().removeItemAtPath(movieFilePath)
        }
        catch let error as NSError {
            print(error)
            return
        }
    }

    // Use AVAssetExportSession to export the composition
    let assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)!
    assetExport.outputFileType = AVFileTypeQuickTimeMovie
    assetExport.outputURL = movieDestinationUrl
    // Without this line the export session ignores layercomposition entirely
    assetExport.videoComposition = layercomposition
    assetExport.exportAsynchronouslyWithCompletionHandler({
        switch assetExport.status {
        case AVAssetExportSessionStatus.Failed:
            print("failed \(assetExport.error)")
        case AVAssetExportSessionStatus.Cancelled:
            print("cancelled \(assetExport.error)")
        default:
            print("Movie complete")
            NSOperationQueue.mainQueue().addOperationWithBlock({ () -> Void in
                // play video
                // self.playVideo(movieDestinationUrl!)
                UISaveVideoAtPathToSavedPhotosAlbum(movieFilePath, self, #selector(self.handleCompletionOfVideoToGallery), nil)
            })
        }
    })
}
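For completeness, handleCompletionOfVideoToGallery follows the three-argument signature that, as far as I understand from the docs, UISaveVideoAtPathToSavedPhotosAlbum requires for its completion selector (this part works; I show it only so the snippet is self-contained):

    // Completion selector for UISaveVideoAtPathToSavedPhotosAlbum; the
    // documented shape is video:didFinishSavingWithError:contextInfo:
    func handleCompletionOfVideoToGallery(videoPath: NSString, didFinishSavingWithError error: NSError?, contextInfo: UnsafeMutablePointer<Void>) {
        if let error = error {
            print("Saving to the gallery failed: \(error)")
        } else {
            print("Saved \(videoPath) to the photo library")
        }
    }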
Now, I am getting a crash at the end of this mentioning firstResponder, but that is not the important part, since the titleLayer I created was only there for testing.
My main question is about how UIViews and CALayers should behave to solve my problem. As I understand it, I need a parent layer, a video layer (in this case my playerLayer, the AVPlayerLayer that is currently playing the video), and a content layer holding the UIViews that currently sit on top of the video (like Snapchat's emojis and texts on top of a recorded video). But if I just do contentLayer.addSublayer(self.emoji.layer) or contentLayer.addSublayer(self.myTextView.layer), the video saved to my gallery has nothing on top of it (the same as recording a Snapchat video, putting emojis and text on it, and then finding that the video saved to the device's gallery is just the plain recording, without any of the overlays). So, any ideas, please?
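To make the question concrete, here is a sketch of what I currently suspect the offscreen setup has to look like: a dedicated videoLayer handed to AVVideoCompositionCoreAnimationTool instead of my on-screen playerLayer, and the overlay UIViews rendered into an image layer rather than having their layers reparented. The helper name and the overlayContainerView parameter are mine, and I have not gotten this to work yet, so treat it as a guess rather than a solution:

    import UIKit
    import AVFoundation

    // Sketch: build the animation tool from an offscreen layer tree.
    // overlayContainerView is assumed to be the view holding the emojis/texts.
    func makeAnimationTool(renderSize: CGSize, overlayContainerView: UIView) -> AVVideoCompositionCoreAnimationTool {
        // Dedicated video layer: the tool composites the video frames into this
        // layer, so (I believe) it must not be the AVPlayerLayer already on screen
        let videoLayer = CALayer()
        videoLayer.frame = CGRect(origin: CGPointZero, size: renderSize)

        // Render the overlay views into a UIImage instead of reparenting their layers
        UIGraphicsBeginImageContextWithOptions(renderSize, false, 0)
        overlayContainerView.drawViewHierarchyInRect(CGRect(origin: CGPointZero, size: renderSize), afterScreenUpdates: true)
        let overlayImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()

        let overlayLayer = CALayer()
        overlayLayer.contents = overlayImage.CGImage
        overlayLayer.frame = CGRect(origin: CGPointZero, size: renderSize)

        // Parent layer: video at the bottom, overlays on top. This tree stays
        // offscreen and is never added to the on-screen layer hierarchy
        let parentLayer = CALayer()
        parentLayer.frame = CGRect(origin: CGPointZero, size: renderSize)
        parentLayer.addSublayer(videoLayer)
        parentLayer.addSublayer(overlayLayer)

        return AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)
    }

If that is the right direction, I would still like to understand why reparenting the views' own layers, as in my code above, does not survive the export.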
I know both Swift and Objective-C, so feel free to post code in either of them.