I'm trying to take the output of an AVCaptureSession and encode it to mp4. It seems like this should be straightforward: I'm encoding a single 960x540 video stream, and I'm not worried about audio for the purposes of this question.
When I run the following code and grab out2.mp4 out of the documents container with Xcode, I get a black screen in QuickTime, and the duration shows as 46 hours. At least the resolution looks right. Here's the output from ffmpeg -i out2.mp4:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'out2.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 1
    compatible_brands: mp41mp42isom
    creation_time   : 2015-11-18 01:25:55
  Duration: 46:43:04.21, start: 168178.671667, bitrate: 0 kb/s
    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt709/bt709), 960x540, 1860 kb/s, 27.65 fps, 29.97 tbr, 600 tbn, 1200 tbc (default)
    Metadata:
      creation_time   : 2015-11-18 01:25:55
      handler_name    : Core Media Video
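One thing I noticed: the start value of 168178.671667 seconds is about 46.7 hours, which matches the bogus duration, so I suspect the sample buffers carry timestamps from the device's capture clock rather than starting at zero. To check that, I can log each buffer's presentation timestamp in the delegate callback (this is just a debugging sketch, not part of the app):

```swift
// Debugging sketch: log each buffer's presentation timestamp to see
// whether it starts near zero or near the ~168178 s value ffmpeg reports.
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection!) {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    print("buffer PTS: \(CMTimeGetSeconds(pts)) s")
}
```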
Why can't I append sample buffers to the AVAssetWriterInput in this scenario?
var videoInput: AVAssetWriterInput?
var assetWriter: AVAssetWriter?

override func viewDidLoad() {
    super.viewDidLoad()
    self.startStream()
    NSTimer.scheduledTimerWithTimeInterval(5, target: self, selector: "swapSegment", userInfo: nil, repeats: false)
}

func swapSegment() {
    assetWriter?.finishWritingWithCompletionHandler() {
        print("File written")
    }
    videoInput = nil
}
func pathForOutput() -> String {
    let urls = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    if let documentDirectory: NSURL = urls.first {
        let fileUrl = documentDirectory.URLByAppendingPathComponent("out1.mp4")
        return fileUrl.path!
    }
    return ""
}
func startStream() {
    assetWriter = try! AVAssetWriter(URL: NSURL(fileURLWithPath: self.pathForOutput()), fileType: AVFileTypeMPEG4)
    let videoSettings: [String: AnyObject] = [AVVideoCodecKey: AVVideoCodecH264, AVVideoWidthKey: 960, AVVideoHeightKey: 540]
    videoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
    videoInput!.expectsMediaDataInRealTime = true
    assetWriter?.addInput(videoInput!)
    assetWriter!.startWriting()
    assetWriter!.startSessionAtSourceTime(kCMTimeZero)
    let videoHelper = VideoHelper()
    videoHelper.delegate = self
    videoHelper.startSession()
}
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection!) {
    if let videoOutput = captureOutput as? AVCaptureVideoDataOutput {
        videoInput?.appendSampleBuffer(sampleBuffer)
    }
}
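For what it's worth, one variant I've been considering (based on my reading of the AVAssetWriter docs, so this may be off) is to defer startSessionAtSourceTime(_:) until the first buffer arrives and pass that buffer's timestamp instead of kCMTimeZero, so the writer's timeline lines up with the capture clock:

```swift
// Possible variant of the delegate callback: start the writer's session
// at the first buffer's timestamp rather than kCMTimeZero, so samples
// aren't offset by the device clock's many-hour head start.
// (Assumes startSessionAtSourceTime is removed from startStream above.)
var sessionStarted = false

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef, fromConnection connection: AVCaptureConnection!) {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    if !sessionStarted {
        assetWriter?.startSessionAtSourceTime(pts)
        sessionStarted = true
    }
    if videoInput?.readyForMoreMediaData == true {
        videoInput?.appendSampleBuffer(sampleBuffer)
    }
}
```

I haven't confirmed this is the intended usage, but it would at least explain both the black output and the 46-hour duration if the writer is interpreting the buffers' timestamps relative to a zero start.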