I am currently working on a project in which our application receives JPEG data from an IP camera over TCP (using the Network — NW — framework) and builds CMSampleBuffers from it. As each buffer is built, we display it with AVSampleBufferDisplayLayer. When the user taps the record button, the application starts recording with AVAssetWriter.
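For context, once a complete JPEG frame has been assembled, the wrapping-and-display step looks roughly like this (a simplified sketch, assuming AVFoundation/CoreMedia are imported; displayJPEGFrame and displayLayer are placeholder names for our actual code, the 60 fps timing is nominal, and status checks are omitted):

func displayJPEGFrame(_ frameData: Data) {
    // Copy the JPEG bytes into a CMBlockBuffer.
    var blockBuffer: CMBlockBuffer?
    CMBlockBufferCreateWithMemoryBlock(allocator: kCFAllocatorDefault,
                                       memoryBlock: nil,
                                       blockLength: frameData.count,
                                       blockAllocator: kCFAllocatorDefault,
                                       customBlockSource: nil,
                                       offsetToData: 0,
                                       dataLength: frameData.count,
                                       flags: 0,
                                       blockBufferOut: &blockBuffer)
    guard let blockBuffer = blockBuffer else { return }
    frameData.withUnsafeBytes { (bytes: UnsafeRawBufferPointer) in
        _ = CMBlockBufferReplaceDataBytes(with: bytes.baseAddress!,
                                          blockBuffer: blockBuffer,
                                          offsetIntoDestination: 0,
                                          dataLength: frameData.count)
    }

    // Wrap the block buffer in a CMSampleBuffer using the JPEG format description.
    var sampleBuffer: CMSampleBuffer?
    var timing = CMSampleTimingInfo(duration: CMTime(value: 1, timescale: 60),
                                    presentationTimeStamp: CMClockGetTime(CMClockGetHostTimeClock()),
                                    decodeTimeStamp: .invalid)
    var sampleSize = frameData.count
    CMSampleBufferCreateReady(allocator: kCFAllocatorDefault,
                              dataBuffer: blockBuffer,
                              formatDescription: formatDesc,
                              sampleCount: 1,
                              sampleTimingEntryCount: 1,
                              sampleTimingArray: &timing,
                              sampleSizeEntryCount: 1,
                              sampleSizeArray: &sampleSize,
                              sampleBufferOut: &sampleBuffer)

    // Hand the frame to the display layer.
    if let sampleBuffer = sampleBuffer {
        displayLayer.enqueue(sampleBuffer)
    }
}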
Prior to iOS 14, we were able to record at approximately 60 frames per second. Now, as soon as we start providing CMSampleBuffers to the AVAssetWriterInput, we only receive 30–40 frames per second through our TCP connection. Somehow, starting a recording with AVAssetWriter reduces the number of packets we receive on the socket. We did not experience this problem prior to iOS 14, and both before and after recording, the frame rate we receive stabilizes around 60 fps.
Any idea what might be causing this?
Thanks in advance for your help!
The part where we receive data through TCP:
func setupReceive(_ connection: NWConnection) {
    connection.receive(minimumIncompleteLength: 1, maximumLength: 65536) { (data, contentContext, isComplete, error) in
        if let data = data, !data.isEmpty {
            self.status = "did receive \(data.count) bytes"
            MyIPCameraManager.shared.encodeData(data)
        }
        if isComplete {
            self.stop(status: "EOF")
        } else if let error = error {
            self.connectionDidFail(error)
        } else {
            // Re-arm the receive so the next chunk is delivered.
            self.setupReceive(connection)
        }
    }
}
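encodeData buffers the incoming bytes and splits them into complete JPEG frames at the frame boundaries (SOI marker 0xFF 0xD8, EOI marker 0xFF 0xD9). A simplified sketch of that assembly step (our actual parser is more involved; displayJPEGFrame is the placeholder from above):

private var pendingData = Data()

func encodeData(_ data: Data) {
    pendingData.append(data)
    // A JPEG frame runs from the SOI marker (0xFF 0xD8) to the EOI marker (0xFF 0xD9).
    while let start = pendingData.range(of: Data([0xFF, 0xD8])),
          let end = pendingData.range(of: Data([0xFF, 0xD9]), in: start.upperBound..<pendingData.endIndex) {
        let frame = pendingData.subdata(in: start.lowerBound..<end.upperBound)
        pendingData.removeSubrange(pendingData.startIndex..<end.upperBound)
        displayJPEGFrame(frame)   // wrap in a CMSampleBuffer, then display/record
    }
}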
The part where we initialize AVAssetWriter:
assetWriter = try? AVAssetWriter(outputURL: fileUrl, fileType: AVFileType.mp4)
if assetWriter == nil {
    throw MyRecordingError.WriterInitializationError
}

// Source format hint: 640x480 JPEG frames.
var formatDesc: CMVideoFormatDescription? = nil
CMVideoFormatDescriptionCreate(allocator: kCFAllocatorDefault,
                               codecType: kCMVideoCodecType_JPEG,
                               width: 640,
                               height: 480,
                               extensions: nil,
                               formatDescriptionOut: &formatDesc)

// outputSettings: nil passes the samples through without re-encoding.
self.assetWriterInput = AVAssetWriterInput(mediaType: AVMediaType.video,
                                           outputSettings: nil,
                                           sourceFormatHint: formatDesc)
self.assetWriterInput.expectsMediaDataInRealTime = true
self.assetWriter.add(assetWriterInput)
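Once recording starts, frames are appended from the same path that feeds the display layer, roughly like this (simplified; record(_:) is a placeholder name for our actual append method):

func record(_ sampleBuffer: CMSampleBuffer) {
    // Lazily start the writer and session on the first frame.
    if assetWriter.status == .unknown {
        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    }
    // Drop the frame instead of blocking if the input can't keep up.
    if assetWriterInput.isReadyForMoreMediaData {
        assetWriterInput.append(sampleBuffer)
    }
}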